Saturday, July 30th, 2005

Deep Linking Continued: Backbase Article

Category: Articles

We’ve linked to Backbase before; those guys have some awesome Ajax demos. One of the brains behind the site, Jeremy Hartley, has recently written an article on the subject of “Deep Linking”, or, in other words, how to get spiders to properly index an Ajax page.

The article identifies three different strategies:

Backbase has identified the following strategies for getting a SPI indexed by search engines:

  • Lightweight Indexing: no structural changes are made to your site; existing tags such as meta, title and h1 are leveraged.
  • Extra Link Strategy: extra links are placed on the site, which search bots can follow and thereby index the whole site.
  • Secondary Site Strategy: a secondary site is created, which is fully accessible to the search engine.

and goes into some detail on how to implement them. Interesting… though ironically, I couldn’t find a way to link directly to the article in question, so you’ll have to choose it manually from the “Technical Articles” page (though I can give you a direct link to the PDF version).
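
To make the “Extra Link Strategy” a little more concrete, here is a rough sketch of the general shape (this is not code from the Backbase article; the spi-link class, the content element and the URLs are placeholders): the page carries ordinary anchors a spider can follow, and a small script upgrades those same anchors into Ajax loads for human visitors.

    // The markup already contains plain, crawlable links, for example:
    //   <a class="spi-link" href="/articles/deep-linking.html">Deep linking</a>
    // A spider just follows the href; for script-capable browsers the same
    // anchors are upgraded so the content loads into the single-page interface.
    function upgradeLinks() {
      var anchors = document.getElementsByTagName("a");
      for (var i = 0; i < anchors.length; i++) {
        if (anchors[i].className.indexOf("spi-link") === -1) continue;
        anchors[i].onclick = function () {
          loadIntoPage(this.href); // pull the same document in via XMLHttpRequest
          return false;            // and suppress the normal full-page navigation
        };
      }
    }

    function loadIntoPage(url) {
      // Older IE would need the ActiveXObject fallback; omitted to keep the sketch short.
      var xhr = new XMLHttpRequest();
      xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
          document.getElementById("content").innerHTML = xhr.responseText;
        }
      };
      xhr.open("GET", url, true);
      xhr.send(null);
    }

    window.onload = upgradeLinks;

With scripting turned off (or for a spider), the anchors are just normal links, so the whole site stays reachable by following hrefs.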

Posted by Ben Galbraith at 10:05 am

7 Comments »

Who are these clowns?

Two things:

One, all this stuff has already been discussed regarding Flash for years, and I thought that an approximate consensus had been reached about where “navigation” (and the domain of search engines) ended and “richness” began. I see no reason why Ajax should provide an impediment to search engines, because it makes absolutely no sense to use spider-opaque Ajax wackiness for navigation. And if you must, there are plenty of techniques such as behaviour.js that cleanly separate the markup of your site from the enhanced javascript interaction.
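
For instance, the markup can keep ordinary, crawlable hrefs while all of the enhanced handling lives in a single script, roughly in the style of a behaviour.js rule sheet (the selector and the loadViaAjax helper below are only placeholders):

    var rules = {
      // the markup stays plain: <a class="enhanced" href="/products.html">Products</a>
      "a.enhanced": function (element) {
        element.onclick = function () {
          loadViaAjax(this.href); // placeholder for whatever XHR helper the site uses
          return false;           // script-capable browsers stay in the rich interface
        };
        // with scripting off (or for a spider) the href simply works as a normal link
      }
    };
    Behaviour.register(rules);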

Two, I don’t have much faith in a group whose site does UA sniffing and bounces me to an “incompatible browser” page when I’m running Safari 2.0. What is this, 1998? Geez.

Comment by Michal Migurski — July 30, 2005

My initial reaction to this article is that it is extremely irresponsible of them to recommend black-hat SEO techniques as a way of achieving accessibility.

On the one hand, accessibility is an important area of discussion and I know at my company we spend a lot of time discussing section 508 issues and WAI.

The main ‘defense’ against search engines they seem to be recommending is the host-detection / site-duplication technique. This is way over in the realm of frowned-upon black-hat techniques; Google is known to look for it, and it could possibly result in SE blacklisting or other punishments.

Furthermore, I am disappointed that this firm has not collaborated more closely with SEO experts, and possibly the search engine representatives themselves, to make official recommendations on the subject that are truly in line with SEO “best practices”.

The disclaimer at the end is really weak. It includes no discussion of which of the recommended techniques are the ones Google might find unethical, only that some of them may be.

Comment by Alexei — July 30, 2005

Some small comments from the Backbase camp:

# To Ben:
A direct link to the article is:
http://www.backbase.com/index.php?loc=content/dev/tech/001_designing_rias_for_sea.xml

# To Michal:
Search engines can’t find the content you load via AJAX (XMLHttpRequest), because the piece of content doesn’t have its own URL.
Safari 2.0: I agree Safari 2.0 is a good browser, but there are still several limitations in Safari that make it very difficult for our AJAX engine to work properly. We’re working with Apple to solve this as soon as possible.
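
One way to give that content an address of its own is to mirror the interface state in the fragment identifier, so a bookmarked or indexed link can restore the right view. A minimal sketch of the general idea only (this is not a description of the Backbase engine; the function names and URL scheme are illustrative):

    // When the interface swaps in a piece of content, record which one in the hash:
    function showSection(id) {
      loadIntoPage("/content/" + id + ".html"); // placeholder XHR loader
      window.location.hash = id;                // e.g. http://example.com/#dev-articles
    }

    // On load, read the hash back so a bookmarked (or indexed) deep link
    // restores that state instead of the default view:
    window.onload = function () {
      var state = window.location.hash.replace("#", "");
      if (state) {
        showSection(state);
      }
    };

The fragment never reaches the server, so the URLs a spider indexes still have to exist as real pages; the script above only makes sure a human visitor’s bookmark or deep link lands on the right state.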

# To Alexei:
This article is not primarily about SEO or Section 508 Accessibility but about SEA: ‘search engine accessibility’. This means exposing the correct content to search engines such as Google. We are using this approach on our own site, and I can guarantee you that it’s not ‘black-hat’: we just ensure that the search engine finds the content that is also shown to human users, and that the links indexed by the search engine trigger the correct state in an AJAX interface. Having a special site for search engines is only frowned upon if you show different content to search engines.
A future article will discuss section 508 and WAI accessibility: this is a topic that we think is very important, and benefits from a dedicated article.

Comment by Jep Castelein — August 1, 2005

Jep – why isn’t the content addressable by URL? Isn’t it a better idea to design your information for addressability, instead of plastering over poor choices with SEO voodoo? Search engines aren’t the only ones impacted here – social bookmarking services such as Del.icio.us also rely on URLs. If I can’t bookmark something, odds are I’ll forget about it pretty quickly.

Comment by Michal Migurski — August 1, 2005

I dunno what you guys think, but as a greenhorn to the SEO practice it seems a bit strange that one has to go to so much trouble just to make sure that a site is fully indexed by SE spiders. Just creating a sitemap and some navigational menus alone would go a long way to help sites get fully indexed. You don’t really need to venture into some of these dodgy practices just to address these issues. It really isn’t worth the risk, in my opinion anyway.

Cheers.

Comment by Kelly — July 23, 2006

wow, good idea… i have a large website and i can’t get it indexed. i might buy a shitty .com and place all the links right on that.

Comment by kirra — October 29, 2007

I believe that there are other methods that can now be used to have search engines properly index Ajax pages. I have heard that Google can now even spider links that use JavaScript or even Flash.

Comment by Loans — April 26, 2008
