Friday, November 17th, 2006

ACAP: Making Ajax Search Friendly

Category: Editorial

Peter Illes has posted on handling search crawlers within an Ajax application.

He discusses ACAP (Automated Content Access Protocol), a new initiative from the international publishing community to turn the challenges that web technologies (especially search) pose for the industry into opportunities in a win-win way, which, as a side effect, can help web applications too.

The tension is that we already have tools for this, such as robots.txt, hiding content via JavaScript, and showing different content to search crawlers, but the crawlers aren't keen on some of these techniques. If they find out that you are serving them different content, you can be in trouble, even if you are doing it with good intentions.
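For context, the baseline tool here is plain robots.txt, which only lets you allow or disallow crawling per path and says nothing about usage terms. A minimal sketch (the paths below are hypothetical, purely for illustration):

  # Illustrative example only; these paths are made up
  User-agent: *
  Disallow: /private/
  Disallow: /search-results/

As I understand it, ACAP's pitch is to layer richer, machine-readable permissions on top of this kind of file rather than replace it.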

This entry came just before the joint Sitemaps initiative from Google, Yahoo, and Microsoft, which will hopefully be joined by others (e.g. Ask) in the future. Sitemaps is obviously good in theory; what will be interesting to SEO folks is how it really affects things in practice. The proof is in the pudding: if people see that the engines are really using this data, more and more of them will pick it up.
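For anyone who hasn't looked at the format yet, a sitemap is just an XML file listing your URLs. A bare-bones sketch (the URL and values are hypothetical, purely illustrative):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <!-- Illustrative entry; the URL and values are made up -->
    <url>
      <loc>http://example.com/</loc>
      <lastmod>2006-11-17</lastmod>
      <changefreq>weekly</changefreq>
      <priority>0.8</priority>
    </url>
  </urlset>

You typically place the file at the root of your site and submit it to each engine, which gives them a list of URLs to crawl, including the ones your Ajax navigation would otherwise hide from them.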

Posted by Dion Almaer at 9:01 am