Google, Yahoo! & MSN all to support the Sitemaps protocol
Friday, 17 November 2006
Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site.
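As a rough illustration, a minimal Sitemap under the sitemaps.org 0.90 schema looks something like the sketch below; the URL and metadata values here are hypothetical, and only the loc element is actually required:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <!-- One url entry per page on the site -->
    <url>
      <loc>http://www.example.com/</loc>
      <lastmod>2006-11-17</lastmod>      <!-- when the page was last updated -->
      <changefreq>weekly</changefreq>    <!-- how often it usually changes -->
      <priority>0.8</priority>           <!-- importance relative to other URLs on the site -->
    </url>
  </urlset>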
Web crawlers usually discover pages from links within the site and from other sites. Sitemaps supplement this data to allow crawlers that support Sitemaps to pick up all URLs in the Sitemap and learn about those URLs using the associated metadata.
Using the Sitemap protocol does not guarantee that web pages are included in search engines, but provides hints for web crawlers to do a better job of crawling your site.
Notice that last sentence; it is worth bearing in mind. The Sitemaps protocol does not guarantee inclusion, so please don't be misled, as many people are, into thinking Sitemaps is a complete optimisation solution. It isn't.
In fact, I have never been overly enthusiastic about Sitemaps, simply because a Sitemap does nothing more than drop URLs into the crawler queues. To put this into perspective: if your site has problems serious enough that its links cannot be crawled in the first place, you are probably better off addressing those issues before loading up URLs that would remain effectively unlinked from other locations, and thus have little or no chance of ranking well in the results pages.
However, it is good to see the search engines working together on an industry standard.
For more information, visit http://www.sitemaps.org