Sitemaps

The Sitemaps protocol allows a webmaster to inform search engines about URLs on a website that are available for crawling. A Sitemap is an XML file that lists the URLs for a site. It allows webmasters to include additional information about each URL: when it was last updated, how often it changes, and how important it is in relation to other URLs in the site. This allows search engines to crawl the site more intelligently. Sitemaps are a URL inclusion protocol, and complement robots.txt, a URL exclusion protocol.

Sitemaps are particularly beneficial in situations where:

  • Users cannot access all areas of a website through a browsable interface, so a search engine is unable to find those pages. For example, a site with a large "archive" or "database" of resources that are not well linked to each other (if at all) and are only accessible via a search form.
  • Webmasters use rich Ajax or Flash content that search engines are unable to navigate through to reach the content.

The webmaster can generate a sitemap containing all accessible URLs on the site and submit it to search engines. Since Google, MSN, Yahoo!, and Ask now support the same protocol, a single sitemap lets the major search engines receive updated page information.
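
As a minimal sketch of such a generator, the following Python produces a valid Sitemap from a hypothetical, hard-coded list of page URLs; a real generator would collect the URLs by walking the site or querying its database:

import datetime
import xml.etree.ElementTree as ET

# Hypothetical list of crawlable URLs; a real generator would
# discover these by walking the site or querying its database.
urls = ["http://www.example.org/", "http://www.example.org/about.html"]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
today = datetime.date.today().isoformat()  # W3C date format, e.g. 2006-11-18
for u in urls:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = u          # required: the page's URL
    ET.SubElement(entry, "lastmod").text = today  # optional: last modification date

# The protocol requires UTF-8 encoding; ElementTree also entity-escapes
# characters such as & inside the URLs automatically.
ET.ElementTree(urlset).write("sitemap.xml", encoding="UTF-8", xml_declaration=True)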

Sitemaps supplement, and do not replace, the existing crawl-based mechanisms that search engines already use to discover URLs. By submitting Sitemaps to a search engine, a webmaster is only helping that engine's crawlers do a better job of crawling the site(s). Using this protocol does not guarantee that web pages will be included in search indexes, nor does it influence the way pages are ranked by a search engine.

History of Sitemaps

Google first introduced Sitemaps 0.84 in June 2005 so web developers could publish lists of links from across their sites.

The Sitemaps protocol is based on ideas[1] from "Crawler-friendly Web Servers".[2]

XML Sitemap Format

The Sitemap Protocol format consists of XML tags. The file itself must be UTF-8 encoded. (Sitemaps can also be just a plain text list of URLs. They can also be compressed in .gz format.)
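
A plain text Sitemap is simply one fully qualified URL per line. For example, with hypothetical URLs:

http://www.example.org/
http://www.example.org/about.html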

Sample

A sample Sitemap that contains just one URL and uses all optional tags is shown below.

<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://www.sitemaps.org/schemas/sitemap/0.9
                            http://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd">
        <url>
                <loc>http://w3c-at.de</loc>
                <lastmod>2006-11-18</lastmod>
                <changefreq>daily</changefreq>
                <priority>0.8</priority>
        </url>
</urlset>

Submitting Sitemaps

If a Sitemap is submitted directly to a search engine (pinged), the engine returns status information and any processing errors. Refer to the appropriate search engine for information on monitoring automatic submissions.

Also, the location of the Sitemap can be specified in the robots.txt file to help search engines find it. To do this, the following line needs to be added to robots.txt:

Sitemap: <sitemap_location>

The <sitemap_location> should be the complete URL to the Sitemap, such as: http://www.example.org/sitemap.xml

This directive is independent of the user-agent line, so it doesn't matter where it is placed in the file. If you have a Sitemap index file, you can include the location of just that file; you don't need to list each individual Sitemap contained in the index.
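
For example, a minimal robots.txt that allows all crawling and advertises a hypothetical Sitemap location would read:

User-agent: *
Disallow:

Sitemap: http://www.example.org/sitemap.xml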

Search engine submission URLs and help pages:

  • Google: http://google.com/webmasters/sitemaps/ping?sitemap= (help: "How do I resubmit my Sitemap once it has changed?")
  • Yahoo!: http://search.yahooapis.com/SiteExplorerService/V1/updateNotification?appid=SitemapWriter&url= or http://search.yahooapis.com/SiteExplorerService/V1/ping?sitemap= (help: "Does Yahoo! support Sitemaps?")
  • Ask.com: http://submissions.ask.com/ping?sitemap= (help: "Q: Does Ask.com support sitemaps?")
  • Moreover.com: http://api.moreover.com/ping?u=
  • Live Search: http://webmaster.live.com/ping.aspx?siteMap= (help: Webmaster Tools (beta))
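
As a sketch, a Sitemap can be pinged to the Google endpoint listed above with a few lines of Python; the sitemap location used here is hypothetical, and the query parameter must be URL-encoded:

import urllib.parse
import urllib.request

# Hypothetical sitemap location.
sitemap_url = "http://www.example.org/sitemap.xml"
ping = ("http://google.com/webmasters/sitemaps/ping?sitemap="
        + urllib.parse.quote(sitemap_url, safe=""))

with urllib.request.urlopen(ping) as response:
    # A 200 status means the ping was received; it does not mean the
    # sitemap has been fetched, parsed, or indexed.
    print(response.status)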


Sitemap limits

Sitemap files are limited to 50,000 URLs and 10 megabytes per sitemap. Sitemaps can be compressed using gzip, reducing bandwidth consumption. Multiple sitemap files are supported, with a Sitemap index file serving as an entry point for up to 1,000 sitemaps.
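
A minimal Sitemap index file referencing two hypothetical sitemap files is sketched below; it uses <sitemapindex> and <sitemap> tags where a regular Sitemap uses <urlset> and <url>:

<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <sitemap>
                <loc>http://www.example.org/sitemap1.xml.gz</loc>
                <lastmod>2006-11-18</lastmod>
        </sitemap>
        <sitemap>
                <loc>http://www.example.org/sitemap2.xml.gz</loc>
        </sitemap>
</sitemapindex>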

As with all XML files, any data values (including URLs) must use entity escape codes for the characters ampersand (&), single quote ('), double quote ("), less than (<), and greater than (>).
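
For example, a hypothetical URL containing an ampersand, http://www.example.org/view?id=1&page=2, must appear in a <loc> tag as:

<loc>http://www.example.org/view?id=1&amp;page=2</loc>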

Notes

  1. ^ M.L. Nelson, J.A. Smith, del Campo, H. Van de Sompel, X. Liu (2006). "Efficient, Automated Web Resource Harvesting". WIDM'06. 
  2. ^ O. Brandman, J. Cho, Hector Garcia-Molina, and Narayanan Shivakumar (2000). "Crawler-friendly web servers". Proceedings of ACM SIGMETRICS Performance Evaluation Review, Volume 28, Issue 2. doi:10.1145/362883.362894. 

See also

  • Biositemap, a protocol for broadcasting and disseminating information about computational biology resources (data, software tools and web-services)
  • Metadata
  • Resources of a Resource (ROR)
  • Site map, a graphical representation of the architecture of a web site
  • Sitemap index, an XML file that lists multiple XML sitemap files

External links