The Sitemaps protocol allows a webmaster to inform search engines about URLs on a website that are available for crawling. A Sitemap is an XML file that lists the URLs for a site. It allows webmasters to include additional information about each URL: when it was last updated, how often it changes, and how important it is relative to other URLs on the site. This enables search engines to crawl the site more intelligently and to find URLs that may be isolated from the rest of the site's content. The Sitemaps protocol is a URL inclusion protocol and complements robots.txt, a URL exclusion protocol.
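A minimal Sitemap illustrating these fields might look like the following (the domain and dates are placeholders; the element names `loc`, `lastmod`, `changefreq`, and `priority` come from the Sitemaps protocol itself):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- loc is the only required child element of url -->
    <loc>https://www.example.com/</loc>
    <!-- when the page was last modified (W3C Datetime format) -->
    <lastmod>2024-01-15</lastmod>
    <!-- a hint about how frequently the page changes -->
    <changefreq>weekly</changefreq>
    <!-- importance relative to other URLs on this site, 0.0 to 1.0 -->
    <priority>0.8</priority>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
    <lastmod>2023-11-02</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```

The `changefreq` and `priority` values are hints rather than commands; search engines may weigh or ignore them when scheduling crawls.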