What Is an XML Sitemap?


What Is an XML Sitemap in SEO?

An XML sitemap is a document that helps Google and other major search engines better understand your website while crawling it. Crawling in SEO is the acquisition of data about a website: the process by which search engine crawlers (also called spiders or bots) scan a site and collect details about each page, such as titles, images, keywords, and linked pages. An XML sitemap is how you tell Google what to crawl on your website and where to find it. It is your gateway to Google's database, and it plays a significant role in your site's indexation.
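Here is a minimal sketch of what an XML sitemap looks like, following the sitemaps.org protocol (the URLs and dates below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want search engines to crawl -->
  <url>
    <loc>https://www.example.com/</loc>
    <!-- Optional hints: last modification date, change frequency, relative priority -->
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```

Only the <loc> tag is required for each entry; <lastmod>, <changefreq>, and <priority> are optional hints that crawlers may or may not use.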

But probably the most common misconception is that simply creating an XML sitemap helps get your pages indexed. The first thing we’ve got to get straight is this: Google does not index your pages just because you asked nicely. Google indexes pages because (a) they found them and crawled them, and (b) they consider them good enough quality to be worth indexing. Pointing Google at a page and asking them to index it doesn’t really factor into it.

It is important to note that by submitting an XML sitemap to Google Search Console, you’re giving Google a clue that you consider the pages in the sitemap to be good-quality search landing pages, worthy of indexation.

How do I generate an XML sitemap?

If your website runs on WordPress, you can create an XML sitemap with a plugin; the one I recommend is Premium SEO Pack. It creates an XML sitemap for your website automatically and offers other useful features: you can configure which pages to include in or exclude from your sitemap, submit the sitemap to Google and Bing with a click of a button, and edit your robots.txt with its built-in file manager.

[Screenshot: Premium SEO Pack plugin’s XML sitemap feature]

If your site is not on WordPress, just visit this link to create an XML sitemap, then upload it to your site’s root directory using an FTP app or your hosting control panel’s file manager.
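After uploading, the sitemap should sit directly in your web root so that it resolves at the top level of your domain. A typical layout on shared hosting looks like this (the public_html folder name and example.com domain are assumptions; your host may differ):

```
/public_html/sitemap.xml   →   https://www.example.com/sitemap.xml
/public_html/robots.txt    →   https://www.example.com/robots.txt
```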

Robots.txt

Robots.txt is a text file that allows a website to provide instructions to web-crawling bots. Search engines like Google use these crawlers, sometimes called web robots, to archive and categorize websites. It is important to note that not all bots will honor a robots.txt file.

A robots.txt file can have a large effect on how search engines crawl your website. The file is not required, but it is supported by all major search engines and gives them instructions on how to crawl the site. As noted above, though, the protocol is purely advisory, and crawlers can ignore it if they so choose.
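A minimal robots.txt might look like this (the paths below are common WordPress examples, not requirements):

```
# Apply these rules to every crawler
User-agent: *
# Keep bots out of the admin area...
Disallow: /wp-admin/
# ...except this file, which WordPress needs to be publicly reachable
Allow: /wp-admin/admin-ajax.php
```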

XML Sitemaps and Robots.txt: Are They Related?

An XML sitemap is where you declare the most important pages on your site for search engines to crawl, while robots.txt is a text file stored in the root directory of your site that lets you control how crawlers and search engine bots behave on your site through Allow and Disallow rules. With robots.txt, you can also block or permit bot access to specific folders inside your root directory.
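The two files also reference each other directly: the robots.txt standard supports a Sitemap directive, so you can tell crawlers exactly where your XML sitemap lives (the URL is a placeholder):

```
# Point crawlers at the XML sitemap (this line can appear anywhere in robots.txt)
Sitemap: https://www.example.com/sitemap.xml
```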


Click this Robots.txt link for more information.
