Working with Sitemap.xml and Robots.txt

When it comes to on-page search engine optimization, the importance of sitemap and robots.txt files can't be ignored. These files act as a guiding lighthouse for search engine crawlers. In this blog post, I am going to share some reasons for using sitemap.xml and robots.txt on your web server, along with ways to set them up.

Sitemap.xml
A sitemap is a file that sits at the root of a web server and guides search engine crawlers to discover the pages on your website. Having this file is a must, as it makes it easier for crawlers to index your website and all the webpages you want to appear on search engines.

The location of a sitemap file on the website should be something like http://websitename.com/sitemap.xml
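To give you an idea of what's inside, here is a minimal sketch of a sitemap.xml with a single page entry (the domain and date are placeholders; the changefreq and priority values are optional hints):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- The full address of one page you want crawlers to find -->
    <loc>http://websitename.com/</loc>
    <!-- When the page was last modified (optional) -->
    <lastmod>2015-01-01</lastmod>
    <!-- Hints for crawlers; both are optional -->
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

A real sitemap simply repeats the `<url>` block once for every page on the site.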

Creating a sitemap file is pretty easy, even for a layman without any technical know-how. Simply go to the website below and enter your web address, and it will automatically create a sitemap.xml file for you.

http://xml-sitemaps.com (XML Sitemap Generator)

Once you have the sitemap file ready, you can simply ask your web designer to upload it to the root of your web server. After uploading the file, you will need to submit it through the Google Webmaster Tools account for your website. You can read further instructions on adding the sitemap to Google Webmaster Tools here: https://support.google.com/sites/answer/100283?hl=en

Robots.txt
A robots.txt file acts as the converse of a sitemap.xml file. Whereas a sitemap.xml file helps search engine crawlers discover the pages that should be indexed and shown on search result pages, a robots.txt file tells crawlers which webpages should not be indexed or shown in search results.

The location of a robots.txt file on the website should be something like http://websitename.com/robots.txt
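As a sketch of what the code inside this file looks like, here is a minimal robots.txt (the directory paths are purely illustrative; your own private sections will differ):

```text
# Rules below apply to all crawlers
User-agent: *
# Keep crawlers out of these illustrative sections
Disallow: /admin/
Disallow: /private/
# Optionally point crawlers at your sitemap
Sitemap: http://websitename.com/sitemap.xml
```

Each `Disallow` line names a path that crawlers obeying the file should skip, and the optional `Sitemap` line ties the two files together.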

Creating a robots.txt file involves writing a few special directives, and this is best done with the help of a website designer or SEO specialist, because a robots.txt file that is not written properly can ruin a website's presence on search engines. There are plenty of plugins and modules available if your website is built on an open-source platform such as WordPress, Joomla, or Drupal, but it is still recommended that you seek advice from your SEO consultant before making any changes.
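To illustrate how easily a badly written robots.txt can ruin a site's search presence, consider this two-line file (sometimes accidentally left over from a staging site):

```text
# DANGER: this tells every crawler to stay away from the entire website
User-agent: *
Disallow: /
```

The single slash covers every path on the domain, so with these two lines in place, well-behaved search engine crawlers will stop visiting the whole site. This is exactly the kind of mistake an SEO specialist should catch before the file goes live.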