Your robots.txt file determines which portions of your site search engine crawlers are allowed to crawl, so you need to decide which sections should be open to crawling and which should be blocked.
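As a simple, hypothetical illustration (the domain and paths below are placeholders), a robots.txt file might open most of the site to crawlers while blocking a couple of sections:

```
# Hypothetical example: allow everything except admin pages and search results
User-agent: *
Disallow: /admin/
Disallow: /search
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```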
Creating and maintaining a proper robots.txt file isn't always easy. Many sites don't include one at all, and in a large robots.txt file, locating the specific directives that are blocking a given URL from being crawled can be difficult. To make this easier, Google has recently updated the robots.txt testing tool in Google Webmaster Tools.
You can find the updated testing tool in Webmaster Tools within the Crawl section:
In this section you will see your most recent robots.txt file, and you can enter URLs to check whether they can be crawled. The tool also lets you edit your robots.txt file and test the changes, but you still have to upload the new version to your server for those changes to take effect. Google provides more information on robots.txt directives and how they are processed on its developers site.
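If you want a quick local sanity check before uploading a new version, a small script can apply the same basic allow/disallow logic. This is only a rough sketch, not a replacement for the Webmaster Tools tester: Python's standard-library parser doesn't follow Google's rules exactly (for example, Allow precedence), and the rules and URLs below are made up for illustration.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content you are about to upload
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /search
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Test a few URLs, similar to what you would do in the testing tool
for url in ("https://www.example.com/products/widget",
            "https://www.example.com/admin/login",
            "https://www.example.com/search?q=widgets"):
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{url} -> {'crawlable' if allowed else 'blocked'}")
```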
In addition, you can view older versions of your robots.txt file and see when Google's crawlers were blocked from crawling portions of your site. For instance, if Google encounters a server error when fetching your robots.txt file, it won't crawl your site.
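If you suspect a server error like that is the problem, you can check the HTTP status of your robots.txt file directly. The sketch below is a minimal, hypothetical check (the URL is a placeholder for your own domain); the point is simply to confirm the file is reachable and not returning a 5xx error.

```python
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

ROBOTS_URL = "https://www.example.com/robots.txt"  # replace with your own domain

try:
    with urlopen(ROBOTS_URL, timeout=10) as response:
        body = response.read()
        print(f"robots.txt returned HTTP {response.status} ({len(body)} bytes)")
except HTTPError as err:
    # A 5xx error here is the kind of problem that can stop crawling entirely
    print(f"robots.txt returned HTTP error {err.code}")
except URLError as err:
    print(f"Could not fetch robots.txt: {err.reason}")
```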
Google’s updated robots.txt testing tool should make it faster and more straightforward to maintain your robots.txt files.