Cameron Francis | July 17, 2014

How To Easily Test Your Robots.txt Files


Your robots.txt file tells search engines which portions of your site they may crawl, so you need to decide which sections should be open to crawlers and which should be blocked.
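As a simple illustration, a minimal robots.txt might open the whole site to crawlers while blocking one area (the paths and domain here are hypothetical):

```
User-agent: *
Disallow: /private/

Sitemap: https://www.example.com/sitemap.xml
```

Any URL under /private/ would be blocked for all crawlers, while everything else remains crawlable.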

Test Robots.txt

Creating and maintaining a correct robots.txt file can be difficult. Many sites don't include a robots.txt file at all, and in a large robots.txt file, locating the specific directives that are blocking certain URLs from being crawled can be tricky. To make this process easier, Google has recently updated the robots.txt testing tool within Google Webmaster Tools.

You can find the updated testing tool in Webmaster Tools within the Crawl section:

You will find your most recent robots.txt file in this section, and the tool lets you enter URLs to determine whether they can be crawled. It also allows you to edit your robots.txt file and test the changes. However, you still must upload the new version to your server after testing for your changes to take effect. Google provides more information about robots.txt directives and how they are processed on its developer site.
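Outside of Webmaster Tools, you can sketch the same kind of URL check locally with Python's standard `urllib.robotparser` module. The robots.txt content and URLs below are hypothetical examples, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; in practice you would fetch your live file.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Check whether Googlebot may crawl specific URLs.
print(rp.can_fetch("Googlebot", "https://example.com/products/"))      # True
print(rp.can_fetch("Googlebot", "https://example.com/private/login"))  # False
```

This only approximates Google's own parsing rules, so treat Google's tester as the authoritative check, but it is a quick way to verify a directive before uploading a new file.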

In addition, you can view older versions of your robots.txt file and determine when Google's crawlers were blocked from crawling portions of your site. For instance, if Google detects a server error related to your robots.txt file, it won't crawl your site.

There may be warnings or errors displayed for your site, so Google recommends that you double-check your robots.txt files. You can also use this check in conjunction with other Webmaster Tools features, such as the "Fetch as Google" tool for rendering important pages. If that tool reports blocked URLs, you can run a robots.txt test to locate the directive causing the block and remedy the situation. It is common for older robots.txt files to block JavaScript, CSS, and mobile content; once you have found these issues, fixing them is often fairly easy.
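A typical fix looks like the following before-and-after sketch. The /js/ and /css/ paths are hypothetical; substitute whatever directories your site actually serves scripts and styles from:

```
# Before: an older robots.txt that blocks rendering resources
User-agent: *
Disallow: /js/
Disallow: /css/

# After: the same file, updated so crawlers can fetch scripts and styles
User-agent: *
Allow: /js/
Allow: /css/
```

Unblocking these resources lets Googlebot render your pages the way visitors see them.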
Google’s updated robots.txt testing tool should make it faster and more straightforward to maintain your robots.txt files.

Author: Cameron Francis. Cameron Francis is the Director of eTraffic Group. He has been engaged in all aspects of online marketing for the past 8 years and is actively involved in SEO, Paid Search, Social Media Optimisation, and Web Design.
