Robots.txt Generator Tool

This free online Robots.txt Generator tool helps webmasters generate robots.txt files quickly and easily for any website or blog. To use it, enter valid information carefully (following the instructions given) into the text boxes below, then click "Create Robots.txt", "Create and Save as Robots.txt", or "Clear" according to your choice.

Default - All Robots are:
Sitemap: (leave blank if you don't have one)
Search Robots:
  Google
  Google Image
  Google Mobile
  MSN Search
  Yahoo MM
  Yahoo Blogs
  DMOZ Checker
  MSN PicSearch
Restricted Directories: (the path is relative to root and must end with a trailing slash "/")

Now create a 'robots.txt' file in your site's root directory. Copy the generated text above and paste it into that file.
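For reference, a generated file for a site that allows all robots and lists a sitemap typically looks like this (the sitemap URL is a placeholder):

```
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

An empty Disallow value means nothing is blocked, so all robots may crawl the whole site.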

About Robots.txt Generator Tool

The Free Online Robots.txt Generator Tool is a 100% free SEO tool that quickly and easily creates robots.txt files for your website or blog. With it, you can either open and edit an existing file or create a new one to suit your needs. You can choose which search robots or crawlers to allow or disallow, and you can set a crawl-delay time with only a few clicks. If you want to use a robots.txt file on your website or blog, this SEO tool makes the job easy.

Simply select your crawling preferences, following the instructions provided with the tool, and generate a fully formed robots.txt file. You can set up any directive you want and produce a text file that is ready to use on your website or blog to improve your SEO.
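To make the process concrete, here is a minimal sketch of what a generator like this does internally: it turns the selected preferences into robots.txt directives. The function name, parameters, and defaults are illustrative assumptions, not the tool's actual code.

```python
def build_robots_txt(disallowed_dirs, blocked_agents=(), crawl_delay=None, sitemap=None):
    """Assemble a robots.txt string from crawling preferences."""
    lines = ["User-agent: *"]
    if crawl_delay is not None:
        lines.append(f"Crawl-delay: {crawl_delay}")
    for path in disallowed_dirs:
        # Paths are relative to the root and should end with a trailing slash.
        lines.append(f"Disallow: {path}")
    if not disallowed_dirs:
        lines.append("Disallow:")  # an empty value allows everything
    # A separate block for each robot refused entirely.
    for agent in blocked_agents:
        lines.append("")
        lines.append(f"User-agent: {agent}")
        lines.append("Disallow: /")
    if sitemap:
        lines.append("")
        lines.append(f"Sitemap: {sitemap}")
    return "\n".join(lines) + "\n"


print(build_robots_txt(["/cgi-bin/", "/tmp/"],
                       crawl_delay=10,
                       sitemap="https://www.example.com/sitemap.xml"))
```

Running this prints a ready-to-save robots.txt that delays all crawlers by ten seconds, blocks two directories, and points to a sitemap.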

A robots.txt file is a plain text file at the root of your website or blog that tells Google and other search engines which parts of the site they may or may not crawl. It is a set of instructions that tells crawlers what to visit and what to skip, which can help them spend their time on your site more efficiently. With a robots.txt file, you can point search engine crawlers toward your most important webpages, steer them away from duplicate pages, and ask them not to crawl specific content on your website or blog.
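For example, a file that keeps all crawlers out of a duplicate print-friendly section and an internal search results area, while leaving the rest of the site open, might look like this (the directory names and URL are placeholders):

```
User-agent: *
Disallow: /print/
Disallow: /search/

Sitemap: https://www.example.com/sitemap.xml
```

Everything not matched by a Disallow rule remains crawlable, so crawlers naturally focus on the pages you leave open.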