Robots.txt Generator


Default - All Robots are:

Crawl-Delay:

Sitemap: (leave blank if you don't have one)
Search Robots: Google
  Google Image
  Google Mobile
  MSN Search
  Yahoo
  Yahoo MM
  Yahoo Blogs
  Ask/Teoma
  GigaBlast
  DMOZ Checker
  Nutch
  Alexa/Wayback
  Baidu
  Naver
  MSN PicSearch
   
Restricted Directories: The path is relative to the root and must contain a trailing slash "/".
 
Now create a file named 'robots.txt' in your root directory, copy the generated text above, and paste it into that file.


About Robots.txt Generator

Before you use a robots.txt generator, or if you simply want to understand how one works, you need to know what a robots.txt file is. Here you will find detailed information on what a robots.txt file is, how it works, and why you need one.

In general, webmasters or website admins create a robots.txt file, which is a simple text file, to instruct web robots or web crawlers, mostly search engine robots (Googlebot for Google, Slurp for Yahoo, and many more), on how to crawl pages on their website.

This robots.txt file should be placed in the root folder of your website; with its help, search engines can index your website more accurately. Search engines including Google, Bing, and Yahoo use web crawlers, or robots, that review all the content on your site. If you don't want these robots to crawl certain parts of your website, you can say so in a robots.txt file, and compliant crawlers will skip those pages. Technically, the robots.txt file is part of the Robots Exclusion Protocol, known as REP: a group of web standards that govern how robots crawl the web, access and index files and content, and serve that content to users.
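For instance, a minimal robots.txt that blocks two directories for all crawlers might look like this (the directory names are just examples):

```
User-agent: *
Disallow: /admin/
Disallow: /tmp/
```

Any page outside those two directories remains open to crawling.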

Well, tinyseotool.com has a built-in robots.txt generator, and you can generate this file for your website by filling in the fields above.

For example, you can choose all user agents or a specific user agent (a particular web crawler or robot you are giving crawl instructions to, usually a search engine) and mark it Allowed or Refused; you can set a timer delay with the Crawl-delay option; you can point to your XML sitemap file; and you can define the paths of the directories you want to restrict.
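As a rough sketch of what such a generator does under the hood, the Python function below builds a robots.txt from the options described above. The function and parameter names are illustrative, not the tool's actual code:

```python
def generate_robots_txt(default_allowed=True, crawl_delay=None,
                        sitemap=None, refused_agents=(), restricted_dirs=()):
    """Build robots.txt text from generator-style options."""
    lines = ["User-agent: *"]
    # Default for all robots: an empty Disallow allows everything,
    # "Disallow: /" refuses everything.
    lines.append("Disallow:" if default_allowed else "Disallow: /")
    if crawl_delay is not None:
        lines.append(f"Crawl-delay: {crawl_delay}")
    # Restricted directories: paths relative to root, with trailing slash.
    for d in restricted_dirs:
        lines.append(f"Disallow: {d}")
    # Each refused user agent gets its own blocking record.
    for agent in refused_agents:
        lines.append("")
        lines.append(f"User-agent: {agent}")
        lines.append("Disallow: /")
    if sitemap:
        lines.append("")
        lines.append(f"Sitemap: {sitemap}")
    return "\n".join(lines) + "\n"

print(generate_robots_txt(crawl_delay=10,
                          sitemap="https://example.com/sitemap.xml",
                          restricted_dirs=["/cgi-bin/"]))
```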

Then just click Create Robots.txt, and the tool will generate the robots.txt file for you in the box. Copy that text and paste it into a text file.

As a reminder, web crawlers and robots will only look for the robots.txt file in the main directory, the root folder, so create the robots.txt file at your root domain. Otherwise, crawlers will assume the site has no robots.txt and will crawl all of its pages.
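You can check how a compliant crawler would interpret your rules with Python's standard urllib.robotparser module; the rules and URLs below are examples:

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
# Normally you would call rp.set_url("https://example.com/robots.txt")
# followed by rp.read(); here we parse the rule lines directly.
rp.parse([
    "User-agent: *",
    "Disallow: /admin/",
])

print(rp.can_fetch("*", "https://example.com/admin/settings"))  # blocked
print(rp.can_fetch("*", "https://example.com/blog/post-1"))     # allowed
```

This mirrors exactly what a search engine robot does: fetch /robots.txt from the root, then test each URL against the rules before crawling it.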

Last but not least, keep in mind that the filename is case sensitive, so when placing this file in your web directory, don't use ROBOTS.txt, Robots.txt, or any other casing; it must be robots.txt.

If you have any inquiries, you can contact us via the Contact Us page of this site.