Now, create a 'robots.txt' file in your root directory, then copy the text above and paste it into that file.
A Robots.txt Generator produces a file that is, in a sense, the opposite of a sitemap: a sitemap lists the pages you want included, while robots.txt tells crawlers which files and directories to skip. That is why correct robots.txt syntax matters for any website. Whenever a search engine crawls a website, it first looks for the robots.txt file at the domain root. If it finds one, the crawler reads the file and identifies the files and directories that are blocked.
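As an illustration, a minimal robots.txt might look like this (the `/private/` directory and the sitemap URL are placeholders, not required values):

```
User-agent: *
Disallow: /private/
Sitemap: https://www.example.com/sitemap.xml
```

Here `User-agent: *` addresses all crawlers, `Disallow` names a path they should not fetch, and the optional `Sitemap` line points crawlers to your sitemap.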
This useful tool has made many webmasters' lives easier by helping them make their websites Googlebot friendly. Our robots.txt file generator performs this tedious task in no time, absolutely free. It comes with a user-friendly interface that lets you choose what to include in, or exclude from, the robots.txt file.
Using our tool, you can generate a robots.txt file for your website in a few simple steps:
- By default, all robots are allowed to access your site's files; you can choose which robots to allow and which to refuse.
- Choose a crawl-delay, which tells crawlers how long to wait between requests; you can pick a delay from 5 to 120 seconds. It is set to 'no delay' by default.
- If your website already has a sitemap, paste its URL into the text box; otherwise, leave it blank.
- From the list of search robots, select the ones you want to crawl your site and refuse the ones you don't.
- The last step is to restrict directories. Each path must end with a trailing slash "/", since paths are relative to the root.
- Finally, once you have generated a Googlebot-friendly robots.txt file with our Robots.txt Generator Tool, upload it to the root directory of your website.
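Before uploading, you can sanity-check the generated rules. This sketch uses Python's standard `urllib.robotparser` module; the rules shown are hypothetical examples of what the generator might produce, not output from our tool:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules, as a generated robots.txt might contain:
# all robots are blocked from /admin/, but Googlebot gets its own
# record with an empty Disallow, which permits everything.
rules = """\
User-agent: *
Disallow: /admin/

User-agent: Googlebot
Disallow:
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot matches its specific record, so it may fetch the page
print(parser.can_fetch("Googlebot", "/admin/secret.html"))   # True
# Any other bot falls under the wildcard record and is blocked
print(parser.can_fetch("SomeOtherBot", "/admin/secret.html"))  # False
```

A quick check like this confirms that the allow/refuse choices you made in the steps above actually behave the way you intended before the file goes live.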
If you wish to explore our tool before using it, feel free to play with it and generate a sample robots.txt file.