Use this tool to create a robots.txt file for your website. Fill in the rules below, then copy or download the file and upload it to your website's root directory (e.g., https://yoursite.com/robots.txt).
Quick Presets
Crawler Rules
Global Settings
Your robots.txt
Upload this file to your website root at https://yoursite.com/robots.txt
What Is robots.txt?
The robots.txt file is a plain text file placed in the root of your website. It tells search engine crawlers (like Googlebot) which pages or directories they may or may not crawl. Note that it controls crawling, not indexing — see the note below.
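For example, a minimal robots.txt (the directory and sitemap URL here are illustrative) might block one folder for all bots and point to a sitemap:

```text
User-agent: *
Disallow: /admin/

Sitemap: https://yoursite.com/sitemap.xml
```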
Important robots.txt Rules
User-agent: * — applies the rule to all search engine bots
Disallow: /path/ — blocks crawlers from accessing that path
Allow: /path/ — explicitly permits access (useful to re-allow a specific page inside a disallowed directory)
Sitemap: — gives crawlers the absolute URL of your sitemap file
Crawl-delay: — asks a bot to wait that many seconds between requests (honored by some crawlers, such as Bingbot, but ignored by Googlebot)
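If you want to sanity-check your rules before uploading, Python's standard-library urllib.robotparser can evaluate a draft file. The rules and URLs below are illustrative, not part of this tool:

```python
from urllib import robotparser

# A draft robots.txt (illustrative rules), as it would appear at the site root.
draft = """\
User-agent: *
Disallow: /private/
Crawl-delay: 10
"""

parser = robotparser.RobotFileParser()
parser.parse(draft.splitlines())

# Blocked: this path falls under Disallow: /private/
print(parser.can_fetch("*", "https://yoursite.com/private/report.html"))  # False

# Allowed: no rule matches this path
print(parser.can_fetch("*", "https://yoursite.com/index.html"))  # True
```

This only checks how a compliant parser reads your rules; individual crawlers may interpret edge cases (such as overlapping Allow/Disallow paths) differently.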
Note: robots.txt is a request, not a guarantee — well-behaved crawlers honor it, but malicious bots may ignore it. Also, blocking a page with robots.txt does not remove it from Google's index: Google can still index a blocked URL if other pages link to it. To keep a page out of search results, use a noindex meta tag instead.
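For noindex to work, the page must remain crawlable (not blocked in robots.txt) so the crawler can see the tag in the page's head:

```html
<meta name="robots" content="noindex">
```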