robots.txt Checker
Fetch, parse, and validate any website's robots.txt file. Check if you're blocking Googlebot, find missing sitemaps, and view crawl rules for every user agent.
What does this tool check?
Googlebot & Bingbot Access
Instantly flags if your robots.txt accidentally blocks Google or Bing crawlers, which can remove your site from search results.
Crawl Rules by Agent
Parses all User-agent blocks and shows every Allow, Disallow, and Crawl-delay directive in a readable format.
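As a rough sketch of what that parsing involves (illustrative Python only, not this tool's actual implementation): consecutive User-agent lines form a group, and the directives below them apply to every agent in that group.

```python
# Minimal sketch: group Allow/Disallow/Crawl-delay directives by User-agent.
# All names here are illustrative, not the checker's real code.

def parse_robots(text: str) -> dict:
    rules = {}            # agent -> list of (directive, value)
    agents = []           # agents the current group applies to
    last_was_agent = False
    for raw in text.splitlines():
        line = raw.split("#", 1)[0].strip()   # drop comments
        if not line or ":" not in line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            if not last_was_agent:
                agents = []       # a directive ended the last group
            agents.append(value)
            rules.setdefault(value, [])
            last_was_agent = True
        elif field in ("allow", "disallow", "crawl-delay"):
            for agent in agents:  # directives before any User-agent are ignored
                rules[agent].append((field, value))
            last_was_agent = False
    return rules
```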
Sitemap Detection
Lists all Sitemap URLs declared in robots.txt so you can verify search engines can find your sitemap.
Raw File View
Shows the complete raw robots.txt content so you can spot formatting issues, comments, or unexpected directives.
Frequently Asked Questions
What is robots.txt?
robots.txt is a text file placed at the root of a website (e.g., https://example.com/robots.txt) that tells search engine crawlers which pages or sections they are allowed or not allowed to crawl and index.
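For illustration, a minimal robots.txt might look like this (the domain and paths are placeholders):

```text
# Comments start with "#"
User-agent: *            # the rules below apply to all crawlers
Disallow: /admin/        # do not crawl anything under /admin/
Allow: /admin/public/    # ...except this subtree
Crawl-delay: 10          # non-standard directive; Google ignores it

Sitemap: https://example.com/sitemap.xml
```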
How do I stop Google from indexing my site?
Add the following to your robots.txt file:

User-agent: *
Disallow: /

This blocks all compliant crawlers from every page. To block only Googlebot, use "User-agent: Googlebot" instead of "*". Note that robots.txt controls crawling, not indexing: URLs Google has already discovered (for example, through external links) can still appear in search results. To reliably remove pages from the index, use a noindex meta tag or X-Robots-Tag header on pages that remain crawlable.
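You can sanity-check the effect of such a block with Python's standard-library robots.txt parser:

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])

# Under this policy, no agent may fetch any URL on the site.
print(rp.can_fetch("Googlebot", "https://example.com/any-page"))  # False
```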
Does robots.txt affect SEO?
Yes. Incorrectly configured robots.txt can prevent Google from crawling important pages, hurting your search rankings. Common mistakes include accidentally blocking Googlebot, blocking CSS/JS files (which prevents Google from rendering your pages), or forgetting to declare your sitemap.
What is Disallow: / in robots.txt?
"Disallow: /" means the entire website is blocked for that user-agent. If applied to "User-agent: *", no search engine crawler can access any page on the site. This is one of the most common SEO mistakes — often introduced accidentally during development or migration.
How do I add a sitemap to robots.txt?
Add a Sitemap directive to your robots.txt file (by convention, at the end):

Sitemap: https://example.com/sitemap.xml

You can include multiple Sitemap lines if you have more than one sitemap. This helps search engines find all your pages more efficiently.
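Extracting the declared sitemaps is straightforward; here is a small illustrative helper (not part of this tool) that collects every Sitemap URL from a robots.txt body:

```python
def sitemap_urls(text: str) -> list:
    """Return all Sitemap URLs declared in a robots.txt body."""
    urls = []
    for line in text.splitlines():
        # partition on the first ":" only, so the URL's own colon survives
        field, _, value = line.partition(":")
        if field.strip().lower() == "sitemap":
            urls.append(value.strip())
    return urls
```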
Monitor your robots.txt for changes
Uptrue's robots.txt change monitoring alerts you the moment a deploy modifies your crawl rules. Pair it with sitemap validity monitoring so a broken sitemap reference never silently undermines your indexing.
Start Monitoring Free