robots.txt
Fetch and parse robots.txt for any website. Analyze crawl rules, user-agent directives, sitemaps, and detect SEO configuration issues.
API
POST /api/v1/web/robots
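A minimal sketch of calling this endpoint from TypeScript. Only the path `POST /api/v1/web/robots` comes from the documentation above; the host, the JSON request body (a `url` field), and the response fields are assumptions for illustration.

```typescript
// Hypothetical response shape -- the actual fields returned by the API may differ.
interface RobotsCheckResponse {
  userAgents?: { name: string; allow: string[]; disallow: string[] }[];
  sitemaps?: string[];
  issues?: string[];
}

async function checkRobots(targetUrl: string): Promise<RobotsCheckResponse> {
  // Assumed base URL and request body; adjust to the real API host and schema.
  const res = await fetch("https://example.com/api/v1/web/robots", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ url: targetUrl }),
  });
  if (!res.ok) {
    throw new Error(`robots.txt check failed: HTTP ${res.status}`);
  }
  return (await res.json()) as RobotsCheckResponse;
}
```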
What is robots.txt?
The robots.txt checker fetches and parses the robots.txt file for any website. It analyzes crawl directives, user-agent rules, disallow patterns, and sitemap references. Understanding robots.txt is critical for SEO because it controls which parts of your site search engine crawlers are allowed to fetch, which in turn affects what can be crawled and indexed.
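To make the parsing step concrete, here is a simplified sketch of what such a checker does internally: fetch `/robots.txt`, group `Allow`/`Disallow` rules under their `User-agent` lines, and collect `Sitemap` entries. This is not the tool's actual implementation, just an illustration of the parsing logic under common robots.txt grouping rules.

```typescript
interface RobotsRules {
  groups: Record<string, { allow: string[]; disallow: string[] }>;
  sitemaps: string[];
}

async function parseRobots(origin: string): Promise<RobotsRules> {
  // Fetch the robots.txt file from the site root.
  const res = await fetch(new URL("/robots.txt", origin));
  const text = await res.text();

  const rules: RobotsRules = { groups: {}, sitemaps: [] };
  let currentAgents: string[] = [];
  let inAgentBlock = false; // consecutive User-agent lines share one rule group

  for (const rawLine of text.split(/\r?\n/)) {
    const line = rawLine.split("#")[0].trim(); // strip comments and whitespace
    if (!line) continue;

    const idx = line.indexOf(":");
    if (idx === -1) continue;
    const field = line.slice(0, idx).trim().toLowerCase();
    const value = line.slice(idx + 1).trim();

    if (field === "user-agent") {
      if (!inAgentBlock) currentAgents = []; // a new group starts here
      inAgentBlock = true;
      const agent = value.toLowerCase();
      currentAgents.push(agent);
      rules.groups[agent] ??= { allow: [], disallow: [] };
    } else {
      inAgentBlock = false;
      if (field === "allow" || field === "disallow") {
        for (const agent of currentAgents) {
          rules.groups[agent][field].push(value);
        }
      } else if (field === "sitemap") {
        rules.sitemaps.push(value);
      }
    }
  }
  return rules;
}
```

From a structure like this, a checker can flag common configuration issues, for example a blanket `Disallow: /` under `User-agent: *`, rules that never apply because the user-agent name is misspelled, or a missing `Sitemap` reference.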