🤖 Robots.txt Analyzer

Parse and analyze robots.txt files for SEO and crawl configuration.
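
For orientation, here is a small robots.txt exercising each directive covered below; the domain and paths are placeholders:

    # Illustrative example - all paths and URLs are made up
    User-agent: *
    Disallow: /admin/
    Allow: /admin/help.html
    Crawl-delay: 5

    Sitemap: https://example.com/sitemap.xml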

Robots.txt Directives

User-agent

Specifies which crawler a group of rules applies to. Use * to match all bots. A crawler obeys only the group with the most specific matching name.
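
A sketch of per-bot grouping (Googlebot is a real user-agent token; the paths are illustrative):

    User-agent: Googlebot    # rules for Google's crawler only
    Disallow: /no-google/

    User-agent: *            # fallback group for every other bot
    Disallow: /tmp/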

Disallow

Tells crawlers not to fetch URLs beginning with the given path prefix. An empty value disallows nothing, so everything may be crawled. Note that robots.txt is advisory, not an access control.
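
For example, with made-up paths:

    User-agent: *
    Disallow: /admin/    # nothing under /admin/ should be crawled

    User-agent: Googlebot
    Disallow:            # empty value: Googlebot may crawl everything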

Allow

Explicitly permits paths, useful for carving out exceptions inside a disallowed directory. When Allow and Disallow both match a URL, major crawlers apply the more specific (longer) rule.
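
For instance, to open up one file inside an otherwise blocked directory (paths are placeholders):

    User-agent: *
    Disallow: /private/
    Allow: /private/press-kit.html    # more specific rule wins for this URL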

Sitemap

Points to an XML sitemap. The value must be an absolute URL, multiple Sitemap lines are allowed, and the directive is independent of any User-agent group.
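
Both lines below are valid together; the URLs are placeholders:

    Sitemap: https://example.com/sitemap.xml
    Sitemap: https://example.com/news-sitemap.xml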

Crawl-delay

Number of seconds a crawler should wait between requests. Not universally supported: Google ignores it, while Bing and Yandex honor it.
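
An example aimed at a bot known to honor the directive:

    User-agent: Bingbot
    Crawl-delay: 10    # ask for at least 10 seconds between fetches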

Wildcards

* matches any sequence of characters; $ anchors a pattern to the end of the URL.
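
Two common patterns, with illustrative paths:

    User-agent: *
    Disallow: /*.pdf$           # $ anchors the match: blocks URLs ending in .pdf
    Disallow: /*?sessionid=     # * spans any characters: blocks any URL carrying this parameter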