Robots.txt Analyzer
Enter a domain to fetch and parse its robots.txt file. The tool breaks down the rules by user agent, showing allowed and disallowed paths, crawl delay settings, and sitemap references. Useful for SEO auditing and verifying that search engines can access the pages you want indexed.
What this tool checks
- Per-user-agent rule breakdown
- Allow and Disallow directive listing
- Crawl-delay detection
- Sitemap URL extraction
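The checks above boil down to grouping directives by user agent. This is a minimal Python sketch of that parsing step, not the tool's actual implementation; real robots.txt files have edge cases (wildcards, overlapping groups) it does not handle.

```python
# Minimal robots.txt parser sketch: groups Allow/Disallow rules per user
# agent, records Crawl-delay, and collects Sitemap URLs.

def parse_robots(text):
    rules, sitemaps = {}, []
    agents, seen_rule = [], False
    for raw in text.splitlines():
        line = raw.split("#", 1)[0].strip()   # drop comments and blanks
        if not line or ":" not in line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            if seen_rule:                      # a rule ended the previous group
                agents, seen_rule = [], False
            agents.append(value)
            rules.setdefault(value, {"allow": [], "disallow": [], "crawl-delay": None})
        elif field in ("allow", "disallow") and agents:
            seen_rule = True
            for a in agents:
                rules[a][field].append(value)
        elif field == "crawl-delay" and agents:
            seen_rule = True
            for a in agents:
                rules[a]["crawl-delay"] = float(value)
        elif field == "sitemap":               # Sitemap applies file-wide
            sitemaps.append(value)
    return rules, sitemaps

sample = """\
User-agent: *
Disallow: /private/
Allow: /private/public-page.html
Crawl-delay: 10
Sitemap: https://example.com/sitemap.xml
"""
rules, sitemaps = parse_robots(sample)
```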
Automate this with the API
Run this tool programmatically from your code. Get a free temporary API key with 20 requests/day — or register for 75 requests/day.
curl "https://apixies.io/api/v1/robots-txt?domain=..." \
-H "X-API-Key: YOUR_API_KEY"
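The same call can be made from Python with the standard library. The endpoint, the domain query parameter, and the X-API-Key header come from the curl example above; the shape of the JSON response is an assumption.

```python
# Sketch of calling the Robots.txt Analyzer endpoint from Python.
import json
import urllib.parse
import urllib.request

API_KEY = "YOUR_API_KEY"

def build_request(domain):
    # Build the GET request with the domain as a query parameter.
    query = urllib.parse.urlencode({"domain": domain})
    url = f"https://apixies.io/api/v1/robots-txt?{query}"
    return urllib.request.Request(url, headers={"X-API-Key": API_KEY})

def analyze(domain):
    # Send the request and decode the JSON body (fields are assumed).
    with urllib.request.urlopen(build_request(domain), timeout=10) as resp:
        return json.load(resp)

req = build_request("example.com")
```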
Frequently asked questions
Does robots.txt actually block crawlers?
No. robots.txt is a voluntary standard: reputable crawlers such as Googlebot honor it, but the server does not enforce it. To actually prevent access, use authentication or server-level rules.
Should I block /admin in robots.txt?
Usually not. Listing /admin advertises the path to anyone who reads the file, and robots.txt does not stop a malicious visitor. Protect admin pages with authentication, and use a noindex directive if you want them kept out of search results.
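Enforcement happens in the crawler, not on the server. Python's standard urllib.robotparser shows how a compliant crawler applies the rules before fetching a URL; the domain and paths below are illustrative.

```python
# A compliant crawler checks robots.txt voluntarily before fetching.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /admin/",
])

# The blocked path is refused; everything else is allowed by default.
print(rp.can_fetch("*", "https://example.com/admin/settings"))  # False
print(rp.can_fetch("*", "https://example.com/blog/post"))       # True
```

Note that nothing stops a non-compliant client from skipping this check entirely, which is why robots.txt is not an access control mechanism.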
Related tools
Free SSL Certificate Checker
Enter a domain to inspect its SSL/TLS certificate. You'll see the issuer, validity dates, days until expiry, protocol version, and whether the certificate chain is healthy. Useful for catching expiring certificates before they cause browser warnings.
Security Headers Checker
Paste a URL to analyze its HTTP security headers. The tool checks for Content-Security-Policy, Strict-Transport-Security, X-Frame-Options, and other headers that protect against common web attacks. You'll get a grade and a list of missing protections.
Free Email Validator
Enter an email address to validate it. The tool checks format syntax, resolves MX records to verify the domain accepts mail, detects disposable email services (like Mailinator), and flags role-based addresses (like info@ or admin@). Useful for cleaning mailing lists or validating form submissions.
Email Authentication Checker (SPF, DKIM, DMARC)
Enter a domain to check its email authentication configuration. The tool validates SPF records (who can send on your behalf), DKIM records (email signatures), and DMARC policies (what to do with unauthenticated mail). Misconfigured authentication is the top reason emails land in spam.