Troubleshooting
Crawl Failures
Understand why crawls fail and how to fix common issues.
Updated December 9, 2025
Why Crawls Fail
Firewall Blocking
Your site's firewall blocks our crawler:
- Add our crawler's IP addresses to your allowlist
- Temporarily disable bot protection while the crawl runs
- Check with your hosting provider
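How you allowlist the crawler depends on your server. As one sketch, an nginx setup might allow the crawler's IP range ahead of its bot-blocking rules. The range below (203.0.113.0/24) is a placeholder, not our actual address range; substitute the IPs from your account's crawler settings.

```nginx
location / {
    # Placeholder range — replace with the crawler's published IPs.
    allow 203.0.113.0/24;
    # ... your existing deny / bot-protection rules follow ...
}
```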
Robots.txt Restrictions
Site blocks crawlers in robots.txt:
- Check yoursite.com/robots.txt
- Look for Disallow: /
- Add an exception for our bot:

User-agent: LinkHealthBot
Allow: /
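If you're unsure whether your rules permit the bot, Python's standard urllib.robotparser can evaluate a robots.txt locally before you deploy it. A minimal sketch, using the LinkHealthBot user agent from the rules above (the example URL is illustrative):

```python
from urllib.robotparser import RobotFileParser

# Paste your robots.txt contents here to test them locally.
rules = """
User-agent: LinkHealthBot
Allow: /

User-agent: *
Disallow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# The specific Allow for LinkHealthBot overrides the blanket Disallow.
print(parser.can_fetch("LinkHealthBot", "https://yoursite.com/page"))  # True
print(parser.can_fetch("OtherBot", "https://yoursite.com/page"))       # False
```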
Site Requires Login
Pages need authentication:
- We can't crawl password-protected areas
- Make pages public temporarily
- Or remove from crawl scope
Timeout Errors
The site is too slow to respond:
- Check server performance
- Optimize slow pages
- Contact hosting provider
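To check whether slow responses are the culprit, you can time a fetch yourself. A minimal sketch using only Python's standard library; the timeout value is an assumption for illustration, not our crawler's actual cutoff:

```python
import time
import urllib.request

def response_time(url, timeout=30):
    """Return seconds to fetch the first byte of url, or None on timeout/error."""
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            resp.read(1)
    except OSError:
        return None  # timed out, refused, or otherwise unreachable
    return time.perf_counter() - start

# Responses that take more than a few seconds are likely to hit crawl timeouts.
elapsed = response_time("https://yoursite.com/", timeout=5)
print("unreachable" if elapsed is None else f"{elapsed:.2f}s")
```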
Retrying Failed Crawls
- Click on failed crawl
- Check error message
- Fix underlying issue
- Click Retry Crawl
Retrying uses a crawl credit. Fix the issue first!
Getting Refunds
A crawl qualifies for a refund if it:
- Was blocked by a firewall
- Found 3 or fewer links
- Failed due to an obvious technical issue
Contact support for refund request.