Overview
Crawls are automated scans that discover and analyze every link on your domain. Start crawls manually when you need immediate results, or schedule them to run automatically.

Starting a Manual Crawl
Configure Options
Select your scan options:
- SSL Check
- SEO Analysis
- Anchor Text Analysis
- Response Trends
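If you trigger crawls from a script rather than the dashboard, the options above map naturally onto a request payload. The sketch below is purely illustrative: the endpoint, auth header, option names, and response shape are assumptions, not a documented API.

```python
# Hypothetical sketch of starting a manual crawl over HTTP. The endpoint,
# auth header, option names, and response field are all assumptions.
import requests

def start_crawl(api_key: str, start_url: str) -> str:
    resp = requests.post(
        "https://api.example.com/v1/crawls",    # hypothetical endpoint
        headers={"Authorization": f"Bearer {api_key}"},
        json={
            "start_url": start_url,
            "options": {
                "ssl_check": True,              # SSL Check
                "seo_analysis": True,           # SEO Analysis
                "anchor_text_analysis": False,  # Anchor Text Analysis
                "response_trends": False,       # Response Trends
            },
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["crawl_id"]  # assumed response field
```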
Crawl Progress
While a crawl is running, the progress view shows:
| Metric | Description |
|---|---|
| URLs Discovered | Total unique URLs found |
| URLs Checked | URLs fully processed |
| Queue Size | URLs waiting to be checked |
| Errors Found | Broken links detected so far |
| Elapsed Time | How long the crawl has been running |
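As a rough sketch of consuming those metrics programmatically, assuming a hypothetical status endpoint whose JSON fields mirror the table above:

```python
# Poll a running crawl until it finishes, printing progress metrics.
# The endpoint and field names are assumptions mirroring the table.
import time
import requests

def wait_for_crawl(api_key: str, crawl_id: str, poll_seconds: int = 15) -> dict:
    while True:
        resp = requests.get(
            f"https://api.example.com/v1/crawls/{crawl_id}",  # hypothetical
            headers={"Authorization": f"Bearer {api_key}"},
            timeout=10,
        )
        resp.raise_for_status()
        p = resp.json()
        print(f"checked {p['urls_checked']}/{p['urls_discovered']}, "
              f"queue {p['queue_size']}, errors {p['errors_found']}")
        if p.get("status") in ("complete", "cancelled"):
            return p
        time.sleep(poll_seconds)
```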
Live Feed
The activity feed shows URLs as they are processed, in real time.

Scheduling Crawls
Automated crawls run on a schedule without manual intervention.

Setting Up a Schedule
Select Frequency
Choose how often to run crawls:
| Plan | Available Frequencies |
|---|---|
| Free | Manual only |
| Solo | Weekly |
| Pro | Daily, Weekly |
| Agency | Hourly, Daily, Weekly |
Configure Time
Select when crawls should run:
- Time: Hour of day (in your timezone)
- Day: Day of week (for weekly)
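To make the timing concrete, here is a minimal standard-library sketch of resolving a weekly schedule to its next run. It assumes the local timezone stands in for "your timezone":

```python
# Resolve "weekly at <hour> on <weekday>" to the next run time.
# weekday follows datetime.weekday(): Monday=0 ... Sunday=6.
from datetime import datetime, timedelta

def next_weekly_run(hour: int, weekday: int, now: datetime | None = None) -> datetime:
    now = now or datetime.now()
    candidate = now.replace(hour=hour, minute=0, second=0, microsecond=0)
    candidate += timedelta(days=(weekday - candidate.weekday()) % 7)
    if candidate <= now:                    # already passed: roll to next week
        candidate += timedelta(days=7)
    return candidate

print(next_weekly_run(hour=2, weekday=0))  # next Monday at 02:00 local time
```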
Schedule Options
| Setting | Description |
|---|---|
| Frequency | How often crawls run |
| Time | Preferred hour to start |
| Day | For weekly: which day |
| Notify on Complete | Email when crawl finishes |
| Notify on New Issues | Email only if new problems found |
Daily Crawl Limits
Each plan has limits on how many crawls you can run per day:
| Plan | Daily Crawl Limit |
|---|---|
| Free | 2 crawls |
| Solo | 5 crawls |
| Pro | 10 crawls |
| Agency | 25 crawls |
Scheduled crawls count toward your daily limit, so plan your schedules accordingly.
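A sketch of the quota arithmetic, with limits mirroring the table above: a Pro account whose daily scheduled crawl has already run has 9 manual crawls left for the day.

```python
# Daily-limit guard: scheduled and manual crawls share one per-day quota.
DAILY_LIMITS = {"free": 2, "solo": 5, "pro": 10, "agency": 25}

def crawls_remaining(plan: str, started_today: int) -> int:
    return max(DAILY_LIMITS[plan] - started_today, 0)

print(crawls_remaining("pro", started_today=1))  # 9
```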
Crawl Depth
Crawl depth determines how many “links away” from the start URL the crawler will go:
| Depth | Description | Example |
|---|---|---|
| 1 | Start page only | Homepage |
| 2 | Start + direct links | Homepage + linked pages |
| 3 | Start + two levels of links | Most small sites |
| 5 | Medium depth | Medium sites |
| 10 | Deep crawl | Large sites |
| Unlimited | Follow all links | Complete site audit |
Depth by Plan
| Plan | Maximum Depth |
|---|---|
| Free | 2 levels |
| Solo | 5 levels |
| Pro | 10 levels |
| Agency | Unlimited |
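Conceptually, depth-limited crawling is a breadth-first traversal that stops following links past the depth limit. The sketch below is a simplified model of that behavior (naive regex link extraction, same-site filtering), not the service's actual crawler; the max_urls parameter models the per-crawl URL cap covered in the next section.

```python
# Breadth-first, depth-limited link discovery. Depth 1 is the start page
# only; each additional level follows one more hop of links.
import re
from collections import deque
from urllib.parse import urljoin
import requests

def discover(start_url: str, max_depth: int, max_urls: int = 100) -> set[str]:
    seen = {start_url}
    queue = deque([(start_url, 1)])            # (url, depth)
    while queue:
        url, depth = queue.popleft()
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue                            # unreachable: recorded as an error
        if depth >= max_depth:
            continue                            # depth limit: do not follow links
        for href in re.findall(r'href="([^"#]+)"', html):
            link = urljoin(url, href)
            if link.startswith(start_url) and link not in seen:
                if len(seen) >= max_urls:       # URL cap: stop with partial results
                    return seen
                seen.add(link)
                queue.append((link, depth + 1))
    return seen
```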
URL Limits
Each crawl has a maximum number of URLs it will check:
| Plan | Max URLs per Crawl |
|---|---|
| Free | 100 |
| Solo | 1,000 |
| Pro | 10,000 |
| Agency | Unlimited |
When a crawl hits its URL limit:
- The crawl completes with partial results
- The dashboard shows “URL limit reached”
- Results still include all checked URLs
Canceling a Crawl
To stop a running crawl:
- Go to the crawl progress page
- Click Cancel Crawl
- Confirm cancellation
A cancelled crawl:
- Stops immediately
- Keeps results for URLs already checked
- Counts toward daily limit
- Shows status as “Cancelled”
Crawl Best Practices
Start Small
Begin with limited depth to gauge site size before running deep crawls.
Off-Peak Hours
Schedule crawls during low-traffic periods to minimize server impact.
Regular Cadence
Weekly crawls catch issues quickly without excessive server load.
After Deployments
Run a crawl after major changes to catch new broken links.
Crawl Queue
When multiple crawls are requested, they queue in order:
- One crawl runs at a time per account
- Queued crawls start automatically when the previous one completes
- You can reorder or cancel queued crawls
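A minimal model of that behavior, a FIFO queue with a single running slot, might look like:

```python
# One crawl runs at a time; the rest wait in FIFO order, and the next one
# starts automatically when the current crawl completes.
from collections import deque

class CrawlQueue:
    def __init__(self) -> None:
        self.running: str | None = None
        self.pending: deque[str] = deque()

    def request(self, crawl_id: str) -> None:
        if self.running is None:
            self.running = crawl_id            # idle: start immediately
        else:
            self.pending.append(crawl_id)      # busy: wait in line

    def complete(self) -> None:
        self.running = self.pending.popleft() if self.pending else None

    def cancel_queued(self, crawl_id: str) -> None:
        self.pending.remove(crawl_id)          # queued crawls can be cancelled

q = CrawlQueue()
q.request("crawl-1")
q.request("crawl-2")
q.complete()
print(q.running)  # crawl-2 started automatically
```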
Handling Large Sites
For sites with 10,000+ pages:

Use Section Crawling
Instead of crawling the entire site:
- Set start URL to a section (e.g., /blog)
- Crawl each section separately
- Combine results for full picture
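A sketch of the combine step, where crawl_section() is a hypothetical stand-in for starting a crawl rooted at one section and collecting its broken links:

```python
# Crawl each section separately, then merge broken-link results into one
# site-wide set. crawl_section() is a placeholder, not a real API.
def crawl_section(section_url: str) -> set[str]:
    # placeholder: start a crawl with section_url as the start URL and
    # return the broken links it reports
    return set()

sections = ["https://example.com/blog",
            "https://example.com/docs",
            "https://example.com/shop"]

all_broken: set[str] = set()
for section in sections:
    all_broken |= crawl_section(section)
print(f"{len(all_broken)} broken links across {len(sections)} sections")
```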
Sitemap-Based Crawling
Use your sitemap as the URL source:
- Set start URL to /sitemap.xml
- Crawler extracts all URLs from sitemap
- More efficient than discovery-based crawling
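Extracting sitemap URLs yourself is straightforward; here is a minimal sketch using the standard library plus requests. It handles flat <urlset> files only, not sitemap index files:

```python
# Pull every <loc> URL out of a flat sitemap. Sitemap index files, which
# point at child sitemaps, are not handled here.
import xml.etree.ElementTree as ET
import requests

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_url: str) -> list[str]:
    root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
    return [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

for url in sitemap_urls("https://example.com/sitemap.xml"):
    print(url)
```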
Incremental Crawling (Agency)
Only check pages that changed:
- Enable incremental mode in settings
- Crawls check new/changed pages
- Significantly faster for large sites
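One common way to implement "only check changed pages" is HTTP conditional requests; the sketch below assumes that approach, though the product may use a different change-detection mechanism.

```python
# Skip pages that return 304 Not Modified since the last crawl, using
# ETags cached from the previous run. A dict stands in for real storage.
import requests

def changed_since_last_crawl(url: str, etag_cache: dict[str, str]) -> bool:
    headers = {}
    if url in etag_cache:
        headers["If-None-Match"] = etag_cache[url]  # ETag from last crawl
    resp = requests.get(url, headers=headers, timeout=10)
    if resp.status_code == 304:
        return False                                # unchanged: skip re-check
    if etag := resp.headers.get("ETag"):
        etag_cache[url] = etag                      # remember for next crawl
    return True
```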