Crawler Logs
Monitor which bots visit your site, which pages they crawl and how often
Crawler Logs automatically records every bot and crawler visit to your site, giving you full visibility into how search engines crawl your content. Available on the Mid Plan.
What is Recorded
Each log entry contains:
| Field | Description |
|---|---|
| **Bot / Crawler** | Identified name (Googlebot, Bingbot, DuckDuckBot, etc.) |
| **Visited URL** | Specific page the bot crawled |
| **Timestamp** | Exact date and time of the visit |
| **Full User Agent** | Complete user agent string sent by the crawler |
| **Crawl Frequency** | How often the bot returns to that URL |
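A quick way to make these fields concrete is to pull the latest entries straight from the log table. The column names used below (bot_name, visited_url, visited_at, user_agent) are assumptions for illustration, reused in the sketches that follow; check the actual table schema before relying on them.

```php
global $wpdb;

// Fetch the ten most recent crawler log entries.
// Column names are assumed (bot_name, visited_url, visited_at, user_agent);
// verify them against the real wp_baseo_crawler_logs schema.
$latest = $wpdb->get_results(
    "SELECT bot_name, visited_url, visited_at, user_agent
       FROM wp_baseo_crawler_logs
      ORDER BY visited_at DESC
      LIMIT 10"
);
```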
Use Cases
Verify Google is Indexing
Check that Googlebot visits your new and updated pages after publishing them. If a page does not appear in the logs 2–3 weeks after publication, there may be a crawling issue preventing it from being indexed.
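A sketch of that check, using the assumed column names from above and a hypothetical URL, could look like this:

```php
global $wpdb;

// Most recent Googlebot visit to a recently published page.
// Returns NULL if the page has never been crawled by Googlebot.
$last_visit = $wpdb->get_var( $wpdb->prepare(
    "SELECT MAX(visited_at)
       FROM wp_baseo_crawler_logs
      WHERE bot_name = %s
        AND visited_url = %s",
    'Googlebot',
    '/my-new-post/' // hypothetical URL used for illustration
) );
```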
Identify Most-Crawled Pages
Discover which content attracts the most attention from crawlers; this is typically the most valuable or most-linked content on your site.
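A simple aggregation over the log table, again using the assumed column names, surfaces those pages:

```php
global $wpdb;

// Top 20 most-crawled URLs over the last 30 days (assumed column names).
$top_pages = $wpdb->get_results(
    "SELECT visited_url, COUNT(*) AS visits
       FROM wp_baseo_crawler_logs
      WHERE visited_at >= DATE_SUB( NOW(), INTERVAL 30 DAY )
      GROUP BY visited_url
      ORDER BY visits DESC
      LIMIT 20"
);
```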
Detect Malicious Bots
Identify unknown user agents or bots with unusual crawl patterns that may be scraping your content.
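One way to surface suspects, sketched with the assumed column names, is to count visits per user agent while excluding the crawlers you already recognize:

```php
global $wpdb;

// Visit counts per user agent over the last 7 days, excluding known bots
// (assumed column names; the known-bot list is illustrative, not exhaustive).
$suspects = $wpdb->get_results(
    "SELECT user_agent, COUNT(*) AS hits
       FROM wp_baseo_crawler_logs
      WHERE visited_at >= DATE_SUB( NOW(), INTERVAL 7 DAY )
        AND bot_name NOT IN ( 'Googlebot', 'Bingbot', 'DuckDuckBot' )
      GROUP BY user_agent
      ORDER BY hits DESC"
);
```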
Crawl Timing
Knowing when Googlebot usually visits lets you schedule important publications just before its typical crawl times.
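A per-hour breakdown of Googlebot visits, sketched with the assumed column names, reveals that window:

```php
global $wpdb;

// Googlebot visits grouped by hour of day (assumed column names).
$by_hour = $wpdb->get_results(
    "SELECT HOUR( visited_at ) AS hour_of_day, COUNT(*) AS visits
       FROM wp_baseo_crawler_logs
      WHERE bot_name = 'Googlebot'
      GROUP BY hour_of_day
      ORDER BY hour_of_day"
);
```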
Debug Indexing Issues
If a page is not appearing in Google, the logs tell you whether Googlebot is visiting the page but not indexing it, or not visiting it at all.
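The decision splits on whether any Googlebot visits were logged for the page. A rough sketch of that check, using the assumed column names and a placeholder URL:

```php
global $wpdb;

// Googlebot visits to a problem page over the last 30 days
// (assumed column names; '/problem-page/' is a placeholder URL).
$visits = (int) $wpdb->get_var( $wpdb->prepare(
    "SELECT COUNT(*)
       FROM wp_baseo_crawler_logs
      WHERE bot_name = %s
        AND visited_url = %s
        AND visited_at >= DATE_SUB( NOW(), INTERVAL 30 DAY )",
    'Googlebot',
    '/problem-page/'
) );

if ( 0 === $visits ) {
    // Never crawled: check internal links, the XML sitemap and robots.txt.
} else {
    // Crawled but not indexed: check noindex tags, canonicals and content quality.
}
```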
Database
Logs are stored in the `wp_baseo_crawler_logs` table in your WordPress database. Data persists between sessions and can be queried with filters for bot, URL and date range.
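A filtered query of that kind, sketched with the assumed column names and placeholder filter values, could look like this:

```php
global $wpdb;

// Bingbot visits to a specific URL within a date range
// (assumed column names; bot, URL and dates are placeholder filter values).
$rows = $wpdb->get_results( $wpdb->prepare(
    "SELECT bot_name, visited_url, visited_at, user_agent
       FROM wp_baseo_crawler_logs
      WHERE bot_name = %s
        AND visited_url = %s
        AND visited_at BETWEEN %s AND %s
      ORDER BY visited_at DESC",
    'Bingbot',
    '/blog/my-post/',
    '2024-01-01 00:00:00',
    '2024-01-31 23:59:59'
) );
```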