Robots.txt Manager
Edit and validate your robots.txt directly from the WordPress dashboard
The Robots.txt Manager lets you edit and maintain your robots.txt file directly from the WordPress admin panel, without needing FTP or server access. Available on the Basic Plan.
What is robots.txt?
The robots.txt file is the first file that Google, Bing, and other search engine crawlers request when they visit your site. It tells them which parts of your site they may and may not crawl.
An error in robots.txt can block your entire site from search results — which is why it is critical to have an editor with built-in validation.
Features
- Direct editor — Edit robots.txt from WordPress Admin, no FTP needed
- Syntax validation — Detects errors before saving
- Full directive support:
| Directive | Function |
|---|---|
| `User-agent` | Specifies which bot the rule applies to |
| `Disallow` | Blocks access to specific paths |
| `Allow` | Explicitly allows access to blocked sub-paths |
| `Crawl-delay` | Sets the wait time between requests |
| `Sitemap` | Indicates the XML sitemap URL |
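You can experiment with how these directives interact using Python's standard-library robots.txt parser (this is just an illustration of directive semantics, not the plugin's own validator; `yoursite.com` is a placeholder). Note that Python's parser applies rules in file order, so the `Allow` exception is listed before the `Disallow` it overrides:

```python
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# /wp-admin/ is blocked, but the AJAX endpoint and the rest of the site stay reachable
print(parser.can_fetch("*", "https://yoursite.com/wp-admin/"))                # False
print(parser.can_fetch("*", "https://yoursite.com/wp-admin/admin-ajax.php"))  # True
print(parser.can_fetch("*", "https://yoursite.com/blog/"))                    # True
```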
Example of a Correct robots.txt

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

User-agent: Googlebot
Allow: /

Sitemap: https://yoursite.com/sitemap.xml
```

Protection Against Errors
The validator automatically detects configurations that could block the entire site, such as:
```
User-agent: *
Disallow: /   ← BLOCKS THE ENTIRE SITE
```

> Tip: After making changes, use Google Search Console → robots.txt Tester tool to verify that Google bots can access your important pages.
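A check for this specific footgun can be sketched in a few lines. This is an illustrative stand-alone function, not the plugin's actual validation code; it only looks for a bare `Disallow: /` inside a `User-agent: *` group:

```python
def blocks_entire_site(robots_txt: str) -> bool:
    """Detect the dangerous 'User-agent: *' + 'Disallow: /' combination."""
    applies_to_all = False  # does the current group include 'User-agent: *'?
    in_rules = False        # have we seen rule lines in the current group?
    for raw in robots_txt.splitlines():
        line = raw.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            if in_rules:  # a User-agent after rules starts a new group
                applies_to_all, in_rules = False, False
            if value == "*":
                applies_to_all = True
        elif field in ("disallow", "allow", "crawl-delay"):
            in_rules = True
            if field == "disallow" and value == "/" and applies_to_all:
                return True
    return False


print(blocks_entire_site("User-agent: *\nDisallow: /"))           # True
print(blocks_entire_site("User-agent: *\nDisallow: /wp-admin/"))  # False
```

A real validator checks more than this (unknown fields, malformed paths, a missing `User-agent` before rules), but the structure is the same: walk the file group by group and flag rules whose scope is wider than intended.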