robots.txt for Drupal
Drupal-optimized robots.txt template. Blocks admin, system files, and internal paths while allowing content pages.
robots.txt
User-agent: *
Disallow: /admin/
Disallow: /user/
Disallow: /core/
Disallow: /modules/
Disallow: /themes/
Disallow: /profiles/
Disallow: /install.php
Disallow: /update.php
Disallow: /cron.php
Disallow: /xmlrpc.php
Disallow: /filter/tips
Disallow: /node/add/
Disallow: /search/
Disallow: /comment/reply/
Allow: /core/*.css$
Allow: /core/*.js$
Allow: /modules/*.css$
Allow: /modules/*.js$

Sitemap: https://example.com/sitemap.xml
Line-by-Line Explanation
- User-agent: * — applies the rules to all crawlers
- Disallow: /admin/, /user/ — blocks admin pages and user profile pages
- Disallow: /core/, /modules/, /themes/, /profiles/ — blocks Drupal system directories
- Disallow: /install.php, /update.php, /cron.php, /xmlrpc.php — blocks direct access to Drupal's PHP entry-point scripts
- Disallow: /filter/tips, /node/add/, /search/ — blocks internal Drupal utility paths
- Disallow: /comment/reply/ — prevents comment reply forms from being indexed
- Allow: /core/*.css$, /core/*.js$, /modules/*.css$, /modules/*.js$ — re-permits the CSS and JS files search engines need to render pages
- Sitemap — points crawlers to the XML sitemap
Best Practices for Drupal
- ✓ Drupal ships with a default robots.txt — customize it for your specific needs.
- ✓ Use the XML Sitemap module for automatic sitemap generation.
- ✓ Never block CSS and JS files that search engines need to render pages.
- ✓ Add specific paths for custom content types you want to exclude.
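For instance, to exclude a custom content type's working paths while leaving its published pages crawlable, add extra Disallow rules to the template's existing group (the paths below are hypothetical, not Drupal defaults):

```txt
# Hypothetical custom paths -- replace with your site's actual routes
Disallow: /events/register/
Disallow: /press/drafts/
```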
Frequently Asked Questions
Does Drupal include a robots.txt by default?
Yes. Drupal ships with a default robots.txt in the project root. You can customize it directly or use the RobotsTxt module for database-managed rules.
Should I block /node/ paths in Drupal?
If you use path aliases (recommended), the raw /node/123 paths can create duplicate content. Consider blocking /node/, but first verify that every published node has a clean URL alias; otherwise that content becomes uncrawlable.
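A minimal sketch of that approach, added to the template's existing "User-agent: *" group (the alias /about-us is illustrative):

```txt
# Blocks /node/123 and similar raw paths;
# aliased URLs such as /about-us stay crawlable
Disallow: /node/
```

Because robots.txt matches by path prefix, this single rule covers every numeric node URL without affecting aliased pages.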
How do I handle multilingual robots.txt in Drupal?
robots.txt is language-independent. Use hreflang tags and separate sitemaps per language instead. The robots.txt should allow all language paths.
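For example, a single robots.txt can list one Sitemap directive per language sitemap (the URLs below are placeholders assuming a language path prefix like /fr/ or /de/):

```txt
Sitemap: https://example.com/sitemap.xml
Sitemap: https://example.com/fr/sitemap.xml
Sitemap: https://example.com/de/sitemap.xml
```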
Related Templates
- WordPress — Optimized robots.txt template for WordPress sites. Blocks admin, login, and plugin directories while allowing important content to be crawled.
- Laravel — Secure robots.txt template for Laravel applications. Blocks framework internals, API endpoints, and admin routes.
- Django — Python/Django robots.txt template. Blocks admin, static files directory, and internal URLs while allowing public content.