robots.txt for Drupal

Drupal-optimized robots.txt template. It blocks admin routes, system directories, and internal paths while keeping content pages and the CSS/JS assets needed for rendering crawlable.

robots.txt

User-agent: *
Disallow: /admin/
Disallow: /user/
Disallow: /core/
Disallow: /modules/
Disallow: /themes/
Disallow: /profiles/
Disallow: /install.php
Disallow: /update.php
Disallow: /cron.php
Disallow: /xmlrpc.php
Disallow: /filter/tips
Disallow: /node/add/
Disallow: /search/
Disallow: /comment/reply/
Allow: /core/*.css$
Allow: /core/*.js$
Allow: /modules/*.css$
Allow: /modules/*.js$

Sitemap: https://example.com/sitemap.xml

Line-by-Line Explanation

User-agent: * — applies to all crawlers

Disallow: /admin/, /user/ — blocks the admin interface and user account pages (login, register, profiles)

Disallow: /core/, /modules/, /themes/, /profiles/ — blocks Drupal system directories

Disallow: /install.php, /update.php, /cron.php, /xmlrpc.php — blocks direct access to Drupal's maintenance and RPC scripts

Disallow: /filter/tips, /node/add/, /search/ — blocks internal Drupal paths

Disallow: /comment/reply/ — prevents comment form pages from being indexed

Allow: CSS and JS from core and modules — crawlers need these assets to render pages correctly, so they override the broader directory blocks

Sitemap — directs crawlers to the XML sitemap
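The prefix rules above can be sanity-checked with Python's standard-library urllib.robotparser. A minimal sketch (the domain is a placeholder; note this parser matches by path prefix only and does not understand the * and $ wildcards, so the wildcard Allow lines are left out):

```python
from urllib.robotparser import RobotFileParser

# A subset of the template's prefix rules.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /user/
Disallow: /node/add/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

base = "https://example.com"  # placeholder domain
for path in ("/admin/config", "/node/add/article", "/node/123"):
    # can_fetch returns False for paths matching a Disallow prefix.
    print(path, rp.can_fetch("Googlebot", base + path))
```

Running this shows /admin/config and /node/add/article blocked while /node/123 remains fetchable, confirming that content paths fall through the prefix rules.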


Frequently Asked Questions

Does Drupal include a robots.txt by default?
Yes. Drupal ships with a default robots.txt in the project root. You can customize it directly or use the RobotsTxt module for database-managed rules.
Should I block /node/ paths in Drupal?
If you use path aliases (recommended), the raw /node/123 paths may create duplicate content. Consider blocking /node/ and ensuring all content has clean URL aliases.
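If you take that route, a sketch of the extra rule (only safe when every published page has a clean alias, since blocked /node/ URLs can no longer pass signals to the aliased versions):

```
# Block raw node paths; aliased URLs like /blog/my-post stay crawlable.
Disallow: /node/
```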
How do I handle multilingual robots.txt in Drupal?
A site serves one robots.txt per host, so it cannot vary by language. Keep a single file that allows all language paths, and signal translations with hreflang tags and separate sitemaps per language instead.
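A sketch of how multiple per-language sitemaps can be declared in one file (the sitemap URLs are placeholders; any path works as long as it is absolute):

```
User-agent: *
# Language-prefixed paths like /fr/ and /de/ remain crawlable.

Sitemap: https://example.com/sitemap-en.xml
Sitemap: https://example.com/sitemap-fr.xml
Sitemap: https://example.com/sitemap-de.xml
```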
