ToolPrime

robots.txt for Django

Python/Django robots.txt template. Blocks the admin interface, admin static assets, private media, and internal URLs while allowing public content.

robots.txt

User-agent: *
Disallow: /admin/
Disallow: /accounts/
Disallow: /api/
Disallow: /static/admin/
Disallow: /media/private/
Allow: /static/
Allow: /media/

Sitemap: https://example.com/sitemap.xml

Line-by-Line Explanation

User-agent: * — applies to all crawlers

Disallow: /admin/ — blocks the Django admin interface

Disallow: /accounts/ — prevents authentication pages from being crawled

Disallow: /api/ — blocks REST API endpoints

Disallow: /static/admin/ — blocks Django admin static assets

Disallow: /media/private/ — blocks private uploaded files

Allow: /static/ — ensures CSS, JS, and images are accessible for rendering

Allow: /media/ — ensures public uploaded files are crawlable

Sitemap — points to Django's built-in sitemap framework output


Frequently Asked Questions

How do I serve robots.txt in Django?
You can serve it as a static file from your web server, create a Django view that returns plain text, or use the django-robots third-party package for database-managed rules.
Should I block Django REST Framework endpoints?
Yes. Block /api/ routes unless they serve public content pages. API responses are not useful in search results and waste crawl budget.
Does Django have built-in robots.txt support?
Not natively, but django.contrib.sitemaps handles sitemaps. For robots.txt, use the django-robots package or wire a simple plain-text view into your urls.py.
