robots.txt for Django
Python/Django robots.txt template. Blocks the admin interface, admin static assets, and internal URLs while allowing public content.
robots.txt
User-agent: *
Disallow: /admin/
Disallow: /accounts/
Disallow: /api/
Disallow: /static/admin/
Disallow: /media/private/
Allow: /static/
Allow: /media/
Sitemap: https://example.com/sitemap.xml
Line-by-Line Explanation
User-agent: * — applies to all crawlers
Disallow: /admin/ — blocks the Django admin interface
Disallow: /accounts/ — prevents authentication pages from being crawled
Disallow: /api/ — blocks REST API endpoints
Disallow: /static/admin/ — blocks Django admin static assets
Disallow: /media/private/ — blocks private uploaded files
Allow: /static/ — ensures CSS, JS, and images are accessible for rendering
Allow: /media/ — ensures public uploaded files are crawlable
Sitemap — points to Django's built-in sitemap framework output
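The rules above can be sanity-checked before deploying with Python's standard-library urllib.robotparser; a minimal sketch (the URLs are illustrative):

```python
from urllib.robotparser import RobotFileParser

# The template's crawl rules, one directive per line.
RULES = """\
User-agent: *
Disallow: /admin/
Disallow: /accounts/
Disallow: /api/
Disallow: /static/admin/
Disallow: /media/private/
Allow: /static/
Allow: /media/
"""

parser = RobotFileParser()
parser.parse(RULES.splitlines())

# Blocked: admin, auth, and API paths.
print(parser.can_fetch("*", "https://example.com/admin/login/"))    # False
# Allowed: public static assets and uploads.
print(parser.can_fetch("*", "https://example.com/static/app.css"))  # True
# Blocked even though /static/ is allowed, because /static/admin/ matches first.
print(parser.can_fetch("*", "https://example.com/static/admin/base.css"))  # False
```

This is useful as a quick regression test whenever the rule set changes.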
Best Practices for Django
- ✓ Use Django's built-in sitemap framework (django.contrib.sitemaps) for automatic sitemap generation.
- ✓ Serve robots.txt from a Django view or as a static file in your project root.
- ✓ Block /accounts/ if using django-allauth or similar authentication packages.
- ✓ Use the django-robots package if you need database-managed robots.txt rules.
Frequently Asked Questions
How do I serve robots.txt in Django?
You can serve it as a static file from your web server, create a Django view that returns plain text, or use the django-robots third-party package for database-managed rules.
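A minimal view-based setup might look like the sketch below. It assumes a standard Django project; the rule text mirrors the template above, and the import is deferred only so the snippet reads standalone:

```python
# views.py: serve robots.txt as plain text (sketch, assumes Django is installed).
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Disallow: /accounts/
Disallow: /api/
Disallow: /static/admin/
Disallow: /media/private/
Allow: /static/
Allow: /media/
Sitemap: https://example.com/sitemap.xml
"""

def robots_txt(request):
    # Normally a top-level `from django.http import HttpResponse`.
    from django.http import HttpResponse
    return HttpResponse(ROBOTS_TXT, content_type="text/plain")

# urls.py wiring (sketch):
# from django.urls import path
# urlpatterns = [path("robots.txt", robots_txt)]
```

Serving from a view keeps the rules under version control with the rest of the project, while the static-file approach avoids hitting Django for crawler requests entirely.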
Should I block Django REST Framework endpoints?
Yes. Block /api/ routes unless they serve public content pages. API responses are not useful in search results and waste crawl budget.
Does Django have built-in robots.txt support?
Not natively, but django.contrib.sitemaps handles sitemaps. For robots.txt, use the django-robots package or create a simple view in your urls.py.
Related Templates
- Laravel: Secure robots.txt template for Laravel applications. Blocks framework internals, API endpoints, and admin routes.
- Next.js: Production-ready robots.txt template for Next.js applications. Handles API routes, internal pages, and build artifacts properly.
- React (Single Page Application): robots.txt template for client-side React applications. Handles build artifacts and ensures proper crawling of SPA routes.