robots.txt for Laravel
Secure robots.txt template for Laravel applications. Blocks framework internals, API endpoints, and admin routes.
robots.txt
User-agent: *
Disallow: /admin/
Disallow: /api/
Disallow: /storage/
Disallow: /vendor/
Disallow: /nova/
Disallow: /horizon/
Disallow: /telescope/
Disallow: /login
Disallow: /register
Disallow: /password/
Allow: /storage/app/public/
Sitemap: https://example.com/sitemap.xml
Line-by-Line Explanation
User-agent: * — applies to all crawlers
Disallow: /admin/ — blocks admin panel routes
Disallow: /api/ — prevents API endpoints from being crawled
Disallow: /storage/ — blocks the storage directory
Disallow: /vendor/ — prevents Composer dependency files from being indexed
Disallow: /nova/, /horizon/, /telescope/ — blocks Laravel admin tools
Disallow: /login, /register, /password/ — blocks authentication pages
Allow: /storage/app/public/ — intended to keep publicly uploaded files crawlable; note that on a standard `php artisan storage:link` setup those files are served under the /storage/ URL prefix, so adjust this Allow rule to match the public URLs your app actually exposes
Sitemap — directs crawlers to the XML sitemap
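You can sanity-check how these rules interact before deploying. The sketch below is a minimal prefix matcher (not a full RFC 9309 parser — no wildcards or `$` anchors): the longest matching rule wins, and Allow beats Disallow on a tie. The rule set and paths are illustrative.

```php
<?php
// Minimal robots.txt rule matcher: longest matching prefix wins,
// Allow beats Disallow when two matching rules are the same length.
function isAllowed(string $path, array $rules): bool
{
    $verdict = true;   // no matching rule means the path is allowed
    $bestLength = -1;
    foreach ($rules as [$type, $prefix]) {
        if (!str_starts_with($path, $prefix)) {
            continue;
        }
        $len = strlen($prefix);
        if ($len > $bestLength || ($len === $bestLength && $type === 'Allow')) {
            $bestLength = $len;
            $verdict = ($type === 'Allow');
        }
    }
    return $verdict;
}

// A subset of the rules from the template above.
$rules = [
    ['Disallow', '/admin/'],
    ['Disallow', '/storage/'],
    ['Allow',    '/storage/app/public/'],
];

var_dump(isAllowed('/blog/post-1', $rules));                   // bool(true)
var_dump(isAllowed('/admin/users', $rules));                   // bool(false)
var_dump(isAllowed('/storage/logs/laravel.log', $rules));      // bool(false)
var_dump(isAllowed('/storage/app/public/avatar.png', $rules)); // bool(true)
```

This also demonstrates why the more specific Allow rule works at all: it only overrides `Disallow: /storage/` for paths that actually begin with the longer prefix.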
Best Practices for Laravel
- ✓ Place robots.txt in the /public directory where Laravel serves static files.
- ✓ If using Laravel Nova, Horizon, or Telescope, always block their routes.
- ✓ Consider generating robots.txt dynamically via a route for environment-specific rules.
- ✓ Block any custom admin or dashboard routes specific to your application.
Frequently Asked Questions
Where do I put robots.txt in Laravel?
Place it in the /public directory at the root of your Laravel project. This is where the web server serves static files from.
Should I block /vendor/ in Laravel?
Yes. The /vendor directory contains Composer dependencies and should never be publicly accessible. This is also a security best practice.
Can I generate robots.txt dynamically in Laravel?
Yes. Create a route that returns a plain text response with the robots.txt content. This lets you use different rules for production vs staging environments.
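A minimal sketch of that approach, assuming a standard Laravel app (the helper name `robotsTxtFor` and the rules are illustrative):

```php
<?php
// Build environment-specific robots.txt content.
function robotsTxtFor(string $environment): string
{
    if ($environment !== 'production') {
        // Keep staging and dev environments out of search indexes entirely.
        return "User-agent: *\nDisallow: /\n";
    }
    return implode("\n", [
        'User-agent: *',
        'Disallow: /admin/',
        'Disallow: /api/',
        'Sitemap: https://example.com/sitemap.xml',
    ]) . "\n";
}

// In routes/web.php you could then return it as plain text, e.g.:
// Route::get('/robots.txt', fn () => response(robotsTxtFor(app()->environment()))
//     ->header('Content-Type', 'text/plain'));

echo robotsTxtFor('staging');
```

If you go this route, delete the static public/robots.txt first: web servers typically serve existing files from /public before the request ever reaches Laravel, so the static file would shadow your route.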