robots.txt for Angular Applications
robots.txt template for Angular apps. It handles build output and source maps, and ensures SPA routes are crawled properly.
robots.txt
```
User-agent: *
Disallow: /*.map$
Disallow: /assets/config/
Allow: /

Sitemap: https://example.com/sitemap.xml
```
Line-by-Line Explanation
User-agent: * — applies to all crawlers
Disallow: /*.map$ — blocks all source map files; the trailing $ anchors the pattern to the end of the URL, so only paths ending in .map are affected
Disallow: /assets/config/ — blocks configuration files that may contain environment details
Allow: / — allows all application routes to be crawled
Sitemap — declares the sitemap URL; especially important for Angular SPAs, where routes exist only client-side and may not be discoverable through links alone
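To make the matching behavior above concrete, here is a simplified matcher for these two Disallow rules. It is an illustration only, not a full implementation of the Robots Exclusion Protocol (RFC 9309): `*` matches any sequence of characters and a trailing `$` anchors the rule to the end of the URL path.

```typescript
// Convert a robots.txt path rule into a regular expression.
// '*' becomes '.*'; a trailing '$' anchors the match to the end of the path.
function ruleToRegex(rule: string): RegExp {
  const anchored = rule.endsWith("$");
  const body = (anchored ? rule.slice(0, -1) : rule)
    .split("*")
    // Escape regex metacharacters in the literal segments between wildcards.
    .map((part) => part.replace(/[.+?^${}()|[\]\\]/g, "\\$&"))
    .join(".*");
  return new RegExp("^" + body + (anchored ? "$" : ""));
}

const disallowed = ["/*.map$", "/assets/config/"].map(ruleToRegex);

// A path is crawlable if no Disallow rule matches it.
function isCrawlable(path: string): boolean {
  return !disallowed.some((re) => re.test(path));
}

console.log(isCrawlable("/main.js"));                // true
console.log(isCrawlable("/main.js.map"));            // false
console.log(isCrawlable("/assets/config/env.json")); // false
```

Note how the `$` anchor keeps `/main.js` crawlable while still blocking `/main.js.map`.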
Best Practices for Angular
- ✓ Place robots.txt in the src/ directory and add it to angular.json assets.
- ✓ Use Angular Universal for server-side rendering to improve crawlability.
- ✓ Pre-render important routes using Angular's prerender builder.
- ✓ Ensure your sitemap includes all dynamically routed pages.
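The first practice above is a one-line change in angular.json. A minimal sketch of the relevant fragment (the project name `my-app` is an assumption; use your own workspace's project key, and keep any existing entries in the assets array):

```json
{
  "projects": {
    "my-app": {
      "architect": {
        "build": {
          "options": {
            "assets": [
              "src/favicon.ico",
              "src/assets",
              "src/robots.txt"
            ]
          }
        }
      }
    }
  }
}
```

With this in place, the CLI copies robots.txt to the root of the build output on every build.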
Frequently Asked Questions
Where do I put robots.txt in an Angular project?
Place it in the src/ folder and ensure it is listed in the assets array in angular.json. It will be copied to the build output root during compilation.
Does Angular need server-side rendering for SEO?
For best SEO results, yes. Angular Universal provides SSR capabilities. Without SSR, Google can still render Angular apps, but rendering happens in a deferred second pass, which can delay indexing and occasionally cause content to be missed.
Should I block Angular build chunks?
No. Do not block JavaScript chunk files as Google needs them to render your app. Only block source maps (.map files) which are used for debugging.
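As a contrast to the template above, here is the anti-pattern this answer warns against. Do not use this; it is shown only to illustrate the mistake:

```
User-agent: *
Disallow: /*.js$
```

Blocking .js files prevents Googlebot from executing the Angular bundle, so rendered pages can appear effectively blank to the crawler. Only the .map rule from the template is safe, because source maps are never needed to render the app.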
Related Templates
- React (Single Page Application): robots.txt template for client-side React applications. Handles build artifacts and ensures proper crawling of SPA routes.
- Nuxt: robots.txt template for Nuxt.js (Vue.js) applications. Handles both SSR and static generation modes.
- Next.js: production-ready robots.txt template for Next.js applications. Handles API routes, internal pages, and build artifacts properly.