robots.txt for Astro Sites
A robots.txt template for Astro framework sites. Thanks to Astro's static-first architecture, only minimal configuration is needed.
robots.txt
User-agent: *
Disallow: /404
Allow: /

Sitemap: https://example.com/sitemap-index.xml
Line-by-Line Explanation
User-agent: * — applies to all crawlers
Disallow: /404 — prevents the 404 page from appearing in search results
Allow: / — all other pages are freely crawlable
Sitemap — points to the Astro-generated sitemap index
Best Practices for Astro
- ✓ Use @astrojs/sitemap integration for automatic sitemap generation.
- ✓ Astro generates static HTML by default, making it inherently SEO-friendly.
- ✓ Place robots.txt in the /public directory for static serving.
- ✓ For SSR mode, consider generating robots.txt dynamically via an API route.
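Putting the first two practices together, a minimal astro.config.mjs with the sitemap integration might look like this (the site URL is a placeholder for your own domain):

```javascript
// astro.config.mjs
import { defineConfig } from 'astro/config';
import sitemap from '@astrojs/sitemap';

export default defineConfig({
  // `site` is required by @astrojs/sitemap to build absolute URLs
  site: 'https://example.com',
  integrations: [sitemap()],
});
```

With this in place, the build emits a sitemap index at /sitemap-index.xml, matching the Sitemap line in the robots.txt above.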
Frequently Asked Questions
Where do I put robots.txt in an Astro project?
Place it in the /public directory. Astro copies all public directory files to the build output root. Alternatively, use the astro-robots-txt integration.
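If you prefer the integration route, a sketch of wiring up astro-robots-txt in astro.config.mjs (assuming the package's default options, which generate a permissive robots.txt at build time):

```javascript
// astro.config.mjs
import { defineConfig } from 'astro/config';
import robotsTxt from 'astro-robots-txt';

export default defineConfig({
  // `site` is used to build the Sitemap URL in the generated file
  site: 'https://example.com',
  integrations: [robotsTxt()],
});
```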
Does Astro need a complex robots.txt?
No. Astro is static-first with zero client-side JavaScript by default. A minimal robots.txt is all you need. Only add rules if you have specific pages to exclude.
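For example, if your site had a section you wanted to keep out of search results (the /drafts/ path below is hypothetical), you would extend the template with one extra rule:

```
User-agent: *
Disallow: /404
Disallow: /drafts/
Allow: /

Sitemap: https://example.com/sitemap-index.xml
```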
How do I handle robots.txt for Astro SSR?
For server-rendered Astro sites, you can create an API route (src/pages/robots.txt.ts) that returns robots.txt content dynamically based on the environment.
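A sketch of such a route, assuming the site URL is set via the `site` option in astro.config.mjs (which Astro exposes as `import.meta.env.SITE`) and that `import.meta.env.PROD` distinguishes production from preview builds:

```typescript
// src/pages/robots.txt.ts
// Builds the robots.txt body; blocks all crawlers outside production
// so staging/preview deployments never get indexed.
export function buildRobotsTxt(site: string, production: boolean): string {
  if (!production) {
    return 'User-agent: *\nDisallow: /\n';
  }
  const sitemap = new URL('sitemap-index.xml', site).href;
  return `User-agent: *\nDisallow: /404\nAllow: /\n\nSitemap: ${sitemap}\n`;
}

// Astro injects env vars on import.meta.env; cast so the file also
// type-checks outside an Astro project.
const env = (import.meta as any).env ?? {};

export const GET = () =>
  new Response(buildRobotsTxt(env.SITE ?? 'https://example.com', env.PROD ?? false), {
    headers: { 'Content-Type': 'text/plain; charset=utf-8' },
  });
```

Because the route is named robots.txt.ts, Astro serves it at /robots.txt, so no file in /public is needed in SSR mode.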
Related Templates
- Gatsby: robots.txt template for Gatsby static sites. Optimized for the static site generation build process.
- Next.js: Production-ready robots.txt template for Next.js applications. Handles API routes, internal pages, and build artifacts properly.
- Nuxt: robots.txt template for Nuxt.js (Vue.js) applications. Handles both SSR and static generation modes.