robots.txt for Next.js
Production-ready robots.txt template for Next.js applications. Handles API routes, internal pages, and build artifacts properly.
robots.txt
User-agent: *
Disallow: /api/
Disallow: /_next/
Disallow: /404
Disallow: /500
Allow: /_next/static/

Sitemap: https://example.com/sitemap.xml
Line-by-Line Explanation
User-agent: * — applies to all crawlers
Disallow: /api/ — blocks API route handlers from being crawled
Disallow: /_next/ — blocks Next.js build artifacts and server chunks
Disallow: /404 — prevents the 404 page from being indexed
Disallow: /500 — prevents the 500 error page from being indexed
Allow: /_next/static/ — ensures static assets (JS, CSS, images) are accessible for rendering
Sitemap — directs crawlers to the sitemap for full page discovery
Best Practices for Next.js
- ✓ Use next-sitemap package to auto-generate both sitemap.xml and robots.txt.
- ✓ In Next.js 13+, you can create robots.ts in the app directory for dynamic generation.
- ✓ Block /api/ routes unless they serve public content like RSS feeds.
- ✓ For ISR pages, ensure revalidation endpoints are not blocked.
Frequently Asked Questions
How do I create robots.txt in Next.js?
Place a robots.txt file in the /public directory for static serving. In Next.js 13+ App Router, create a robots.ts file in the app directory that exports a default function returning the robots rules.
Should I block /_next/ in Next.js?
Block /_next/ but allow /_next/static/ so search engines can access CSS and JavaScript needed to render your pages. Blocking all static assets hurts your SEO.
Do I need a separate robots.txt for preview deployments?
Yes. Preview and staging deployments should use noindex meta tags and a restrictive robots.txt (Disallow: /) to prevent search engines from indexing unfinished work.
Related Templates
- React (Single Page Application): robots.txt template for client-side React applications. Handles build artifacts and ensures proper crawling of SPA routes.
- Nuxt: robots.txt template for Nuxt.js (Vue.js) applications. Handles both SSR and static generation modes.
- Gatsby: robots.txt template for Gatsby static sites. Optimized for the static site generation build process.