
robots.txt for Next.js

A production-ready robots.txt template for Next.js applications. It blocks API routes, error pages, and build artifacts while keeping the static assets crawlers need to render your pages accessible.

robots.txt

User-agent: *
Disallow: /api/
Disallow: /_next/
Disallow: /404
Disallow: /500
Allow: /_next/static/

Sitemap: https://example.com/sitemap.xml

Line-by-Line Explanation

User-agent: * — applies to all crawlers

Disallow: /api/ — blocks API route handlers from being crawled

Disallow: /_next/ — blocks Next.js build artifacts and server chunks

Disallow: /404 — keeps crawlers off the built-in 404 error page

Disallow: /500 — keeps crawlers off the built-in 500 error page (note that robots.txt controls crawling, not indexing; add a noindex signal if these URLs must never appear in search results)

Allow: /_next/static/ — ensures static assets (JS, CSS, images) are accessible for rendering

Sitemap — directs crawlers to the sitemap for full page discovery
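With the App Router (Next.js 13+), the same rules can be generated from code instead of a static file. A minimal sketch of `app/robots.ts` follows; the local `Robots` type stands in for `MetadataRoute.Robots` from `next` so the snippet is self-contained:

```typescript
// Sketch of app/robots.ts for the Next.js App Router.
// In a real project, type the return value as MetadataRoute.Robots
// (imported from 'next'); the local alias below mirrors that shape
// so this snippet runs on its own.
type Robots = {
  rules: Array<{
    userAgent: string;
    allow?: string | string[];
    disallow?: string | string[];
  }>;
  sitemap?: string;
};

export default function robots(): Robots {
  return {
    rules: [
      {
        userAgent: '*',
        allow: '/_next/static/',
        disallow: ['/api/', '/_next/', '/404', '/500'],
      },
    ],
    sitemap: 'https://example.com/sitemap.xml',
  };
}
```

Next.js serves the returned object as /robots.txt; keep either this file or a static public/robots.txt, not both.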


Frequently Asked Questions

How do I create robots.txt in Next.js?
Either place a static robots.txt file in the /public directory, or, in the Next.js 13+ App Router, create a robots.ts file in the app directory that default-exports a function returning a MetadataRoute.Robots object. Use one approach, not both.
Should I block /_next/ in Next.js?
Block /_next/ but allow /_next/static/ so search engines can fetch the CSS and JavaScript needed to render your pages. If crawlers cannot load these assets, they may misrender your pages, which hurts SEO.
Do I need a separate robots.txt for preview deployments?
Yes. Preview and staging deployments should block search engines from indexing unfinished work, typically with a restrictive robots.txt (Disallow: /). One subtlety: if robots.txt blocks all crawling, crawlers never fetch a page and so never see a noindex meta tag, so an X-Robots-Tag: noindex response header (or your host's built-in preview protection) is the more reliable signal.
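One way to serve different rules per environment without maintaining separate files is to branch inside app/robots.ts. A sketch assuming Vercel's VERCEL_ENV variable (an assumption; substitute whatever variable your host exposes), with a local `Robots` type again standing in for `MetadataRoute.Robots`:

```typescript
// Hypothetical environment-aware app/robots.ts: production gets the
// normal rules, every other environment gets a blanket Disallow.
// VERCEL_ENV is Vercel-specific and is an assumption here; swap in
// your host's equivalent (e.g. a custom APP_ENV).
type Robots = {
  rules: Array<{
    userAgent: string;
    allow?: string;
    disallow?: string | string[];
  }>;
  sitemap?: string;
};

export default function robots(): Robots {
  if (process.env.VERCEL_ENV !== 'production') {
    // Preview/staging: tell all crawlers to stay out entirely.
    return { rules: [{ userAgent: '*', disallow: '/' }] };
  }
  return {
    rules: [
      {
        userAgent: '*',
        allow: '/_next/static/',
        disallow: ['/api/', '/_next/', '/404', '/500'],
      },
    ],
    sitemap: 'https://example.com/sitemap.xml',
  };
}
```

Pairing this with an X-Robots-Tag: noindex header on non-production deployments covers the case where a blocked URL is discovered through external links.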
