
robots.txt for React SPAs

robots.txt template for client-side React applications. Blocks source-map build artifacts from being indexed while keeping every SPA route crawlable.

robots.txt

User-agent: *
Disallow: /static/js/*.map
Disallow: /static/css/*.map
Allow: /

Sitemap: https://example.com/sitemap.xml

Line-by-Line Explanation

User-agent: * — applies to all crawlers

Disallow: /static/js/*.map — blocks JavaScript source maps from being indexed

Disallow: /static/css/*.map — blocks CSS source maps from being indexed

Allow: / — explicitly allows everything else; under Google's longest-match rule evaluation, the more specific Disallow patterns above still win for source maps

Sitemap — essential for SPAs since crawlers may not discover all routes through links
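The wildcard matching used by the Disallow rules above can be sketched in TypeScript. This is a minimal illustration of how a crawler applies path patterns, not any crawler's actual API; the function names are hypothetical:

```typescript
// Sketch of robots.txt path matching: '*' matches any character
// sequence, and a rule matches when the path starts with the pattern.
function matchesRule(pattern: string, path: string): boolean {
  // Escape regex metacharacters (except '*'), then translate '*' to '.*'.
  const escaped = pattern
    .replace(/[.+?^${}()|[\]\\]/g, "\\$&")
    .replace(/\*/g, ".*");
  return new RegExp("^" + escaped).test(path);
}

function isAllowed(path: string): boolean {
  // The Disallow rules from the template above.
  const disallow = ["/static/js/*.map", "/static/css/*.map"];
  return !disallow.some((rule) => matchesRule(rule, path));
}

console.log(isAllowed("/static/js/main.abc123.js.map")); // false (blocked)
console.log(isAllowed("/about"));                        // true (crawlable)
```

Note that fingerprinted filenames like `main.abc123.js.map` are still caught, because `*` matches the hash segment.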


Frequently Asked Questions

Can Google crawl React SPAs?
Google can render JavaScript, but rendering is slower and less reliable than crawling static HTML. For best SEO results, use server-side rendering (e.g. Next.js), static site generation, or pre-rendering alongside a properly configured robots.txt and sitemap.
Where do I put robots.txt in a React app?
Place it in the /public directory. Create React App and Vite both serve files from /public at the root URL automatically.
Do SPAs need a sitemap?
Yes. Since SPA routes are rendered client-side, crawlers can only discover them by executing your JavaScript, which is slow and not guaranteed. A sitemap.xml is the most reliable way to tell search engines about all your routes.
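One way to keep that sitemap accurate is to generate it from the same route list your router uses. A minimal sketch, assuming a hand-maintained `routes` array (in a real app, derive it from your React Router definitions so the sitemap never drifts):

```typescript
// Hypothetical route list; replace with your app's actual routes.
const routes = ["/", "/about", "/pricing", "/blog"];
const BASE_URL = "https://example.com"; // matches the Sitemap line above

// Builds a sitemaps.org-format sitemap.xml from a list of paths.
function buildSitemap(paths: string[], baseUrl: string): string {
  const urls = paths
    .map((p) => `  <url><loc>${baseUrl}${p}</loc></url>`)
    .join("\n");
  return (
    `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
    `${urls}\n` +
    `</urlset>\n`
  );
}

console.log(buildSitemap(routes, BASE_URL));
```

Run this as a build step and write the output to `public/sitemap.xml`, next to robots.txt, so both are served from the site root.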
