
robots.txt for Squarespace

A robots.txt guide for Squarespace sites: understand the auto-generated rules and the customization options available.

robots.txt

User-agent: *
Disallow: /config
Disallow: /search
Disallow: /account
Disallow: /api/
Disallow: /static/
Allow: /static/images/

Sitemap: https://example.com/sitemap.xml

Line-by-Line Explanation

User-agent: * — applies to all crawlers

Disallow: /config — blocks site configuration pages

Disallow: /search — keeps internal search result pages from being crawled

Disallow: /account — blocks customer account pages

Disallow: /api/ — blocks internal API endpoints

Disallow: /static/ — blocks static asset directory

Allow: /static/images/ — ensures uploaded images are crawlable

Sitemap — points to the Squarespace-generated sitemap
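The rules above can be checked programmatically. As a sketch, Python's standard-library urllib.robotparser can parse this file and answer per-URL questions (example.com stands in for your own domain; note that parsers differ on Allow/Disallow precedence, so the /static/images/ override may be resolved differently by this parser than by Googlebot, which uses longest-match precedence):

```python
from urllib.robotparser import RobotFileParser

# The Squarespace-style rules from above (example.com is a placeholder).
ROBOTS_TXT = """\
User-agent: *
Disallow: /config
Disallow: /search
Disallow: /account
Disallow: /api/
Disallow: /static/
Allow: /static/images/

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

print(parser.can_fetch("*", "https://example.com/about"))   # regular page: allowed
print(parser.can_fetch("*", "https://example.com/config"))  # blocked by Disallow
print(parser.site_maps())  # sitemap URLs declared in the file (Python 3.8+)
```

Running a quick check like this before changing SEO settings confirms which pages crawlers are actually permitted to fetch.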


Frequently Asked Questions

Can I customize robots.txt on Squarespace?
Squarespace has limited robots.txt customization. You can disable search engine indexing for the entire site in Settings > SEO, but per-path control requires using noindex meta tags on individual pages.
Does Squarespace handle SEO well?
Squarespace generates clean HTML, auto-creates sitemaps, and handles canonical URLs. For most small business sites, the auto-generated robots.txt and SEO features are sufficient.
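Auto-generated sitemaps follow the standard sitemaps.org XML protocol, so their contents can be inspected with any XML parser. A minimal sketch in Python (the sitemap below is illustrative, not an actual Squarespace export):

```python
import xml.etree.ElementTree as ET

# A minimal sitemap in the standard sitemaps.org format; the URLs are
# placeholders, but Squarespace's generated sitemap uses the same schema.
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return every <loc> URL listed in a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

print(sitemap_urls(SITEMAP_XML))
```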
How do I block specific Squarespace pages from Google?
Use the SEO toggle in page settings to disable indexing for specific pages. This adds a noindex meta tag, which is more effective than robots.txt for preventing indexing.
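The per-page toggle works by injecting a robots meta tag into the page's head. The standard form of that tag looks like this (a generic illustration of the noindex directive, not Squarespace's exact markup):

```html
<head>
  <!-- Tells crawlers not to index this page, even if robots.txt allows crawling it -->
  <meta name="robots" content="noindex">
</head>
```

Unlike a robots.txt Disallow, this tag lets crawlers fetch the page but instructs them to keep it out of search results.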
