robots.txt for Squarespace
robots.txt guide for Squarespace sites. Understand the auto-generated rules and available customization options.
robots.txt
User-agent: *
Disallow: /config
Disallow: /search
Disallow: /account
Disallow: /api/
Disallow: /static/
Allow: /static/images/

Sitemap: https://example.com/sitemap.xml
Line-by-Line Explanation
User-agent: * — applies to all crawlers
Disallow: /config — blocks site configuration pages
Disallow: /search — keeps crawlers out of internal search result pages (robots.txt blocks crawling; use noindex to keep already-discovered pages out of results)
Disallow: /account — blocks customer account pages
Disallow: /api/ — blocks internal API endpoints
Disallow: /static/ — blocks static asset directory
Allow: /static/images/ — ensures uploaded images are crawlable
Sitemap — points to the Squarespace-generated sitemap
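The rules above can be sanity-checked locally with Python's standard-library robotparser. One caveat: Python's parser applies rules in file order (first match wins), while Google uses longest-match precedence, so the Allow: /static/images/ override may not evaluate identically; the paths tested below are unambiguous under either scheme.

```python
from urllib import robotparser

# The auto-generated rules from the template above (Sitemap line omitted;
# it does not affect crawl permissions)
ROBOTS_TXT = """\
User-agent: *
Disallow: /config
Disallow: /search
Disallow: /account
Disallow: /api/
Disallow: /static/
Allow: /static/images/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Paths blocked for all crawlers
print(rp.can_fetch("*", "/config"))     # False
print(rp.can_fetch("*", "/api/site"))   # False

# Regular content remains crawlable
print(rp.can_fetch("*", "/blog/post"))  # True
```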
Best Practices for Squarespace
- ✓ Squarespace auto-generates robots.txt — limited customization is available.
- ✓ Use the SEO panel in page settings to set noindex on specific pages.
- ✓ Squarespace handles sitemaps automatically and submits them to search engines.
- ✓ Focus on on-page SEO since robots.txt options are limited on Squarespace.
Frequently Asked Questions
Can I customize robots.txt on Squarespace?
Squarespace has limited robots.txt customization. You can disable search engine indexing for the entire site in Settings > SEO, but per-path control requires using noindex meta tags on individual pages.
Does Squarespace handle SEO well?
Squarespace generates clean HTML, auto-creates sitemaps, and handles canonical URLs. For most small business sites, the auto-generated robots.txt and SEO features are sufficient.
How do I block specific Squarespace pages from Google?
Use the SEO toggle in page settings to disable indexing for specific pages. This adds a noindex meta tag, which is more effective than robots.txt for preventing indexing.
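Squarespace injects the tag for you when the toggle is off, but for reference, a noindex directive in a page's head looks like this (this is the standard robots meta tag, not Squarespace-specific markup):

```html
<head>
  <!-- Tells compliant crawlers not to include this page in their index -->
  <meta name="robots" content="noindex">
</head>
```

Unlike a robots.txt Disallow, this works even for pages that crawlers reach through external links, because the crawler must fetch the page to see the tag.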
Related Templates
- Wix: Recommended robots.txt configuration for Wix websites. Understand what Wix generates and how to customize it.
- Shopify: Recommended robots.txt configuration for Shopify stores. Handles collection filtering, checkout, and admin pages.
- WordPress: Optimized robots.txt template for WordPress sites. Blocks admin, login, and plugin directories while allowing important content to be crawled.