robots.txt for Shopify
Recommended robots.txt configuration for Shopify stores. Handles collection filtering, checkout, and admin pages.
robots.txt
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkouts/
Disallow: /checkout
Disallow: /carts
Disallow: /account
Disallow: /collections/*sort_by*
Disallow: /*/collections/*sort_by*
Disallow: /blogs/*+*
Disallow: /blogs/*tagged*
Disallow: /*?*variant=*
Disallow: /*?*q=*
Disallow: /*?*sort_by*
Allow: /collections/
Allow: /products/
Sitemap: https://example.com/sitemap.xml
Line-by-Line Explanation
User-agent: * — applies to all crawlers
Disallow: /admin — blocks the Shopify admin panel
Disallow: /cart, /carts, /orders, /checkout, /checkouts/ — blocks cart, order, and checkout pages
Disallow: /account — prevents customer account pages from being crawled
Disallow: /collections/*sort_by* — blocks sorted collection variants (duplicate content)
Disallow: /blogs/*tagged* — prevents tag filter pages from being indexed
Disallow: /*?*variant=* — blocks individual product variant URLs
Disallow: /*?*q=* — blocks search result pages
Allow: /collections/ and /products/ — ensures base collection and product pages stay crawlable (Google applies the most specific matching rule, so the longer wildcard Disallows above still win for sorted and filtered URLs)
Sitemap — points to the auto-generated Shopify sitemap (replace example.com with your store's domain)
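The plain prefix rules above can be sanity-checked with Python's standard-library `urllib.robotparser`. Note that this parser does not understand `*` wildcards inside paths (unlike Googlebot), so only the simple prefix rules are exercised here; the URLs are placeholders:

```python
from urllib.robotparser import RobotFileParser

# A subset of the Shopify rules, limited to plain prefixes that
# urllib.robotparser evaluates correctly (it ignores path wildcards).
rules = """\
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /account
Allow: /collections/
Allow: /products/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Blocked: matches the Disallow: /admin prefix
print(rp.can_fetch("*", "https://example.com/admin"))        # False
# Allowed: matches the Allow: /products/ prefix
print(rp.can_fetch("*", "https://example.com/products/tee")) # True
```

For the wildcard patterns (e.g. `/*?*sort_by*`), verify behavior with Google Search Console's robots.txt report instead, since matching semantics differ between crawlers.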
Best Practices for Shopify
- ✓ Shopify auto-generates a robots.txt — use the robots.txt.liquid template to customize it.
- ✓ Block sorted and filtered collection URLs to prevent massive duplicate content.
- ✓ Ensure product variant URLs are handled through canonical tags in addition to robots.txt.
- ✓ Check Shopify's documentation for the latest robots.txt.liquid syntax.
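Since Shopify serves its own default robots.txt, customizations go in a `templates/robots.txt.liquid` theme file. A minimal sketch following Shopify's documented pattern — it replays the default groups via the `robots` Liquid object and appends one extra rule to the catch-all group (the added `Disallow` line is an example, not a requirement):

```liquid
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- comment -%} Example: add an extra rule to the catch-all group {%- endcomment -%}
  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /*?*sort_by*' }}
  {%- endif -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```

Keeping the default groups in the loop preserves Shopify's maintained rules; only add or remove lines inside the conditionals rather than replacing the whole file with static text.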