robots.txt for Magento
E-commerce robots.txt template for Magento stores. Handles layered navigation, checkout, and admin pages.
robots.txt
User-agent: *
Disallow: /admin/
Disallow: /checkout/
Disallow: /customer/
Disallow: /catalogsearch/
Disallow: /wishlist/
Disallow: /sendfriend/
Disallow: /review/product/list/
Disallow: /*?*SID=
Disallow: /*?*___store=
Disallow: /*?*___from_store=
Disallow: /*?*dir=*
Disallow: /*?*order=*
Disallow: /*?*limit=*
Disallow: /*?*mode=*
Disallow: /*?*p=*
Allow: /media/
Allow: /static/
Sitemap: https://example.com/sitemap.xml
Line-by-Line Explanation
User-agent: * — applies to all crawlers
Disallow: /admin/ — blocks the Magento admin panel
Disallow: /checkout/, /customer/ — blocks checkout and account pages
Disallow: /catalogsearch/ — keeps crawlers out of internal search result pages, which generate unbounded low-value URLs
Disallow: /wishlist/, /sendfriend/ — blocks user action pages
Disallow: /review/product/list/ — blocks review listing pages
Disallow: /*?*SID=, /*?*___store=, /*?*___from_store= — blocks session-ID and store-switcher URLs, which duplicate existing pages under different query strings
Disallow: /*?*dir=*, /*?*order=*, /*?*limit=*, /*?*mode=*, /*?*p=* — blocks sorted, paginated, and view-mode variants of category pages so crawl budget goes to the canonical versions
Allow: /media/ and /static/ — ensures images, CSS, and JS are accessible
Sitemap — points to the Magento-generated sitemap
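The prefix-based rules above can be sanity-checked with Python's standard-library robots.txt parser. This is a minimal sketch: `urllib.robotparser` implements the original robots.txt conventions, not Google's wildcard extensions, so only the plain path-prefix rules are included here; the example URLs are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# Prefix rules from the template above. The wildcard rules (/*?*SID= etc.)
# are omitted because the stdlib parser does not do wildcard matching.
ROBOTS = """\
User-agent: *
Disallow: /admin/
Disallow: /checkout/
Disallow: /customer/
Disallow: /catalogsearch/
Allow: /media/
Allow: /static/
"""

rp = RobotFileParser()
rp.parse(ROBOTS.splitlines())

print(rp.can_fetch("*", "https://example.com/admin/"))          # False (blocked)
print(rp.can_fetch("*", "https://example.com/checkout/cart/"))  # False (blocked)
print(rp.can_fetch("*", "https://example.com/media/logo.png"))  # True (allowed)
print(rp.can_fetch("*", "https://example.com/some-category"))   # True (allowed)
```

For the wildcard rules, test against a parser that implements Google's matching rules (for example the tester in Google Search Console) rather than the stdlib module.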
Best Practices for Magento / Adobe Commerce
- ✓ Magento 2 can generate robots.txt from the admin panel: Stores > Configuration > Design.
- ✓ Block all layered navigation parameters to prevent exponential duplicate content.
- ✓ Use canonical tags in addition to robots.txt for product and category pages.
- ✓ Ensure the sitemap is generated with correct base URLs.
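Blocking every layered-navigation parameter by hand is error-prone once a store has many filterable attributes. A small sketch of how the Disallow lines could be generated from a list of attribute codes; the codes below are examples, not values read from any store.

```python
# Example filterable attribute codes; replace with the codes actually
# configured as filterable in your store.
FILTER_PARAMS = ["color", "size", "material", "price", "manufacturer"]

def layered_nav_rules(params):
    """Return one wildcard Disallow line per layered-navigation parameter."""
    return [f"Disallow: /*?*{p}=*" for p in params]

print("\n".join(layered_nav_rules(FILTER_PARAMS)))
# Disallow: /*?*color=*
# Disallow: /*?*size=*
# ...
```

Regenerating the block whenever attributes change keeps the robots.txt in step with the catalog instead of relying on a manually maintained list.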