Overview
Shopify stores can control bot access through the robots.txt file. This guide explains how to configure your Shopify store to allow specific bots to crawl your content.
Understanding Shopify's robots.txt System
Shopify uses a theme-based approach to robots.txt management: rather than editing a standalone robots.txt file, you edit a Liquid template (robots.txt.liquid) in your theme, which Shopify renders at yourstore.com/robots.txt.
Accessing robots.txt in Shopify
1. Log into Your Shopify Admin
Navigate to your Shopify admin panel at yourstore.myshopify.com/admin.
2. Go to Online Store
In the left sidebar, click on "Online Store" to access your theme settings.
3. Access Theme Code
Click on "Themes" and find your current active theme. Open the "Actions" dropdown (the "…" button in newer versions of the admin), then select "Edit code."
4. Locate robots.txt.liquid
In the code editor, look in the "Templates" folder on the left sidebar. Find and click on robots.txt.liquid. If this file doesn't exist, you may need to create it.
Creating or Editing robots.txt.liquid
Creating the File (if needed)
If robots.txt.liquid doesn't exist:
1. In the "Templates" section, click "Add a new template"
2. Select "robots" from the dropdown
3. Click "Create template"
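When you create the template, Shopify pre-fills it with Liquid that renders the platform's default rules rather than leaving the file empty. Based on Shopify's documented default template, the contents look roughly like this:

```liquid
{% comment %} Render every robots.txt group Shopify ships by default {% endcomment %}
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```

You can add your own rules before or after this loop, or replace it entirely and write the rules by hand, as the examples below do.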
Adding Bot Permissions
Once you have the file open, you can add your user agent rules.
Example: Allowing ChatRankBot
To allow ChatRankBot to crawl your Shopify store, add the following at the top of your robots.txt.liquid file. Note that the User-agent line matches on the bot's product token (ChatRankBot), not its full user-agent string:

```liquid
User-agent: ChatRankBot
Allow: /
```

Complete Example Configuration
Here's a full robots.txt.liquid example with multiple bots and Shopify's default protections:
```liquid
User-agent: ChatRankBot
Allow: /

User-agent: ChatGPT-User
Allow: /

User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: Googlebot
Allow: /

# Shopify default rules
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkouts/
Disallow: /checkout
Disallow: /carts
Disallow: /account
Disallow: /collections/*sort_by*
Disallow: /*/collections/*sort_by*
Disallow: /collections/*+*
Disallow: /collections/*%2B*
Disallow: /collections/*%2b*
Disallow: /*/collections/*+*
Disallow: /*/collections/*%2B*
Disallow: /*/collections/*%2b*
Disallow: /blogs/*+*
Disallow: /blogs/*%2B*
Disallow: /blogs/*%2b*
Disallow: /*/blogs/*+*
Disallow: /*/blogs/*%2B*
Disallow: /*/blogs/*%2b*
Disallow: /*?*oseid=*
Disallow: /*preview_theme_id*
Disallow: /*preview_script_id*

Sitemap: https://{{ shop.domain }}/sitemap.xml
```

Preserving Shopify's Default Rules
Shopify includes default rules to protect sensitive areas. Always maintain these protections for security:
```liquid
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkouts/
Disallow: /checkout
Disallow: /carts
Disallow: /account
```
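If you would rather not copy these rules by hand, you can keep Shopify's defaults and append custom rules programmatically. The sketch below extends the default-template loop shown earlier, following Shopify's documented customization pattern; the /internal-search path is a hypothetical example:

```liquid
{% comment %}
  Render Shopify's default groups unchanged, appending one custom rule
  to the catch-all (*) group. "/internal-search" is a hypothetical path.
{% endcomment %}
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /internal-search' }}
  {%- endif -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```

This approach keeps your file in sync if Shopify later changes its defaults.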
Saving and Publishing Changes
1. After editing the robots.txt.liquid file, click "Save" in the top-right corner
2. Changes take effect immediately
3. No need to publish or deploy—the file is live as soon as you save
Avoiding Rate Limiting During Scraping
E-commerce sites like Shopify stores require careful rate limit management to ensure bot traffic doesn't impact customer experience or trigger platform protections.
Configure Crawl-delay
Add a Crawl-delay directive to specify how many seconds bots should wait between requests. Note that Crawl-delay is a non-standard directive and support varies: Bingbot honors it, but Googlebot ignores it entirely (Google's crawl rate is managed through Search Console instead).

```liquid
User-agent: ChatRankBot
Allow: /
Crawl-delay: 3
```

Shopify-Specific Rate Limiting Considerations
Shopify's Infrastructure Protection: Shopify has built-in rate limiting to protect stores. Aggressive bot crawling can trigger these protections, potentially blocking the bot or throttling requests.
Recommended Crawl Delays by Store Size:
- Small stores (< 100 products): 2-3 seconds
- Medium stores (100-1000 products): 3-5 seconds
- Large stores (1000+ products): 5-10 seconds
Consider Your Shopify Plan: Different plans have different resource allocations:
- Basic Shopify: Use crawl-delay of 5-10 seconds
- Shopify: Use crawl-delay of 3-5 seconds
- Advanced Shopify: Use crawl-delay of 2-4 seconds
- Shopify Plus: Use crawl-delay of 1-3 seconds
Set Different Delays for Different Bots
```liquid
User-agent: ChatRankBot
Allow: /products/
Allow: /collections/
Allow: /pages/
Allow: /blogs/
Crawl-delay: 3

User-agent: GPTBot
Allow: /
Crawl-delay: 5

User-agent: Googlebot
Allow: /
Crawl-delay: 2

User-agent: *
Allow: /
Crawl-delay: 10
```

Monitor Store Performance
Track bot impact using Shopify analytics:
1. Go to Analytics → Reports in your Shopify admin
2. Check "Online store sessions by traffic source"
3. Look for unusual bot traffic patterns
4. Review page load times during peak bot activity
Protect High-Load Pages
Certain pages generate more server load. Consider restricting or adding delays for these:
```liquid
User-agent: ChatRankBot
Allow: /products/
Allow: /collections/
Allow: /blogs/
Disallow: /search
Disallow: /cart
Crawl-delay: 3
```

Use Shopify's Built-in Bot Protection
Shopify automatically protects against malicious bots. Your robots.txt configuration works alongside these protections.
Peak Hour Considerations
E-commerce stores have peak shopping hours. Consider requesting that bot operators:
- Crawl during off-peak hours (typically late night/early morning in your timezone)
- Avoid crawling during promotional events or sales
- Reduce crawl rate during high-traffic periods
Contact bot operators via their user agent URL (like https://chatrank.ai/bot for ChatRankBot) to coordinate crawl schedules.
Monitor for Bot-Triggered Rate Limits
If you notice:
- Increased 429 (Too Many Requests) errors in your logs
- Legitimate bot traffic being blocked
- Store performance degradation during bot crawls
Take these steps:
1. Increase crawl-delay values in robots.txt
2. Contact the bot operator to request slower crawling
3. Review Shopify analytics for bot impact
4. Consider temporarily blocking problematic bots while resolving issues (see the snippet below)
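A temporary block looks like this; "BadBot" is a hypothetical placeholder for whichever crawler is causing problems:

```liquid
# Temporarily block a misbehaving crawler.
# "BadBot" is a hypothetical placeholder token.
User-agent: BadBot
Disallow: /
```

Remove the block once the underlying issue is resolved.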
Allowing Bots to Access Specific Content
Products Only
```liquid
User-agent: ChatRankBot
Allow: /products/
Disallow: /collections/
Disallow: /pages/
Crawl-delay: 3
```

Blog Content Only
```liquid
User-agent: ChatRankBot
Allow: /blogs/
Disallow: /products/
Disallow: /collections/
Crawl-delay: 2
```

Everything Except Checkout
```liquid
User-agent: ChatRankBot
Allow: /
Disallow: /checkout
Disallow: /cart
Disallow: /account
Crawl-delay: 4
```

Using Liquid Variables
You can use Shopify's Liquid templating in your robots.txt file:

```liquid
Sitemap: https://{{ shop.domain }}/sitemap.xml
```

This dynamically references your store's domain.
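Liquid also lets you vary rules by context. As an illustrative sketch (assuming your launched store uses a custom domain, so shop.domain only contains myshopify.com before launch), you could block all crawling of an unlaunched development store:

```liquid
{% comment %}
  Hypothetical example: block all crawling while the store still runs
  on its default myshopify.com domain (no custom domain attached yet).
{% endcomment %}
{% if shop.domain contains 'myshopify.com' %}
User-agent: *
Disallow: /
{% endif %}
```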
Complete Example with Rate Limiting
Here's a comprehensive robots.txt.liquid configuration optimized for Shopify:
```liquid
# ChatRankBot with moderate delay for e-commerce content
User-agent: ChatRankBot
Allow: /products/
Allow: /collections/
Allow: /pages/
Allow: /blogs/
Disallow: /search
Crawl-delay: 3

# AI crawlers with longer delay
User-agent: ChatGPT-User
Allow: /
Crawl-delay: 5

User-agent: GPTBot
Allow: /
Crawl-delay: 5

User-agent: ClaudeBot
Allow: /
Crawl-delay: 5

# Search engines with shorter delay (well-optimized crawlers)
# Note: Googlebot ignores Crawl-delay; manage its rate in Search Console
User-agent: Googlebot
Allow: /
Crawl-delay: 2

User-agent: Bingbot
Allow: /
Crawl-delay: 2

# Shopping comparison bots with moderate delay
User-agent: PriceGrabber
Allow: /products/
Crawl-delay: 5

User-agent: Shopzilla
Allow: /products/
Crawl-delay: 5

# Default for all bots with conservative delay
User-agent: *
Allow: /
Crawl-delay: 10
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkouts/
Disallow: /checkout
Disallow: /carts
Disallow: /account
Disallow: /collections/*sort_by*
Disallow: /*/collections/*sort_by*
Disallow: /collections/*+*
Disallow: /collections/*%2B*
Disallow: /collections/*%2b*
Disallow: /*/collections/*+*
Disallow: /*/collections/*%2B*
Disallow: /*/collections/*%2b*
Disallow: /blogs/*+*
Disallow: /blogs/*%2B*
Disallow: /blogs/*%2b*
Disallow: /*/blogs/*+*
Disallow: /*/blogs/*%2B*
Disallow: /*/blogs/*%2b*
Disallow: /*?*oseid=*
Disallow: /*preview_theme_id*
Disallow: /*preview_script_id*
Disallow: /search

Sitemap: https://{{ shop.domain }}/sitemap.xml
```

Verifying Your Configuration
Test your robots.txt file:
1. Visit https://yourstore.com/robots.txt in a browser
2. Confirm your custom rules appear correctly
3. Check that Shopify's default protections are still in place
4. Use Google Search Console's robots.txt report for validation (the standalone robots.txt Tester has been retired)
Common Bot User Agents for E-commerce
- ChatRankBot (ChatRank crawler; full user-agent string: ChatRankBot/1.0 (+https://chatrank.ai/bot))
- ChatGPT-User (ChatGPT browsing)
- GPTBot (OpenAI crawler)
- Googlebot (Google Search)
- Bingbot (Bing Search)
- PriceGrabber (Price comparison)
- Shopzilla (Shopping search engine)
- Slurp (Yahoo Search)
Advanced Rate Limiting Strategies
Coordinate with Bot Operators
For stores with large inventories or high traffic:
1. Contact bot operators via their user agent URL
2. Request scheduled crawls during specific time windows
3. Negotiate crawl rates that work for both parties
4. Set up monitoring to ensure agreed-upon rates are maintained
Monitor Shopify Analytics
Regularly check:
- Analytics → Reports → Online store sessions: Track bot traffic
- Analytics → Reports → Top online store pages: See which pages bots access most
- Store speed reports to identify bot impact on performance
Adjust During High-Traffic Events
Before major sales or product launches:
1. Temporarily increase crawl-delay values (see the example after this list)
2. Notify important bot operators of blackout periods
3. Monitor store performance during events
4. Restore normal crawl-delay values after events conclude
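For example, you might raise the delay in your existing catch-all group for the duration of the event; the value below is illustrative:

```liquid
# Temporary high-traffic-event setting; restore your normal
# Crawl-delay (e.g., 10) once the event ends.
User-agent: *
Allow: /
Crawl-delay: 30
```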
Troubleshooting
If bots aren't accessing your site as expected:
- Verify the robots.txt.liquid file saved correctly
- Check for syntax errors (proper spacing and formatting)
- Clear your browser cache and check the live robots.txt URL
- Wait 24-48 hours for bots to recrawl your robots.txt file
- Ensure your Shopify store isn't password-protected (Settings → Online Store → Preferences)
If bots are causing performance issues:
- Increase crawl-delay values significantly (10+ seconds)
- Contact bot operators to request slower crawling
- Check Shopify status page for platform-wide issues
- Review your Shopify plan and consider upgrading if consistently hitting limits
- Temporarily block problematic bots while resolving issues
- Contact Shopify support if bot traffic is impacting store performance
Security Best Practice
Always maintain Shopify's default disallow rules for admin, checkout, cart, and account areas to protect customer data and prevent unauthorized access to sensitive store functions. Never remove these protections:
```liquid
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkouts/
Disallow: /checkout
Disallow: /carts
Disallow: /account
```

Additional Considerations
Password-Protected Stores: If your store is password-protected during development, bots cannot access it regardless of robots.txt configuration. Remove password protection before expecting bot traffic.
Shopify Plus Features: Shopify Plus stores can implement additional bot management through their dedicated support team and advanced configurations.
International Stores: If you operate multiple international stores, configure robots.txt.liquid separately for each store based on regional traffic patterns and bot behavior.

