Overview
This guide explains how to configure your Framer website to allow specific bots to access and scrape your content by modifying your robots.txt file.
Understanding User Agents
Bots identify themselves using a "user agent" string when they visit your website. By configuring your robots.txt file, you can control which bots can access your site and which pages they can visit.
Steps to Configure Bot Access in Framer
Access Your Site Settings: Log into your Framer project and navigate to the site you want to configure. Click on the site settings icon in the top-right corner of the editor.
Locate SEO Settings: In the site settings panel, scroll down to find the "SEO & Social" section. This is where you'll manage your robots.txt configuration.
Edit robots.txt: Framer allows you to customize your robots.txt file directly through the settings panel. Click on "Advanced SEO" or "robots.txt" depending on your Framer plan.
Add User Agent Rules: To allow a specific bot, add the following syntax to your robots.txt:

User-agent: BotName
Allow: /

Replace "BotName" with the token the bot uses to identify itself, such as "ChatRankBot" or "GPTBot". Note that robots.txt rules match on this token, not on the bot's full user agent string.
Example: Allowing ChatRankBot
To allow ChatRankBot to crawl your Framer site, add the following to your robots.txt:
User-agent: ChatRankBot
Allow: /

This configuration tells ChatRankBot that it has permission to access all pages on your site. Even though the bot's full user agent string is "ChatRankBot/1.0 (+https://chatrank.ai/bot)", the User-agent line only needs the "ChatRankBot" token.
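You can confirm a rule like this behaves as expected before publishing. The sketch below uses Python's standard urllib.robotparser to parse the two-line rule and check access (the paths queried are arbitrary examples):

```python
from urllib import robotparser

# Parse the two-line rule from above and confirm ChatRankBot may fetch any path.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: ChatRankBot",
    "Allow: /",
])

print(rp.can_fetch("ChatRankBot", "/"))           # True
print(rp.can_fetch("ChatRankBot", "/blog/post"))  # True
```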
Complete Example Configuration
Here's a full robots.txt example with multiple bots:
User-agent: ChatRankBot
Allow: /
User-agent: ChatGPT-User
Allow: /
User-agent: GPTBot
Allow: /
User-agent: Googlebot
Allow: /
User-agent: *
Disallow: /admin/

Publish Your Changes: After configuring your robots.txt file, click "Save" or "Update" and then publish your site for the changes to take effect. Changes typically propagate within a few minutes.
Avoiding Rate Limiting During Scraping
Rate limiting protects your Framer site from being overwhelmed by too many requests. To ensure smooth bot access without triggering rate limits:
Configure Crawl-delay
Add a crawl-delay directive to your robots.txt to specify how long bots should wait between requests:
User-agent: ChatRankBot
Allow: /
Crawl-delay: 2

This tells the bot to wait 2 seconds between requests. Adjust this value based on your site's capacity. Be aware that Crawl-delay is an informal extension rather than part of the robots.txt standard, and some crawlers (notably Googlebot) ignore it.
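A well-behaved crawler can read this value programmatically. Here is a minimal sketch using Python's standard urllib.robotparser; the page list is hypothetical, and a real crawler would fetch each URL where the print statement sits:

```python
import time
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: ChatRankBot",
    "Allow: /",
    "Crawl-delay: 2",
])

# crawl_delay() returns the directive's value, or None if none was set.
delay = rp.crawl_delay("ChatRankBot") or 0

# Hypothetical page list; a real crawler would fetch each URL here.
for path in ["/", "/blog"]:
    if rp.can_fetch("ChatRankBot", path):
        print(f"fetching {path}, then waiting {delay}s")
        time.sleep(delay)
```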
Best Practices for Rate Limit Avoidance
Set Reasonable Crawl Delays: For most Framer sites, a crawl-delay of 1-5 seconds is appropriate. Larger sites with more resources can handle shorter delays.
Monitor Your Site Performance: Check your Framer site analytics to see if bot traffic is impacting performance. Increase crawl-delay if you notice issues.
Use Different Delays for Different Bots: You can set different crawl-delay values for different user agents:
User-agent: ChatRankBot
Allow: /
Crawl-delay: 2
User-agent: GPTBot
Allow: /
Crawl-delay: 5
User-agent: Googlebot
Allow: /
Crawl-delay: 1

Contact the Bot Operator
If you're experiencing rate limiting issues with a specific bot, reach out to the bot operator. Most reputable bots provide contact information in their user agent string (like the URL in ChatRankBot's user agent). They can often adjust their crawling behavior on request.
Framer-Specific Considerations
Framer sites are hosted on Framer's infrastructure, which has its own rate limiting protections. If you notice aggressive bot behavior:
- Increase crawl-delay values in robots.txt
- Contact Framer support if a bot is causing performance issues
- Consider temporarily blocking problematic bots while you resolve the issue
Alternative: Allow Only During Off-Peak Hours
While robots.txt doesn't support time-based rules, you can coordinate with bot operators to schedule crawls during your site's off-peak hours to minimize impact on real users.
Verifying Your Configuration
You can verify your robots.txt file by visiting https://yoursite.com/robots.txt in a web browser. Make sure your rules appear correctly formatted.
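If you would rather check the file programmatically, a small lint pass can catch malformed lines before you publish. This helper is a hypothetical sketch that only recognizes the directives used in this guide:

```python
KNOWN_DIRECTIVES = {"user-agent", "allow", "disallow", "crawl-delay", "sitemap"}

def lint_robots(text: str) -> list[str]:
    """Return a list of problems found; an empty list means the file looks well formed."""
    problems = []
    for lineno, raw in enumerate(text.splitlines(), start=1):
        line = raw.split("#", 1)[0].strip()  # ignore comments and blank lines
        if not line:
            continue
        if ":" not in line:
            problems.append(f"line {lineno}: missing ':' separator")
            continue
        directive = line.split(":", 1)[0].strip().lower()
        if directive not in KNOWN_DIRECTIVES:
            problems.append(f"line {lineno}: unknown directive '{directive}'")
    return problems

print(lint_robots("User-agent: ChatRankBot\nAllow: /"))  # []
print(lint_robots("User agent ChatRankBot"))             # flags the missing colon
```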
Common Bot User Agents
- ChatRankBot/1.0 (+https://chatrank.ai/bot) (ChatRank crawler)
- ChatGPT-User (ChatGPT web browsing)
- GPTBot (OpenAI's crawler)
- Googlebot (Google Search)
- Bingbot (Bing Search)
- Slurp (Yahoo Search)
Additional Considerations
Keep in mind that robots.txt is a voluntary protocol. While reputable bots will respect your configuration, malicious scrapers may ignore it. If you need to block specific bots, you can use Disallow: / instead of Allow: / for their user agent.
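To confirm a block behaves as intended, the same urllib.robotparser check applies. The "BadBot" token below is hypothetical:

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: BadBot",
    "Disallow: /",
])

print(rp.can_fetch("BadBot", "/"))        # False
print(rp.can_fetch("SomeOtherBot", "/"))  # True (no rule applies to it)
```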
Complete Example with Rate Limiting
Here's a comprehensive robots.txt configuration for a Framer site:
# Allow ChatRankBot with moderate crawl delay
User-agent: ChatRankBot
Allow: /
Crawl-delay: 2
# Allow AI crawlers with slightly longer delay
User-agent: ChatGPT-User
Allow: /
Crawl-delay: 3
User-agent: GPTBot
Allow: /
Crawl-delay: 3
# Allow search engines with minimal delay
User-agent: Googlebot
Allow: /
Crawl-delay: 1
User-agent: Bingbot
Allow: /
Crawl-delay: 1
# Default for all other bots
User-agent: *
Allow: /
Crawl-delay: 5
Disallow: /admin/

This configuration ensures that your site remains accessible to important bots while protecting against excessive crawling that could impact site performance.
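As a final sanity check, a configuration like this can be exercised with Python's standard urllib.robotparser. One caveat: robotparser applies Allow/Disallow rules in file order (first match wins), so in this sketch the more specific Disallow line comes before the catch-all Allow in the * group; Google instead uses longest-match precedence.

```python
from urllib import robotparser

config = """\
User-agent: ChatRankBot
Allow: /
Crawl-delay: 2

User-agent: GPTBot
Allow: /
Crawl-delay: 3

User-agent: Googlebot
Allow: /
Crawl-delay: 1

User-agent: *
Disallow: /admin/
Allow: /
Crawl-delay: 5
"""

rp = robotparser.RobotFileParser()
rp.parse(config.splitlines())

print(rp.can_fetch("GPTBot", "/blog"))        # True
print(rp.can_fetch("RandomBot", "/admin/x"))  # False
print(rp.crawl_delay("GPTBot"))               # 3
print(rp.crawl_delay("RandomBot"))            # 5
```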


