# To customize, create a robots.txt in your document root
# Limit aggressive scrapers, SEO tools, and AI trainers
User-agent: AdsBot-Google
User-agent: AhrefsBot
User-agent: Aliyun
User-agent: AliyunSecBot
User-agent: Amazonbot
User-agent: anthropic-ai
User-agent: Applebot
User-agent: Baiduspider
User-agent: Barkrowler
User-agent: BLEXBot
User-agent: Bytespider
User-agent: CCBot
User-agent: ChatGPT-User
User-agent: ClaudeBot
User-agent: CrawlBot
User-agent: DataForSeoBot
User-agent: DotBot
User-agent: DuckDuckGo
User-agent: Exabot
User-agent: facebookexternalhit
User-agent: FacebookBot
User-agent: Facebot
User-agent: FeedFetcher-Google
User-agent: Freshbot
User-agent: Googlebot-Image
User-agent: GoogleOther
User-agent: GPTBot
User-agent: Grapeshot
User-agent: ias-va
User-agent: Mediapartners-Google
User-agent: meta-externalagent
User-agent: MJ12bot
User-agent: newsai
User-agent: PetalBot
User-agent: Pinterestbot
User-agent: proximic
User-agent: Scrapy
User-agent: SEBot-WA
User-agent: SemrushBot
User-agent: SentiBot
User-agent: SeoSiteCheckup
User-agent: SERankingBacklinksBot
User-agent: Sogou
User-agent: TelegramBot
User-agent: TikTokSpider
User-agent: Trendictionbot
User-agent: Twitterbot
User-agent: Verity
User-agent: YandexBot
Crawl-delay: 300
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /cache/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-login.php
Disallow: /private/
Disallow: /admin/
Disallow: /includes/
Disallow: /config/
Allow: /

# Default rules for all other bots
User-agent: *
Crawl-delay: 60
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /cache/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-login.php
Disallow: /private/
Disallow: /admin/
Disallow: /includes/
Disallow: /config/
Allow: /