# robots.txt for seoimage.itsolution.sg
# Last updated: February 2026
# Purpose: Optimize crawler access while protecting sensitive site areas.

# Section 1: General User-Agent Rules
User-agent: *
Allow: /
Allow: /privacy-policy
Allow: /terms-of-service
Allow: /cookie-policy
Allow: /disclaimer
Allow: /login
Allow: /register
Allow: /public/
Disallow: /admin/
Disallow: /api/
Disallow: /temp/
Disallow: /test/
Disallow: /backup/
Crawl-delay: 10

# Section 2: AI Crawler Rules

# GPTBot (OpenAI's crawler, used for model training)
User-agent: GPTBot
Disallow: /admin/
Disallow: /api/
Disallow: /private/
Allow: /
Crawl-delay: 10

# ChatGPT-User (fetches pages on behalf of ChatGPT users)
User-agent: ChatGPT-User
Disallow: /admin/
Disallow: /api/
Disallow: /private/
Allow: /
Crawl-delay: 10

# Claude-Web (Anthropic's AI web crawler)
User-agent: Claude-Web
Disallow: /admin/
Disallow: /api/
Disallow: /private/
Allow: /
Crawl-delay: 5

# anthropic-ai (general Anthropic bot)
User-agent: anthropic-ai
Disallow: /admin/
Disallow: /api/
Disallow: /private/
Allow: /
Crawl-delay: 10

# CCBot (Common Crawl crawler; its dataset feeds various AI tools)
User-agent: CCBot
Disallow: /admin/
Disallow: /api/
Disallow: /private/
Disallow: /test/
Allow: /
Crawl-delay: 5

# Google-Extended (token controlling use of content for Google AI training)
User-agent: Google-Extended
Disallow: /admin/
Disallow: /api/
Disallow: /private/
Allow: /
Crawl-delay: 2

# Googlebot (standard Google crawler; note that Googlebot ignores Crawl-delay)
User-agent: Googlebot
Allow: /
Disallow: /admin/
Disallow: /api/

# Bingbot (Microsoft Bing crawler)
User-agent: Bingbot
Allow: /
Disallow: /admin/
Disallow: /api/

# Section 3: Sitemap and Additional Files
Sitemap: https://seoimage.itsolution.sg/sitemap.xml

# Section 4: AI Policy Files
# AI systems should also check:
# - /ai.txt for AI interaction policies
# - /llms.txt for LLM usage guidelines