# robots.txt for ConteinCar Design
# https://conteincar.com/robots.txt

User-agent: *
Allow: /

# Disallow admin and sensitive areas
Disallow: /admin/
Disallow: /private/
Disallow: /temp/
Disallow: /backup/

# Allow crawling of important sections
Allow: /blog/
Allow: /assets/

# Disallow file types that shouldn't be indexed
# (the longer Allow match keeps the sitemap reachable)
Allow: /sitemap.xml
Disallow: /*.json$
Disallow: /*.xml$
Disallow: /*.txt$
Disallow: /*.log$

# Crawl-delay for polite crawling (nonstandard; ignored by Googlebot)
Crawl-delay: 1

# Sitemap location
Sitemap: https://conteincar.com/sitemap.xml

# Specific rules for different search engines
User-agent: Googlebot
Allow: /

User-agent: Bingbot
Crawl-delay: 1
Allow: /

User-agent: Slurp
Crawl-delay: 2
Allow: /

# Block malicious bots
User-agent: BadBot
Disallow: /

User-agent: MJ12bot
Disallow: /

User-agent: SemrushBot
Disallow: /

User-agent: AhrefsBot
Disallow: /

# Allow social media crawlers
User-agent: facebookexternalhit
Allow: /

User-agent: Twitterbot
Allow: /

User-agent: LinkedInBot
Allow: /