# robots.txt
# the only useful, allowed directory is wwwroot

User-agent: rogerbot
User-agent: dotbot
User-agent: AhrefsBot
User-agent: Yandex
User-agent: Baidu
User-agent: SemrushBot
User-agent: PetalBot
User-agent: LightspeedSystemsCrawler
Disallow: /

User-agent: *
Crawl-delay: 10