# Robots.txt file for Lead Tracker CRM

# 1. Allow all bots to access all public content
User-agent: *
Disallow:

# 2. Disallow access to sensitive directories
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /admin/
Disallow: /scripts/
Disallow: /private/
Disallow: /config/

# 3. Allow Googlebot to access all content
User-agent: Googlebot
Disallow:

# 4. Sitemap location
Sitemap: https://leadtrackercrm.com/sitemap.xml

# 5. Allow Bingbot to access all content
User-agent: Bingbot
Disallow:

# 6. Allow Yahoo Slurp to access all content
User-agent: Slurp
Disallow:

# 7. Block malicious or unwanted crawlers
User-agent: BadBot
Disallow: /

User-agent: EvilCrawler
Disallow: /

# 8. Block image indexing to save bandwidth
User-agent: Googlebot-Image
Disallow: /

User-agent: Bingbot-Image
Disallow: /

User-agent: Slurp-Image
Disallow: /

# 9. Allow Google News to index content
User-agent: Googlebot-News
Disallow:

# 10. Prevent duplicate content issues (applies to all bots)
User-agent: *
Disallow: /*?*
Disallow: /*.php$
Disallow: /search/

# 11. Crawl-delay for specific bots (optional)
User-agent: Bingbot
Crawl-delay: 10

User-agent: Yandex
Crawl-delay: 10

# 12. Block URL parameters for better crawl efficiency (applies to all bots)
User-agent: *
Disallow: /*?utm_source=
Disallow: /*?sessionid=