# robots.txt — Ammann (version 1, 2025-08-28)
# ----------------------------------------------------

# 1. Specific group for Googlebot
User-agent: Googlebot
Allow: /_next/static/
Allow: /*.css$
Allow: /*.js$
Disallow: /login
Disallow: /api/
Disallow: /search
Disallow: /*?state=
Disallow: /*?session_id=

# 2. Specific group for Bingbot
User-agent: Bingbot
Allow: /_next/static/
Allow: /*.css$
Allow: /*.js$
Disallow: /login
Disallow: /api/
Disallow: /search

# 3. Specific group for Yandex
User-agent: Yandex
# Remove tracking and session duplicates
Clean-param: utm_source&utm_medium&utm_campaign /
Clean-param: state /login
Disallow: /login
Disallow: /api/
Disallow: /search

# 4. Specific group for Baiduspider
User-agent: Baiduspider
Disallow: /login
Disallow: /api/
Disallow: /search

# 5. Fallback group for all other crawlers
User-agent: *
Disallow: /login
Disallow: /api/
Disallow: /search
Disallow: /internal/
Disallow: /admin/
Disallow: /cart
Disallow: /checkout
Disallow: /servicelink/
Disallow: /shop/
Disallow: /my/
Disallow: /www-staging/

# 6. Sitemap index
Sitemap: https://www.ammann.com/sitemap.xml