# llms.txt

## About
https://www.speedhausauto.com/: Speedhaus Automotive specializes in automotive performance upgrades, maintenance, and engine repairs, with a focus on European vehicles such as Mercedes, BMW, Porsche, and Audi. They offer expert installation of high-performance parts, ECU tuning, and turbo installations. Their services also include comprehensive vehicle care, customizations, and routine maintenance. Speedhaus serves customers in Thousand Oaks, Malibu, and Newbury Park, California.

## 1. LLM Directives
- User-agent: *
- Allow: /
- Disallow: /register
- Disallow: /checkout
- Disallow: /login
- Disallow: /api/
- Disallow: /wp-admin
- Disallow: /wp-login.php
- Crawl-delay: 5
- Sitemap: https://www.speedhausauto.com/sitemap.xml
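The directives above follow robots.txt syntax, so they can be sanity-checked with Python's standard-library parser. A minimal sketch (note that `urllib.robotparser` applies rules in listed order rather than by longest prefix, so the Disallow lines are placed before the catch-all Allow):

```python
from urllib.robotparser import RobotFileParser

# The directives from Section 1, in robots.txt form. Disallow rules are
# listed before the catch-all Allow because Python's parser returns the
# first matching rule's allowance.
DIRECTIVES = """\
User-agent: *
Disallow: /register
Disallow: /checkout
Disallow: /login
Disallow: /api/
Disallow: /wp-admin
Disallow: /wp-login.php
Allow: /
Crawl-delay: 5
"""

rp = RobotFileParser()
rp.parse(DIRECTIVES.splitlines())

# Public content is fetchable; transactional paths are not.
print(rp.can_fetch("*", "https://www.speedhausauto.com/services"))
print(rp.can_fetch("*", "https://www.speedhausauto.com/checkout"))
print(rp.crawl_delay("*"))
```

Running the same check against the live robots.txt before each deploy catches accidental divergence between the two files.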

## 2. Site Structure
The canonical list of crawlable URLs is maintained in the XML sitemap at https://www.speedhausauto.com/sitemap.xml. Content is grouped into the sections listed under Guidance for Developers below (Products, Services, About, Blog, Careers).
## 3. Pages to Avoid

Avoid crawling any paths that:

- Require authentication or sessions (/account/, /dashboard, /admin/, /wp-login.php, etc.)
- Are checkout/cart related or intermediate forms
- Include query-string-heavy endpoints (search, filter, or session-ID parameters)
- Are asset-heavy (e.g., images, CSS, JavaScript), so crawler bandwidth stays focused on text content
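The avoid-list above can be sketched as a URL filter. The prefixes, suffixes, and query keys below are illustrative assumptions combining Sections 1 and 3, not a complete policy:

```python
from urllib.parse import urlparse, parse_qs

# Illustrative path prefixes drawn from the Disallow rules in Section 1
# and the avoid-list in Section 3 (assumptions, not an exhaustive policy).
PRIVATE_PREFIXES = ("/account/", "/dashboard", "/admin/", "/wp-admin",
                    "/wp-login.php", "/login", "/register", "/api/")
TRANSACTIONAL_PREFIXES = ("/checkout", "/cart")
ASSET_SUFFIXES = (".jpg", ".jpeg", ".png", ".gif", ".css", ".js")
BLOCKED_QUERY_KEYS = {"q", "search", "filter", "sessionid", "sid"}

def should_crawl(url: str) -> bool:
    """Return True if the URL looks like public, text-focused content."""
    parsed = urlparse(url)
    path = parsed.path.lower()
    if path.startswith(PRIVATE_PREFIXES + TRANSACTIONAL_PREFIXES):
        return False
    if path.endswith(ASSET_SUFFIXES):
        return False
    # Reject query-string-heavy endpoints (search, filters, session IDs).
    if BLOCKED_QUERY_KEYS & set(parse_qs(parsed.query)):
        return False
    return True
```

For example, `should_crawl("https://www.speedhausauto.com/services/ecu-tuning")` passes, while checkout pages, session-ID URLs, and image assets are filtered out.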

## 4. Guidance for Developers
- Ensure XML sitemap is up-to-date and accessible at /sitemap.xml or /sitemap_index.xml
- Steer clear of overly dynamic URLs; prefer human-readable static URLs
- Embed contextual metadata (e.g., schema.org Product, Breadcrumbs) to improve semantic parsing
- Group content into clear sections (Products, Services, About, Blog, Careers) for structured LLM consumption
- Include <meta name="robots" content="index, follow"> on priority pages
- For non-priority or form pages, use <meta name="robots" content="noindex, nofollow"> or server-side restrictions
- Regularly monitor server logs for LLM crawler behavior; tune Crawl-delay or disallowed paths as needed
- Ensure consistency with robots.txt and sitemap.xml
- Version control this llms.txt alongside robots.txt and sitemap.xml to align site-crawling policies
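One way to enforce the consistency called for in the last two points is an automated cross-check of the sitemap against the Disallow rules. A minimal sketch, assuming a standard sitemap.xml layout; the inline XML stands in for the fetched file:

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

# Disallow prefixes from Section 1.
DISALLOWED = ("/register", "/checkout", "/login", "/api/",
              "/wp-admin", "/wp-login.php")

# Stand-in for the fetched https://www.speedhausauto.com/sitemap.xml
# (illustrative URLs, not the real sitemap contents).
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.speedhausauto.com/services</loc></url>
  <url><loc>https://www.speedhausauto.com/about</loc></url>
</urlset>"""

def sitemap_conflicts(xml_text: str) -> list:
    """Return sitemap URLs whose paths fall under a Disallow prefix."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_text)
    locs = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]
    return [u for u in locs if urlparse(u).path.startswith(DISALLOWED)]

conflicts = sitemap_conflicts(SITEMAP_XML)
```

An empty `conflicts` list means every sitemap URL is actually crawlable under the current directives; running this in CI keeps llms.txt, robots.txt, and sitemap.xml aligned.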

## 5. Summary and Best Practices

- **Permit** LLM crawling for all public, content‑rich pages.
- **Prevent** crawling of private or form‑filled endpoints.
- Use **Crawl‑delay** to protect server load and, where the crawler supports it, **Clean‑param** to avoid duplicate content from URL parameters.
- Keep the **priority list** in sync with sitemap updates.
- **Document and version** llms.txt for alignment across SEO, dev, and content teams.

**Last Updated:** 7 Oct 2025