# SEO Audit Skill
Audit websites for SEO, technical, content, performance, and security issues using the SEOmator CLI.
SEOmator provides comprehensive website auditing by analyzing website structure and content against 134 rules across 18 categories.
It returns a list of issues with severity levels, affected URLs, and actionable fix suggestions.
## What This Skill Does
This skill enables AI agents to audit websites for 134 rules in 18 categories, including:
- Core SEO: Canonical URLs, indexing directives, title uniqueness
- Meta Tags: Title, description, viewport, favicon, canonical
- Headings: H1 presence, heading hierarchy, keyword usage
- Technical SEO: robots.txt, sitemap.xml, URL structure, 404 pages
- Core Web Vitals: LCP, CLS, FCP, TTFB, INP measurements
- Links: Broken links, redirect chains, anchor text, orphan pages
- Images: Alt text, dimensions, lazy loading, modern formats
- Security: HTTPS, HSTS, CSP, external link safety, leaked secrets
- Structured Data: Schema.org markup, Article, Organization, FAQ, Product
- Social: Open Graph tags, Twitter cards, share buttons, profile links
- Content: Word count, readability, keyword density, author info
- Accessibility: ARIA labels, color contrast, form labels, landmarks
- Performance: DOM size, CSS optimization, font loading, preconnect
- Crawlability: Sitemap conflicts, indexability signals, canonical chains
- URL Structure: Keyword slugs, stop words
- Mobile: Font sizes, horizontal scroll, intrusive interstitials
- Internationalization: lang attribute, hreflang tags
- Legal Compliance: Cookie consent, privacy policy, terms of service
The audit crawls the website, analyzes each page against audit rules, and returns a comprehensive report with:
- Overall health score (0-100) with letter grade (A-F)
- Category breakdowns with pass/warn/fail counts
- Specific issues with affected URLs grouped by rule
- Actionable fix recommendations
## When to Use
Use this skill when you need to:
- Analyze a website's SEO health
- Debug technical SEO issues
- Check for broken links
- Validate meta tags and structured data
- Audit security headers and HTTPS
- Check accessibility compliance
- Generate site audit reports
- Compare site health before/after changes
- Improve website performance, accessibility, SEO, security and more
## Prerequisites
This skill requires the SEOmator CLI to be installed.
### Installation

```bash
npm install -g @seomator/seo-audit
```
### Verify Installation

Check that seomator is installed and the system is ready by running `seomator self doctor`. This checks:
- Node.js version (18+ recommended)
- npm availability
- Chrome/Chromium for Core Web Vitals
- Write permissions for ~/.seomator
- Local config file presence
## Setup

Running `seomator init` creates a config file in the current directory.

```bash
seomator init                     # Interactive setup
seomator init -y                  # Use defaults
seomator init --preset blog       # Blog-optimized config
seomator init --preset ecommerce  # E-commerce config
seomator init --preset ci         # Minimal CI config
```
If there is no config file in the directory, CREATE ONE with `seomator init` before running audits.
## Usage

### AI Agent Best Practices

YOU SHOULD always prefer `--format llm` - it provides token-optimized XML output specifically designed for AI agents (50-70% smaller than JSON).
When auditing:
- Prefer live websites over local dev servers for accurate performance and rendering data
- Use `--no-cwv` for faster audits when Core Web Vitals aren't needed
- Scope fixes as concurrent tasks when implementing multiple fixes
- Run typechecking/formatting after implementing fixes (tsc, eslint, prettier, etc.)
### Website Discovery
If the user doesn't provide a website to audit:
- Check for local dev server configurations (package.json scripts, .env files)
- Look for Vercel/Netlify project links
- Check environment variables for deployment URLs
- Ask the user which URL to audit
If you have both local and live websites available, suggest auditing the live site for accurate results.
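The discovery steps above can be sketched as a small shell helper. This is an illustrative sketch only - the file names scanned (`.env`, `package.json`) and the URL regex are assumptions, not SEOmator features:

```bash
# Illustrative helper (not part of SEOmator): scan common project files
# for candidate deployment URLs to suggest as audit targets.
find_candidate_urls() {
  # -h: omit filename prefix, -o: print only the matched URL;
  # errors from missing files are silenced, duplicates removed.
  grep -hoE "https?://[^\"' ]+" .env package.json 2>/dev/null | sort -u
}
```

If this turns up both a localhost URL and a live URL, suggest the live one for accurate results.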
### Basic Workflow

```bash
# Quick single-page audit with LLM output
seomator audit https://example.com --format llm --no-cwv

# Multi-page crawl (up to 50 pages)
seomator audit https://example.com --crawl -m 50 --format llm --no-cwv

# Full audit with Core Web Vitals
seomator audit https://example.com --crawl -m 20 --format llm
```
### Advanced Options

Force fresh crawl (ignore cache):

```bash
seomator audit https://example.com --refresh --format llm
```

Resume interrupted crawl:

```bash
seomator audit https://example.com --resume --format llm
```

Save HTML report for sharing:

```bash
seomator audit https://example.com --format html -o report.html
```

Verbose output for debugging:

```bash
seomator audit https://example.com --format llm -v
```
## Command Reference

### Audit Command Options

| Option | Alias | Description | Default |
|---|---|---|---|
| `--format` | | Output format: console, json, html, markdown, llm | console |
| | `-m` | Maximum pages to crawl | 10 |
| `--crawl` | | Enable multi-page crawl | false |
| `--refresh` | | Ignore cache, fetch fresh | false |
| `--resume` | | Resume interrupted crawl | false |
| `--no-cwv` | | Skip Core Web Vitals | false |
| | | Show progress | false |
| | `-o` | Output file path | |
| | | Config file path | |
| | | Save to ~/.seomator | false |
### Other Commands

```bash
seomator init           # Create config file
seomator self doctor    # Check system setup
seomator config --list  # Show all config values
seomator report --list  # List past reports
seomator db stats       # Show database statistics
```
## Output Formats

| Format | Flag | Best For |
|---|---|---|
| console | `--format console` | Human terminal output (default) |
| json | `--format json` | CI/CD, programmatic processing |
| html | `--format html` | Standalone reports, sharing |
| markdown | `--format markdown` | Documentation, GitHub |
| llm | `--format llm` | AI agents (recommended) |

The `llm` output is a compact XML format optimized for token efficiency:
- 50-70% smaller than JSON output
- Issues sorted by severity (critical first)
- Fix suggestions included for each issue
- Clean stdout for piping to AI tools
## Examples

### Example 1: Quick Audit with LLM Output

```bash
# User asks: "Check example.com for SEO issues"
seomator audit https://example.com --format llm --no-cwv
```

### Example 2: Deep Crawl for Large Site

```bash
# User asks: "Do a thorough audit with up to 100 pages"
seomator audit https://example.com --crawl -m 100 --format llm --no-cwv
```

### Example 3: Fresh Audit After Changes

```bash
# User asks: "Re-audit the site, ignore cached results"
seomator audit https://example.com --refresh --format llm --no-cwv
```

### Example 4: Generate Shareable Report

```bash
# User asks: "Create an HTML report I can share"
seomator audit https://example.com --crawl -m 20 --format html -o seo-report.html
```
## Evaluating Results

### Score Ranges

| Score | Grade | Meaning |
|---|---|---|
| 90-100 | A | Excellent - Minor optimizations only |
| 80-89 | B | Good - Address warnings |
| 70-79 | C | Needs Work - Priority fixes required |
| 50-69 | D | Poor - Multiple critical issues |
| 0-49 | F | Critical - Major problems to resolve |
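The score-to-grade mapping in the table above can be expressed as a small helper for post-processing scores. This is an illustrative sketch assuming the boundaries shown; `score_to_grade` is not a SEOmator command:

```bash
# Illustrative helper: map a 0-100 health score to its letter grade,
# following the boundaries in the table above.
score_to_grade() {
  local score=$1
  if   [ "$score" -ge 90 ]; then echo "A"
  elif [ "$score" -ge 80 ]; then echo "B"
  elif [ "$score" -ge 70 ]; then echo "C"
  elif [ "$score" -ge 50 ]; then echo "D"
  else echo "F"
  fi
}

score_to_grade 85   # prints "B"
```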
### Priority Order (by category weight)

Fix issues in this order for maximum impact:

1. Core Web Vitals (11%) - User experience + ranking
2. Links (9%) - Internal linking structure
3. Images (9%) - Performance + accessibility
4. Security (9%) - Trust signals
5. Meta Tags (8%) - Search visibility
6. Technical SEO (8%) - Crawling foundation
7. Structured Data (5%) - Rich snippets
8. Accessibility (5%) - WCAG compliance
9. Performance (5%) - Static optimization
10. Content (5%) - Text quality
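One way to rank categories for a specific site is a simple headroom heuristic: weight times remaining points. This is an illustrative heuristic, not SEOmator's scoring formula; the function name and arithmetic are assumptions:

```bash
# Illustrative heuristic: estimate the potential score impact of fixing a
# category as weight_percent * (100 - category_score) / 100 (integer math).
estimate_impact() {
  local weight=$1 score=$2
  echo $(( weight * (100 - score) / 100 ))
}

# A failing Core Web Vitals category (score 40, weight 11%) has more
# headroom than a mostly-passing Content category (score 90, weight 5%).
estimate_impact 11 40   # prints 6
estimate_impact 5 90    # prints 0
```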
### Fix by Severity
- Failures (status: "fail") - Must fix immediately
- Warnings (status: "warn") - Should fix soon
- Passes (status: "pass") - No action needed
## Output Summary
After implementing fixes, give the user a summary of all changes made.
When planning scope, organize tasks so they can run concurrently as sub-agents to speed up implementation.
## Troubleshooting

### `seomator` command not found
If you see this error, seomator is not installed or not in your PATH.
Solution:

```bash
npm install -g @seomator/seo-audit
```
### Core Web Vitals not measured
If CWV metrics are missing, Chrome/Chromium may not be available.
Solution:
- Install Chrome, Chromium, or Edge
- Run `seomator self doctor` to verify browser detection
- Use `--no-cwv` to skip CWV if not needed
### Crawl timeout or slow performance
For large sites, audits may take several minutes.
Solution:
- Enable progress output to watch crawl status
- Limit pages with `-m` for faster results
- Use `--no-cwv` to skip browser-based measurements
### Invalid URL

Ensure the URL includes the protocol:

```bash
# Wrong
seomator audit example.com

# Correct
seomator audit https://example.com
```
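When taking URLs from users or config files, a defensive normalization step avoids this error up front. A minimal sketch - `normalize_url` is illustrative and assumes HTTPS as the default scheme:

```bash
# Illustrative helper: prepend https:// when the user omits the scheme,
# leaving URLs that already have one untouched.
normalize_url() {
  case "$1" in
    http://*|https://*) echo "$1" ;;
    *)                  echo "https://$1" ;;
  esac
}

normalize_url example.com          # prints "https://example.com"
normalize_url https://example.com  # unchanged
```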
## How It Works

1. Fetch: Downloads the page HTML and measures response time
2. Parse: Extracts DOM, meta tags, links, images, structured data
3. Crawl (if enabled): Discovers and fetches linked pages
4. Analyze: Runs 134 audit rules against each page
5. Score: Calculates category and overall scores
6. Report: Generates output in requested format
Results are stored in `~/.seomator` for later retrieval with `seomator report --list`.
## Resources

- Full rules reference: see the rules reference document for all 134 rules
- Storage architecture: see `docs/STORAGE-ARCHITECTURE.md` for database details
- CLI help: run `seomator --help` and `seomator <command> --help`