Technical SEO Auditor

Overview

Use this skill to perform evidence-led technical SEO audits for websites, templates, individual URLs, migrations, traffic drops, indexation issues, JavaScript-heavy pages, and export-based investigations. The goal is not to produce a generic checklist; it is to connect observed technical signals to SEO risk, business impact, prioritized fixes, and validation steps.

Inputs Supported

Accept any combination of:

- URLs or representative pages
- Source HTML files or snippets
- Rendered HTML files or snippets
- Screaming Frog exports
- Google Search Console exports
- URL Inspection evidence
- robots.txt and sitemaps
- HAR files, screenshots, logs, or other evidence

If evidence is incomplete, audit what is available and label missing evidence clearly. Do not pretend to have crawled, rendered, inspected, or measured anything that was not provided or executed.

Evidence Rule

Every finding must cite the observed signal and source. Separate confirmed findings from hypotheses.

Use this standard:

- Confirmed: directly observed in the provided evidence or an executed check; cite the exact signal and source.
- Hypothesis: plausible but unverified; state what additional evidence would confirm or refute it.

Do not diagnose from source HTML alone when rendering could change the result: JavaScript execution, hydration, tags injected by GTM, or post-load changes to canonicals, meta robots directives, or structured data. Compare source HTML and rendered HTML whenever possible.
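The source-vs-rendered comparison can be scripted. The sketch below uses only the Python standard library and assumes both HTML variants have already been captured separately (for example, source HTML via an HTTP client and rendered HTML via a headless browser); the class and function names are illustrative, not part of any tool.

```python
from html.parser import HTMLParser

class SEOTagParser(HTMLParser):
    """Collects the title, canonical href, and meta robots value from HTML."""
    def __init__(self):
        super().__init__()
        self.tags = {"title": None, "canonical": None, "robots": None}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "link" and (a.get("rel") or "").lower() == "canonical":
            self.tags["canonical"] = a.get("href")
        elif tag == "meta" and (a.get("name") or "").lower() == "robots":
            self.tags["robots"] = a.get("content")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.tags["title"] = (self.tags["title"] or "") + data.strip()

def diff_seo_tags(source_html: str, rendered_html: str) -> dict:
    """Return {tag: (source_value, rendered_value)} for tags that differ."""
    def extract(html):
        parser = SEOTagParser()
        parser.feed(html)
        return parser.tags
    src, ren = extract(source_html), extract(rendered_html)
    return {k: (src[k], ren[k]) for k in src if src[k] != ren[k]}
```

A non-empty diff (for example, a `noindex` or canonical that appears only after rendering) is exactly the class of finding this rule is meant to catch.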

Audit Areas

Audit exactly these ten areas unless the user explicitly narrows the scope.

  1. Analytics/tracking: Verify GA4, GTM, consent mode, conversion events, duplicate tags, cross-domain tracking, ecommerce events, search/referral attribution, internal traffic filters, and whether SEO landing pages can be measured reliably.

  2. Rendering/JS SEO: Compare source HTML to rendered HTML. Check whether titles, meta descriptions, canonicals, robots directives, headings, body content, links, pagination, schema, lazy-loaded content, and navigation are present after rendering and accessible without fragile user interaction.

  3. Crawlability/indexability: Check status codes, redirects, redirect chains, blocked resources, robots.txt, meta robots, X-Robots-Tag, canonical targets, noindex/nofollow, sitemap inclusion, orphan risk, crawl depth, parameter handling, faceted navigation, pagination, and Google URL Inspection evidence.

  4. On-page technical: Check title tags, meta descriptions, H1/H2 hierarchy, duplicate or missing metadata, URL structure, internal links, image alt text, image dimensions, broken resources, content duplication, pagination elements, anchor text, and template-level issues.

  5. Schema: Validate structured data type, syntax, required and recommended properties, nesting, duplicates, conflicts with visible content, entity consistency, eligibility for rich results, and differences between source and rendered schema.

  6. Mobile/viewport: Check responsive rendering, viewport tag, tap targets, font sizes, intrusive interstitials, mobile navigation, sticky UI overlap, layout shifts on mobile, mobile parity with desktop content, and mobile crawl/render evidence.

  7. Performance signals: Review Core Web Vitals only when field data or reliable lab data is available. Check LCP, INP, CLS, TTFB, render-blocking resources, image optimization, font loading, JavaScript cost, caching, CDN behavior, and template-level bottlenecks.

  8. Trust/quality: Check author/reviewer signals, organization information, contact and policy pages, citations, editorial transparency, thin or duplicated content, intrusive ads, affiliate disclosure, outdated content, reputation-sensitive claims, and alignment between page purpose and visible evidence.

  9. Security/foundations: Check HTTPS coverage, mixed content, canonical protocol consistency, www/non-www consistency, HSTS where relevant, security headers, broken TLS, soft 404s, server errors, CDN/proxy anomalies, staging leakage, and environment-specific blocks.

  10. International/site architecture: Check hreflang, language/region targeting, canonical-hreflang consistency, subdomain/subfolder structure, navigation taxonomy, hub/category architecture, breadcrumbs, crawl depth, internal PageRank flow, sitemap architecture, and duplicate regional or localized content.
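To make area 10 concrete: the hreflang return-link check is mechanical once annotations are extracted. The sketch below assumes the annotations have already been pulled from rendered HTML or a crawl export into a simple mapping; the mapping shape and function name are illustrative, not any tool's API.

```python
def hreflang_return_link_errors(annotations):
    """Find hreflang targets that do not link back to the referring URL.

    `annotations` maps each URL to its hreflang map, e.g.
    {"https://example.com/en/": {"en": ".../en/", "de": ".../de/"}, ...}.
    Annotations without a return link on the target page are treated as
    invalid by search engines, so every target must reference the source
    page somewhere in its own hreflang set.
    """
    errors = []
    for url, langs in annotations.items():
        for lang, target in langs.items():
            target_langs = annotations.get(target, {})
            if url not in target_langs.values():
                errors.append((url, lang, target))
    return errors
```

Each tuple in the result is a one-way annotation: the source URL, the language code it declared, and the target that never linked back.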

Priority Rubric

Assign one priority and one effort to every finding.

Priority: Critical, High, Medium, or Low.

Effort: S (small), M (medium), or L (large).

When business impact is unknown, state the assumption used for priority.

Output Format

Use this structure for the final audit.

1. Executive Summary

Include the highest-impact confirmed issues, what is likely affecting SEO performance, what should be fixed first, and what evidence was missing. Keep it short and business-focused.

2. Priority Matrix

Create a table with the columns: Priority, Finding, Area, Impact, Effort, Evidence.

3. Finding Cards

For each finding, use:

- Finding and the affected URLs or templates
- Observed signal and evidence source
- Status: confirmed or hypothesis (and what would confirm it)
- SEO risk and likely business impact
- Recommended fix
- Priority and effort
- Validation step and pass condition

4. Fix-Validation-Tool Table

Create a table with the columns: Fix, Validation tool, How to validate, Pass condition.

Use tools from the validation methods section where relevant.

5. Quick Wins Under 1 Hour

List only fixes that appear realistically achievable in under one hour. If none are confirmed, say so and explain why.

6. 30-Day Action Plan

Group work into Week 1, Week 2, Week 3, and Week 4. Sequence critical fixes, validation, monitoring, and follow-up crawls. Include dependencies and data needed.

7. Evidence Appendix

Summarize all inputs used, tool exports reviewed, dates if known, URLs or templates covered, and evidence gaps. Include hypotheses that need additional proof.

Validation Methods And Tools

Use the most appropriate validation method for each finding:

- Screaming Frog crawl or list mode: status codes, redirects, metadata, canonicals, and internal links
- Google Search Console URL Inspection: Google's crawl, render, canonical, and indexing status
- Rich Results Test: structured data syntax and rich-result eligibility
- PageSpeed Insights / Lighthouse: lab performance data, plus CrUX field data where available
- Browser rendered HTML (DevTools or a headless browser): source vs rendered comparison
- GA4 DebugView and GTM preview mode: tag firing, events, and consent behavior
- Server log files: real crawler behavior and crawl frequency
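For example, redirect chains flagged in area 3 can be confirmed from a Screaming Frog export. The sketch below assumes the export has already been reduced to a URL-to-redirect-target mapping; the function name and mapping shape are illustrative.

```python
def redirect_chains(redirects, max_hops=10):
    """Resolve each redirecting URL to its final target and hop count.

    `redirects` maps a URL to the URL it 3xx-redirects to. Returns
    {start: (final_url, hops, is_loop)}. Chains longer than one hop and
    any loops are worth fixing, since crawlers may abandon long chains
    and loops waste crawl budget entirely.
    """
    results = {}
    for start in redirects:
        seen = {start}
        current = start
        hops = 0
        loop = False
        while current in redirects and hops < max_hops:
            current = redirects[current]
            hops += 1
            if current in seen:
                loop = True
                break
            seen.add(current)
        results[start] = (current, hops, loop)
    return results
```

The pass condition after a fix: every entry resolves in one hop with no loops, re-verified by a fresh crawl.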

Adaptation Notes

Adapt the depth, tooling, and emphasis of the audit to the site type, primary market, and business goal supplied in the context. Priority and effort ratings should reflect that context; do not apply identical emphasis to every site.

Common Mistakes To Avoid

- Diagnosing from source HTML alone when rendered HTML could change the result
- Claiming Core Web Vitals problems without field data, or without labeling lab-only evidence
- Presenting hypotheses as confirmed findings
- Inventing crawl results, URL Inspection status, analytics status, log evidence, or performance data
- Producing a generic checklist instead of connecting observed signals to risk, impact, and fixes

Reusable Prompt Template

Copy and paste this prompt into Claude when you want a technical SEO audit.

Act as a technical SEO auditor using the technical-seo-auditor skill.

Audit objective:
- Determine the highest-impact technical SEO issues and the fixes most likely to improve crawlability, indexability, rankings, measurement, and conversions.

Business context:
- Site type: 
- Primary market: 
- Primary conversion or business goal: 
- Important page types or templates: 
- Known concern or incident: 

Evidence provided:
- URLs or representative pages: 
- Source HTML files or snippets: 
- Rendered HTML files or snippets: 
- Screaming Frog exports: 
- Google Search Console exports: 
- URL Inspection evidence: 
- robots.txt and sitemaps: 
- HAR files, screenshots, logs, or other evidence: 

Instructions:
1. Audit exactly these ten areas: analytics/tracking; rendering/JS SEO; crawlability/indexability; on-page technical; schema; mobile/viewport; performance signals; trust/quality; security/foundations; international/site architecture.
2. Every finding must cite the observed signal/source.
3. Separate confirmed findings from hypotheses and say what evidence would confirm each hypothesis.
4. Do not claim Core Web Vitals problems unless field data or reliable lab evidence is provided. If only lab data exists, label it as lab-only.
5. Do not diagnose from source HTML alone when rendered HTML could change the result. Compare source and rendered evidence when available.
6. Prioritize findings using Critical, High, Medium, or Low and effort S, M, or L.
7. Include validation methods using Screaming Frog, GSC URL Inspection, Rich Results Test, PageSpeed/Lighthouse, browser rendered HTML, GA4/GTM debug, and log files where relevant.
8. Adapt recommendations to the site type and business context.

Required output:
1. Executive summary.
2. Priority matrix.
3. Finding cards.
4. Fix-validation-tool table.
5. Quick wins under 1 hour.
6. 30-day action plan.
7. Evidence appendix.

If evidence is missing, continue with the audit but explicitly label evidence gaps. Do not invent crawl results, URL Inspection status, analytics status, log evidence, or performance data.

Template variables:

Fill in the blank fields under Business context and Evidence provided before sending the prompt. Leave any unknown field empty rather than guessing; the audit will label it as an evidence gap.