
What Does a Technical SEO Audit Actually Include?

Most businesses know they need an SEO audit but do not know what it should include. Here is exactly what I check in a technical SEO audit, why each element matters, and how I prioritize fixes.

Rustom Gutierrez

Senior SEO Specialist

5 April 2026 · 14 min read

A technical SEO audit reviews your website's crawlability, indexation, Core Web Vitals, site architecture, mobile usability, and backlink profile to identify issues preventing pages from ranking. A thorough audit takes 3-5 business days and forms the foundation for all subsequent SEO work.

Why Technical SEO Is the Foundation

Before investing in content creation, link building, or any other SEO activity, your website's technical foundation needs to be sound. If Google cannot crawl and index your pages efficiently, no amount of content or backlinks will help you rank.

I have completed over 40 technical SEO audits across e-commerce sites, SaaS platforms, service businesses, local businesses, and corporate websites. Every audit follows the same structured methodology, but the findings are always unique to the site. A 10-page service website has different technical challenges than a 50,000-page e-commerce store.

This guide explains exactly what I check, why each element matters, and how I prioritize which issues to fix first.

What I Check in Every Technical Audit

1. Crawlability: Can Google Find Your Pages?

Crawlability is the most fundamental technical SEO element. If Googlebot cannot access and navigate your website, nothing else matters.

Robots.txt analysis: I review your robots.txt file to ensure nothing important is accidentally blocked. Common issues I find include:

  • CSS or JavaScript files blocked from crawling (prevents Google from rendering pages correctly)
  • Entire directories blocked that contain important content
  • Missing sitemap reference in robots.txt
  • Conflicting directives that create ambiguous crawl instructions
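As a quick sanity check, Python's standard library can replicate how a crawler interprets robots.txt rules. This is an illustrative sketch with a made-up robots.txt file; it is not a substitute for testing your live file in Search Console:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents for illustration only.
robots_txt = """\
User-agent: Googlebot
Disallow: /assets/js/
Disallow: /private/

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Blocking /assets/js/ stops Googlebot from fetching the scripts it
# needs to render the page -- exactly the misconfiguration described above.
print(parser.can_fetch("Googlebot", "https://example.com/assets/js/app.js"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/services/"))         # True
```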

XML Sitemap validation: Your sitemap tells Google which pages exist and when they were last updated. I verify:

  • The sitemap exists and is submitted to Google Search Console
  • It includes all important pages (and excludes non-indexable pages like paginated archives or filtered URLs)
  • Last modified dates are accurate (not all set to today's date)
  • The sitemap is under the 50,000 URL / 50MB limit
  • There are no URLs in the sitemap that return errors
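Here is a minimal sketch of the kind of programmatic check I mean, run against a hypothetical two-URL sitemap (a real audit parses the live file and then verifies each URL's status code):

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# A tiny illustrative sitemap; the URLs and dates are made up.
sitemap_xml = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc><lastmod>2025-11-02</lastmod></url>
  <url><loc>https://example.com/services/</loc><lastmod>2025-11-02</lastmod></url>
</urlset>"""

root = ET.fromstring(sitemap_xml)
urls = [u.findtext("sm:loc", namespaces=NS) for u in root.findall("sm:url", NS)]
lastmods = {u.findtext("sm:loc", namespaces=NS): u.findtext("sm:lastmod", namespaces=NS)
            for u in root.findall("sm:url", NS)}

# Flag sitemaps over the protocol limit and suspicious lastmod values.
assert len(urls) <= 50_000, "sitemap exceeds the 50,000 URL limit"
if len(lastmods) > 1 and len(set(lastmods.values())) == 1:
    print("warning: every lastmod is identical -- dates may be auto-generated")
```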

Crawl errors: Using Google Search Console data, I identify:

  • Server errors (5xx) — pages that fail to load
  • Soft 404s — pages that return a 200 status but display error content
  • Redirect loops or chains — URLs that bounce through multiple redirects before reaching content
  • DNS resolution issues
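Redirect chains and loops are easy to spot programmatically once you have crawl data. This sketch assumes a simple dictionary of redirect source-to-target pairs exported from a crawler; the URLs are hypothetical:

```python
def trace_redirects(start, redirect_map, max_hops=10):
    """Follow a URL through a redirect map and classify the result.

    redirect_map maps a URL to its redirect target; a URL absent from
    the map is assumed to resolve directly with a 200.
    """
    chain = [start]
    seen = {start}
    while chain[-1] in redirect_map:
        nxt = redirect_map[chain[-1]]
        if nxt in seen:
            return chain + [nxt], "loop"
        chain.append(nxt)
        seen.add(nxt)
        if len(chain) > max_hops:
            return chain, "too many hops"
    # One redirect is fine; two or more hops before content is a chain.
    return chain, "ok" if len(chain) <= 2 else "chain"

# Hypothetical crawl export: old URL -> new URL.
redirects = {
    "http://example.com/old": "https://example.com/old",
    "https://example.com/old": "https://example.com/new",
}
print(trace_redirects("http://example.com/old", redirects))
```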

Internal link structure: I analyze how pages connect to each other:

  • Orphan pages — pages with no internal links pointing to them (Google may never discover these)
  • Click depth — are important pages reachable within 3 clicks from the homepage?
  • Broken internal links — links pointing to pages that no longer exist
  • Logical hierarchy — does the site structure make sense for both users and crawlers?
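Orphan detection and click depth both fall out of a simple graph traversal. A sketch, assuming a crawler has already given you each page's outgoing internal links (the four-page site below is invented):

```python
from collections import deque

def audit_link_graph(links, homepage):
    """links: {page: set of pages it links to}. Returns orphans and click depth."""
    all_pages = set(links) | {t for targets in links.values() for t in targets}
    linked_to = {t for targets in links.values() for t in targets}
    orphans = all_pages - linked_to - {homepage}

    # Breadth-first search from the homepage gives each page's click depth.
    depth = {homepage: 0}
    queue = deque([homepage])
    while queue:
        page = queue.popleft()
        for target in links.get(page, ()):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return orphans, depth

# Hypothetical site: /pricing exists but nothing links to it.
site = {
    "/": {"/services", "/about"},
    "/services": {"/services/seo"},
    "/pricing": set(),
}
orphans, depth = audit_link_graph(site, "/")
print(orphans)                  # the orphan page
print(depth["/services/seo"])   # its click depth from the homepage
```

Note that the orphan page never appears in the depth map at all: a crawler following links from the homepage would simply never reach it.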

2. Indexation: Are Your Pages in Google's Index?

A page that is crawlable is not necessarily indexed. Google may choose not to index a page for various reasons, and understanding why is critical.

Index coverage report: I review Google Search Console's coverage data to identify:

  • How many pages are indexed vs excluded
  • Why pages are excluded (noindex tag, canonical to another URL, crawled but not indexed, duplicate content)
  • Whether important pages are among the excluded ones

Noindex tags: I check every page for unintentional noindex meta tags. This is one of the most common issues I find — developers add noindex during staging and forget to remove it before launch.

Canonical tags: Canonical tags tell Google which version of a page is the "official" version. Misconfigurations I look for include:

  • Missing canonical tags (every page should have a self-referencing canonical)
  • Canonical tags pointing to the wrong URL
  • Canonical chains (page A canonicals to page B, which canonicals to page C)
  • HTTP/HTTPS or www/non-www canonical mismatches
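A canonical check boils down to parsing the head of each page. A minimal standard-library sketch (a real audit uses a crawler that does this at scale; the page markup here is illustrative):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect href values of <link rel="canonical"> tags."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and (a.get("rel") or "").lower() == "canonical":
            self.canonicals.append(a.get("href"))

def check_canonical(url, html):
    finder = CanonicalFinder()
    finder.feed(html)
    if not finder.canonicals:
        return "missing canonical"
    if len(finder.canonicals) > 1:
        return "multiple canonicals"
    return "self-referencing" if finder.canonicals[0] == url else "points elsewhere"

page = '<html><head><link rel="canonical" href="https://example.com/a"></head></html>'
print(check_canonical("https://example.com/a", page))  # self-referencing
```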

Duplicate content: I identify pages serving the same or substantially similar content, which can dilute your ranking signals across multiple URLs.

3. Core Web Vitals: How Fast and Stable Is Your Site?

Google uses three Core Web Vitals as ranking signals. I test these using PageSpeed Insights for lab data and Search Console's Core Web Vitals report for real-world data.

Largest Contentful Paint (LCP): Measures how fast the main content loads.

  • Good: under 2.5 seconds
  • Needs improvement: 2.5-4.0 seconds
  • Poor: over 4.0 seconds
  • Common causes of slow LCP: unoptimized images, render-blocking CSS/JS, slow server response time, large DOM size

Cumulative Layout Shift (CLS): Measures visual stability — whether content shifts around while loading.

  • Good: under 0.1
  • Common causes: images without explicit dimensions, ads or embeds loading late, dynamically injected content, web fonts causing text reflow

Interaction to Next Paint (INP): Measures responsiveness to user input.

  • Good: under 200 milliseconds
  • Common causes: heavy JavaScript execution, long tasks blocking the main thread, unoptimized event handlers
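All three metrics share the same good / needs improvement / poor banding, which makes them easy to classify in bulk across a crawl export. The "poor" boundaries below (0.25 for CLS, 500 ms for INP) come from Google's published thresholds rather than from the lists above:

```python
# Google's published Core Web Vitals thresholds: (good upper bound, poor lower bound).
THRESHOLDS = {
    "LCP": (2.5, 4.0),   # seconds
    "CLS": (0.1, 0.25),  # unitless layout-shift score
    "INP": (200, 500),   # milliseconds
}

def rate(metric, value):
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    return "needs improvement" if value <= poor else "poor"

print(rate("LCP", 2.1))   # good
print(rate("CLS", 0.31))  # poor
print(rate("INP", 350))   # needs improvement
```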

4. On-Page Elements

I scan every page for on-page SEO issues:

  • Missing or duplicate title tags across the site
  • Missing or duplicate meta descriptions
  • Broken heading hierarchy (missing H1, multiple H1s, skipped levels like H1 then H3)
  • Images without alt text (accessibility and SEO issue)
  • Pages with thin content (under 300 words with no unique value)
  • Keyword cannibalisation — multiple pages targeting the same keyword
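Several of these checks reduce to grouping crawl data. Duplicate title detection, for instance, is just an inversion of a URL-to-title map (the URLs and titles below are made up):

```python
from collections import defaultdict

def find_duplicate_titles(pages):
    """pages: {url: title}. Returns titles shared by more than one URL,
    plus any empty titles."""
    by_title = defaultdict(list)
    for url, title in pages.items():
        by_title[(title or "").strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1 or t == ""}

# Hypothetical crawl export: two pages accidentally share one title.
crawl = {
    "/services/seo": "SEO Services | Example Co",
    "/services/ppc": "SEO Services | Example Co",
    "/about": "About Us | Example Co",
}
print(find_duplicate_titles(crawl))
```

The same pattern (group, then flag groups larger than one) covers duplicate meta descriptions and is the starting point for spotting keyword cannibalisation.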

5. Site Architecture and URL Structure

  • URL structure: Are URLs clean, descriptive, and logically organised? Do they use hyphens (which Google treats as word separators) rather than underscores (which it does not)?
  • Breadcrumbs: Are they implemented with proper BreadcrumbList schema?
  • Navigation: Is the main navigation crawlable? (JavaScript-rendered menus can sometimes be invisible to search engines)
  • Pagination: Are paginated pages handled correctly with proper canonical and navigation signals?
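For reference, BreadcrumbList markup is usually emitted as JSON-LD. A small generator sketch, with hypothetical page names and URLs:

```python
import json

def breadcrumb_jsonld(trail):
    """trail: list of (name, url) pairs from the homepage to the current page."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(trail, start=1)
        ],
    }, indent=2)

print(breadcrumb_jsonld([
    ("Home", "https://example.com/"),
    ("Services", "https://example.com/services/"),
    ("Technical SEO", "https://example.com/services/technical-seo/"),
]))
```

The resulting JSON goes into a `<script type="application/ld+json">` tag in the page head.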

6. HTTPS and Security

  • Is the entire site served over HTTPS?
  • Are there mixed content warnings (HTTP resources loaded on HTTPS pages)?
  • Is the SSL certificate valid and not close to expiration?
  • Do HTTP URLs properly redirect to HTTPS?
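Mixed content is straightforward to detect by scanning each page's resource references for plain `http://` URLs. An illustrative sketch (a real audit runs this across every crawled page, not one hard-coded snippet):

```python
from html.parser import HTMLParser

class MixedContentScanner(HTMLParser):
    """Flag http:// resources referenced from a page served over HTTPS."""
    RESOURCE_ATTRS = {"src", "href"}

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in self.RESOURCE_ATTRS and value and value.startswith("http://"):
                # Anchor links to http pages are not mixed content;
                # loaded resources (scripts, images, styles) are.
                if tag != "a":
                    self.insecure.append((tag, value))

# Hypothetical page with one insecure script and one insecure image.
page = """<html><head>
<script src="http://example.com/app.js"></script>
<link rel="stylesheet" href="https://example.com/style.css">
</head><body><img src="http://example.com/logo.png"></body></html>"""

scanner = MixedContentScanner()
scanner.feed(page)
print(scanner.insecure)
```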

7. Mobile Usability

Google uses mobile-first indexing, meaning it crawls and indexes the mobile version of your site. I verify:

  • Full responsive design across devices
  • Tap targets large enough for mobile interaction (48px minimum)
  • Text readable without zooming
  • No horizontal scrolling on mobile
  • Viewport meta tag properly configured

8. Backlink Profile

Using SEMrush, I assess the overall health of your backlink profile:

  • Total backlinks and referring domains
  • Anchor text distribution — is it natural or over-optimized?
  • Toxic or spammy links that could trigger a manual penalty
  • Recent link gains and losses
  • Comparison against competitors' link profiles
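Anchor text distribution is a simple frequency count over the exported link data. A sketch with invented anchor texts:

```python
from collections import Counter

def anchor_distribution(anchors):
    """Percentage share of each anchor text across a backlink profile."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {text: round(100 * n / total, 1) for text, n in counts.most_common()}

# Hypothetical anchor texts pulled from a backlink export.
anchors = ["Example Co", "example.com", "best seo agency",
           "best seo agency", "best seo agency", "https://example.com/"]
dist = anchor_distribution(anchors)
print(dist)
# A single exact-match commercial anchor dominating the profile
# (here, half of all links) is a classic over-optimization signal;
# natural profiles are dominated by branded and naked-URL anchors.
```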


How I Prioritize Fixes

Not all issues are equally important. After completing the audit, I categorise every finding into three priority levels:

  • Critical (fix immediately): Issues preventing pages from being crawled or indexed — broken canonicals, accidental noindex tags, server errors, blocked resources. These directly prevent your content from appearing in search results.
  • High (fix this month): Issues impacting ranking potential — slow page speed, missing title tags, thin content, mobile usability problems, duplicate content. These reduce your ability to compete for rankings.
  • Medium (fix when possible): Issues that affect user experience or are best-practice violations but have smaller direct ranking impact — missing alt text, minor redirect chains, suboptimal URL structures, missing breadcrumb schema.

What the Deliverable Looks Like

My audit deliverable is a structured Google Doc — not a raw data dump from a crawling tool. Each finding includes:

  • The specific issue and which URLs are affected
  • Why it matters for SEO performance
  • The exact steps to fix it (with code snippets or CMS instructions where relevant)
  • The priority level (Critical / High / Medium)

For my full-service clients, I do not just deliver the audit — I implement the fixes. The audit becomes the action plan for the first month of work, and each subsequent monthly report tracks which issues have been resolved and what impact the fixes have had.

Frequently Asked Questions

How much does a technical SEO audit cost?

A professional technical SEO audit typically costs between $500 and $2,000 depending on site size and complexity. My Starter package at $900/month includes a full technical audit plus ongoing implementation of fixes.

How long does a technical SEO audit take?

A thorough audit takes 3-5 business days for a standard website (under 500 pages). Larger sites with thousands of pages may take 7-10 days. AI-powered tools significantly speed up the data collection phase.

How often should you run an SEO audit?

A full technical audit should be run at the start of any SEO engagement, then quarterly. Monthly monitoring of key metrics (crawl errors, Core Web Vitals, indexation) should happen continuously between full audits.
