
The Complete Technical SEO Audit Checklist

A step-by-step technical SEO audit checklist covering crawlability, indexation, Core Web Vitals, structured data, and on-page fundamentals — with actionable fixes for each issue.

By Sam Butcher
February 17, 2026
16 min read

Technical SEO problems are silent revenue killers. A site can have excellent content and strong backlinks yet rank poorly — or not at all — because Google cannot properly crawl it, because pages load too slowly, or because critical pages are accidentally excluded from the index. This checklist walks you through every area of technical SEO, from the foundational to the fine-grained, with concrete guidance on what to look for and what to do about it.

Run this audit at least once a quarter. For sites undergoing active development, run it after every significant deployment.

Across the client sites we audit at RnkRocket, the single most common finding — present in approximately 40% of sites we review — is a noindex tag or robots.txt block on at least one commercially important page, set during development and never cleared. The second most common is a hero image with no width and height attributes causing layout shift that fails Core Web Vitals. Both are trivially easy to fix once found; the problem is they are invisible without a systematic audit.


Before You Start: The Tools You Need

You do not need expensive software to complete most of this audit. Here is what covers the essentials:

  • Google Search Console (free): Indexation data, crawl errors, Core Web Vitals, mobile usability, structured data validation
  • Google PageSpeed Insights (free): Per-URL performance analysis and Core Web Vitals
  • Screaming Frog SEO Spider (free up to 500 URLs): Crawl simulation, broken links, duplicate content, missing tags
  • Chrome DevTools (free, built into Chrome): Network analysis, render-blocking resources, JavaScript execution
  • RnkRocket Site Audit: Automated crawl with prioritised issue reports — useful if you want the audit done for you rather than manually. See how it compares to enterprise platforms in our RnkRocket vs Semrush and RnkRocket vs Ahrefs comparisons.

For the free tools, set everything up in a browser tab before you start and work through this checklist systematically.


Section 1: Crawlability

Crawlability is Google's ability to access and read your pages. Problems here prevent indexation entirely — no crawl, no ranking.

1.1 Robots.txt

Your robots.txt file (found at yourdomain.com/robots.txt) instructs crawlers which pages they may and may not access. Common problems:

  • Entire site blocked: Disallow: / means Googlebot cannot crawl anything. This is occasionally set accidentally during development and left in production. Check immediately.
  • Critical pages blocked: Admin panels should be blocked, but CMS paths like /wp-admin/ are sometimes written too broadly, accidentally blocking product or category pages.
  • Disallowing CSS and JavaScript files: This prevents Google from rendering your pages fully, which affects how it understands your content. Google's robots.txt documentation specifically warns against blocking CSS/JS in robots.txt.

Use Google Search Console's URL Inspection tool to check whether specific pages are blocked. GSC's robots.txt report (under Settings) shows the robots.txt files Google has fetched for your site, when they were last crawled, and any parsing errors; the standalone Robots.txt Tester tool was retired in 2023.
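As a reference point, here is a minimal sketch of a safe WordPress-style robots.txt. The domain and paths are illustrative, not a template to copy blindly; the critical check is that no Disallow rule matches a commercially important path or your theme's CSS/JS directories.

```
# Allow everything except the admin area; keep the AJAX
# endpoint open because front-end features depend on it
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

# Point crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```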

During a recent technical audit of a Shopify store in Birmingham, we discovered their robots.txt was blocking the entire /collections/ directory — effectively hiding 80% of their product pages from Google. The fix took 30 seconds; the impact was a 156% increase in indexed pages within two weeks. This is why crawlability is always the first section of any SEO audit.

1.2 XML Sitemaps

Your XML sitemap tells Google which pages exist on your site and — optionally — when they were last updated.

Check list:

  • Does a sitemap exist? (yourdomain.com/sitemap.xml or yourdomain.com/sitemap_index.xml)
  • Is the sitemap submitted to Google Search Console?
  • Does the sitemap contain all pages you want indexed?
  • Does it contain pages you don't want indexed (404 pages, noindex pages, paginated duplicates)?
  • Is the sitemap under the 50,000 URL or 50MB limit? (Split into multiple sitemaps if so)
  • Are lastmod dates accurate and genuinely reflecting content updates?

Sitemaps with inflated lastmod dates (where every page shows today's date regardless of whether it was updated) train Google to distrust them. Only update lastmod when content actually changes.
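For reference, a valid single-file sitemap looks like this (the URL and date are illustrative). Every <loc> entry should be an indexable, canonical, 200-status HTTPS URL:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/services/boiler-repair/</loc>
    <!-- Only set lastmod when the content genuinely changed -->
    <lastmod>2026-01-15</lastmod>
  </url>
</urlset>
```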

1.3 Crawl Budget

Crawl budget is the number of pages Google will crawl on your site in a given period. For small sites (under 1,000 pages), crawl budget is rarely a constraint. For larger sites — particularly e-commerce with many product variants, filtering parameters, or pagination — wasted crawl budget can mean important pages are rarely crawled.

Signs of crawl budget issues:

  • New pages taking weeks or months to appear in the index
  • Thin, duplicate parameter URLs appearing in GSC's Coverage report

Solutions:

  • Use canonical tags to consolidate duplicate URLs
  • Add ?parameter URLs to robots.txt Disallow if they do not serve unique content
  • Ensure internal links prioritise your most important pages

Section 2: Indexation

A page being crawlable does not mean it will be indexed. Indexation issues are one of the most commonly missed technical SEO problems.

2.1 Google Search Console Page Indexing Report

The Page indexing report in GSC (formerly the Coverage report) is the most direct view of your indexation status. It splits your pages into two groups:

Indexed: Your indexed pages. Check that the count looks right relative to your total page count.

Not indexed: Pages excluded from the index, grouped by reason. Fix server errors (5xx), redirect errors, and robots.txt blocks first; they represent pages Google tried to crawl but could not. Key reasons to investigate:

  • Crawled — currently not indexed: Google has seen the page but decided not to include it. Often indicates thin content or perceived low quality.
  • Discovered — currently not indexed: Google knows the page exists but has not yet crawled it. May indicate crawl budget issues or poor internal linking.
  • Duplicate, Google chose different canonical: Google is choosing a different URL as the canonical than you intended.
  • Excluded by noindex tag: Confirm these exclusions are deliberate.

2.2 Noindex Tags

Check that noindex meta tags are only on pages you intend to exclude from the index. Use Screaming Frog's "Directives" column, or filter for "noindex" in the response headers, to find every noindexed page.

Common accidental noindex scenarios:

  • A development staging site migrated to production with all pages noindexed
  • A CMS setting ("discourage search engines from indexing this site") left enabled after launch
  • Category archive pages or tag pages set to noindex when they should be indexed
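Noindex can be set in two places, the HTML head or the HTTP response headers, so check both. The standard in-page form is:

```html
<!-- In the page's <head>: excludes this page from the index -->
<meta name="robots" content="noindex">
```

The equivalent HTTP header, often used for PDFs and other non-HTML files, is X-Robots-Tag: noindex; a crawler such as Screaming Frog reports both forms.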

2.3 Canonical Tags

Canonical tags tell Google which version of a page is the "original" — they are your way of managing duplicate content without deleting pages.

Check for:

  • Missing canonicals: Every indexable page should have a self-referencing canonical tag. Without one, Google may consolidate ranking signals onto a URL you did not intend.
  • Canonical pointing to wrong URL: A page whose canonical points to a 404, a noindex page, or a redirect chain is worse than one with no canonical at all.
  • Relative vs. absolute URLs: Canonicals should use absolute URLs (with the full https://www.yourdomain.com path).
  • www vs. non-www inconsistency: If your site has both www and non-www versions, canonicals should consistently point to one. Set up a 301 redirect from the non-preferred version.
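Putting those rules together, a correct self-referencing canonical on the page at https://www.example.com/services/boiler-repair/ (an illustrative URL) looks like this: absolute, HTTPS, and matching the indexed URL exactly.

```html
<link rel="canonical" href="https://www.example.com/services/boiler-repair/">
```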

Section 3: Site Speed and Core Web Vitals

Since Google's Page Experience update (June 2021), Core Web Vitals are confirmed ranking signals. They measure real-world user experience across three dimensions.

3.1 The Three Core Web Vitals

Largest Contentful Paint (LCP): How long until the largest visible element (usually a hero image or a block of text) loads. Target: under 2.5 seconds. Poor: over 4 seconds.

Common causes of slow LCP:

  • Unoptimised images (large file size, wrong format, or lazy loading mistakenly applied to above-the-fold images)
  • Render-blocking CSS or JavaScript
  • Slow server response times (TTFB above 800ms)

Interaction to Next Paint (INP): How quickly the page responds to user interactions like clicks and taps. Replaced First Input Delay (FID) as a Core Web Vital in March 2024. Target: under 200ms. Poor: over 500ms.

Common causes of poor INP:

  • Heavy JavaScript execution on the main thread
  • Third-party scripts (chat widgets, analytics, advertising tags) competing for processing time
  • Long tasks blocking the browser's ability to respond to input
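One low-effort mitigation, assuming the third-party script tolerates it (most analytics and chat widgets do), is to load it with defer so it parses off the critical path and executes only after the document is ready. The widget URL below is illustrative:

```html
<!-- Without defer, this script blocks parsing and competes
     with input handling while it downloads and executes -->
<script src="https://widget.example.com/chat.js" defer></script>
```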

Cumulative Layout Shift (CLS): How much the visible page content unexpectedly shifts during loading. Target: under 0.1. Poor: over 0.25.

Common causes of high CLS:

  • Images without explicit width and height attributes (browser does not reserve space)
  • Ads, embeds, or iframes loading without size reservations
  • Web fonts causing text to reflow when they load (FOIT/FOUT)
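The width/height fix from the first bullet is one attribute pair per image; the browser uses the ratio to reserve space before the file arrives. File names and dimensions here are illustrative:

```html
<!-- The browser reserves a 1200x800 box immediately, so
     nothing shifts when the image finishes loading -->
<img src="/images/team-photo.webp" width="1200" height="800"
     alt="The plumbing team outside the Leeds office">
```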

An e-commerce client in Edinburgh had an LCP of 6.8 seconds due to unoptimised hero images. After converting to WebP format and lazy-loading below-the-fold images, their LCP dropped to 1.9 seconds and their bounce rate decreased by 22%. Core Web Vitals improvements like this are among the highest-return items on any technical SEO audit.

3.2 Finding Your CWV Status

  • Google Search Console → Core Web Vitals report: Shows which URLs pass or fail, grouped by "Good," "Needs Improvement," and "Poor." This uses real user data (field data) collected by Chrome.
  • PageSpeed Insights: Provides both field data (from the Chrome User Experience Report, CrUX) and lab data (simulated). URL-level analysis.
  • Chrome DevTools → Lighthouse tab: Lab-based analysis you can run yourself for any URL.

3.3 HTTPS and Security

HTTPS has been a confirmed Google ranking signal since 2014. Every page should load over HTTPS with no mixed content warnings (HTTP resources loaded on HTTPS pages).

Check:

  • Every HTTP URL 301-redirects to its HTTPS equivalent
  • The TLS certificate is valid and covers both the www and non-www hostnames
  • No mixed content (HTTP images, scripts, or stylesheets on HTTPS pages)
  • Internal links, canonical tags, and sitemap URLs all use HTTPS

Section 4: Mobile Friendliness

Google now uses mobile-first indexing across the board (the rollout began with new sites in 2019 and completed in 2023), meaning it uses the mobile version of your pages as the primary version for ranking.

4.1 Mobile Usability Report

Google Search Console's dedicated Mobile Usability report was retired in late 2023, but the same checks survive in Lighthouse's mobile audit (Chrome DevTools → Lighthouse). Issues to look for include:

  • Clickable elements too close together
  • Text too small to read
  • Content wider than the screen
  • Viewport not configured

Resolve every issue listed. These are not cosmetic — they directly affect how Google evaluates your pages.

4.2 Manual Mobile Check

Open Chrome on your phone and visit your most important pages. Check:

  • Does the navigation work without needing to zoom?
  • Are contact forms easy to complete on a mobile keyboard?
  • Do phone numbers link to tel: URLs (so tapping them initiates a call)?
  • Are call-to-action buttons large enough to tap comfortably?
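The tel: link from the checklist above is a one-line fix (the number shown is illustrative):

```html
<a href="tel:+441130000000">0113 000 0000</a>
```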

Section 5: URL Structure and Redirects

5.1 URL Best Practices

Well-structured URLs are easier for both users and search engines to understand.

  • Use hyphens, not underscores: Google treats hyphens as word separators; underscores are treated as word joiners. "seo-audit" is cleaner than "seo_audit."
  • Keep URLs short and descriptive: /services/plumbing/boiler-repair/ is better than /page?id=1234&cat=23.
  • Use lowercase throughout: Mixed-case URLs can create duplicate content problems on case-sensitive servers.
  • Avoid unnecessary parameters: If filtering or sorting creates unique URLs (e.g., /products/?sort=price-asc), either canonicalise them or block them in robots.txt.

5.2 Redirect Audit

Redirects are necessary when you change URLs, but they add latency and can create problems at scale.

Check for:

  • Redirect chains: A → B → C should be resolved to A → C. Chains add loading time and dilute link equity with each hop.
  • Redirect loops: A → B → A. Screaming Frog detects these as 3xx errors that never resolve.
  • 301 vs. 302: 302 (temporary redirect) does not pass full link equity. Use 301 (permanent) for any URL change that is not genuinely temporary.
  • 404 pages with inbound links: If external sites link to a URL that now returns 404, you are losing link equity. Identify these in GSC's Coverage report (filtered to "Not Found") and 301 redirect them to the closest relevant page.
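On an Apache server, both fixes above are one-line rules in .htaccess (paths are illustrative). The key habit: point the old URL straight at the final destination, never at an intermediate hop.

```apache
# Old blog URL that now 404s, pointed at the closest relevant page
Redirect 301 /blog/old-boiler-guide/ /services/boiler-repair/

# Collapse the chain /a/ -> /b/ -> /c/ into a single hop
Redirect 301 /a/ /c/
```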

Section 6: On-Page Technical Elements

6.1 Title Tags

  • Every page should have a unique title tag
  • Length: 50–60 characters (longer titles truncate in search results)
  • Primary keyword near the start
  • Brand name at the end, separated by a pipe: "Boiler Repair in Leeds | [Brand]"

Use Screaming Frog to export all title tags and check for: missing titles, duplicates, tags over 60 characters, or tags under 30 characters (too short to be descriptive).

6.2 Meta Descriptions

  • Every page should have a unique meta description
  • Length: 150–160 characters
  • Should include the primary keyword (Google often bolds matching terms in snippets)
  • Should describe the page's benefit to the reader — it is an ad for your result

Note: Google rewrites meta descriptions approximately 71% of the time according to a 2020 Portent study. A well-written description still influences click-through rates when Google does use it.

6.3 Heading Structure

  • One H1 per page, containing the primary keyword
  • H2 headings for main sections; H3 for subsections within those
  • Do not skip heading levels (H1 → H3 with no H2 in between)
  • Headings should reflect the page's content hierarchy, not just keyword opportunities

6.4 Image Optimisation

  • All images should have descriptive alt text (not keyword-stuffed; describe what is in the image)
  • Image file sizes should be as small as possible without visible quality loss (use WebP format where supported)
  • Large images should use the loading="lazy" attribute when placed below the fold
  • Hero images that affect LCP should not be lazy-loaded — they should be preloaded
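Combining the last two bullets: preload the hero so the browser fetches it as early as possible, and lazy-load only what sits below the fold. File names here are illustrative:

```html
<!-- In <head>: fetch the LCP image as early as possible -->
<link rel="preload" as="image" href="/images/hero.webp">

<!-- The hero itself: never lazy-load this one -->
<img src="/images/hero.webp" width="1600" height="700"
     alt="Engineer servicing a boiler">

<!-- Below the fold: defer until the user scrolls near it -->
<img src="/images/gallery-1.webp" width="800" height="600"
     alt="Completed bathroom installation" loading="lazy">
```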

6.5 Structured Data (Schema Markup)

Structured data helps Google understand your content and can generate rich results in search (star ratings, FAQ dropdowns, event information, product price/availability).

For small businesses, prioritise:

  • LocalBusiness schema on location and contact pages
  • Organization schema on your homepage
  • Article or BlogPosting schema on blog content
  • Product schema on product pages (if e-commerce)
  • FAQPage schema on FAQ content
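A minimal LocalBusiness block in JSON-LD (the format Google recommends) looks like this; every value shown is illustrative:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Ltd",
  "url": "https://www.example.com/",
  "telephone": "+44 113 000 0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Leeds",
    "postalCode": "LS1 1AA",
    "addressCountry": "GB"
  }
}
</script>
```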

Validate your markup using Google's Rich Results Test and monitor the Enhancements section in Google Search Console for errors.


Section 7: Internal Linking

Internal links distribute authority (PageRank) across your site and help Google understand which pages are most important.

7.1 Orphan Pages

An orphan page has no internal links pointing to it. Google may never crawl or rank these pages even if they are in your sitemap.

Use Screaming Frog with your sitemap to identify which pages are linked from at least one other page. Any page with zero internal links needs to be incorporated into your site's linking structure.
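If you prefer to script the comparison, the check is a set difference: sitemap URLs minus link destinations. A sketch, assuming a standard XML sitemap and a Screaming Frog "All Inlinks" CSV export with a Destination column (the file names and column name are assumptions about your export):

```python
import csv
import xml.etree.ElementTree as ET

SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_path):
    """All <loc> URLs listed in the XML sitemap."""
    root = ET.parse(sitemap_path).getroot()
    return {loc.text.strip() for loc in root.findall("sm:url/sm:loc", SITEMAP_NS)}

def linked_urls(inlinks_csv_path):
    """Every URL that at least one crawled page links to."""
    with open(inlinks_csv_path, newline="", encoding="utf-8") as f:
        return {row["Destination"] for row in csv.DictReader(f)}

def orphan_pages(sitemap_path, inlinks_csv_path):
    """Sitemap URLs with zero internal links pointing at them."""
    return sorted(sitemap_urls(sitemap_path) - linked_urls(inlinks_csv_path))
```

Any URL this returns is in your sitemap but unreachable by following links, which is exactly the orphan definition above.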

7.2 Authority Distribution

Your homepage typically has the most authority (external sites link to it most often). Internal links from your homepage pass authority to the pages they link to. Pages deeper in the navigation hierarchy receive progressively less.

Ensure your most commercially important pages — core service pages, key product categories — are:

  • Reachable within two to three clicks from the homepage
  • Linked to from relevant blog and resource content
  • Linked to from related service pages

For a detailed walkthrough of internal linking strategy, read our internal linking guide on the blog.

7.3 Anchor Text

The clickable text of a link (anchor text) is a relevance signal. Exact-match anchor text — linking to your "Boiler Repair" page with the anchor "boiler repair" — is helpful, but should not be the only pattern. Use varied, natural anchor text across your internal links.


Section 8: International and Hreflang (If Applicable)

If your site serves multiple countries or languages, hreflang attributes tell Google which version of a page is intended for which audience.

This is only relevant if you have translated content or content targeting different countries. If you serve only the UK in English, skip this section.

For multi-country UK businesses (e.g., serving Ireland or Australia as well), ensure:

  • hreflang tags are reciprocal (the UK page points to the Australian page and vice versa)
  • The x-default version is set for users not matching any specific country
  • All URLs in hreflang attributes are canonical, absolute, and use HTTPS
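Each page in the cluster carries the full set of alternates, including itself. For a UK/Australia pair, the head of both pages would contain (URLs illustrative):

```html
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/" />
<link rel="alternate" hreflang="en-au" href="https://www.example.com/au/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```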

Section 9: Structure Your Audit Findings

After working through this checklist, prioritise your findings using an impact/effort matrix:

Fix immediately (high impact, low effort):

  • Robots.txt blocking critical pages
  • Entire site on HTTP without HTTPS redirect
  • Missing XML sitemap or sitemap not submitted to GSC
  • Pages accidentally set to noindex

Fix this sprint (high impact, higher effort):

  • Core Web Vitals failures on key pages
  • Redirect chains and loops
  • Missing or duplicate title tags across the site
  • Orphan pages for important content

Scheduled improvements (medium impact):

  • Structured data implementation
  • Image optimisation
  • URL structure improvements (be cautious — URL changes require redirects)
  • Comprehensive anchor text review

Summary and Next Steps

A technical SEO audit is not a one-time project — it is an ongoing maintenance practice. Sites change, code gets updated, third-party scripts get added, and new issues emerge. Build this checklist into your quarterly routine.

The most commonly missed issues in our experience working with small business sites:

  1. Accidental noindex settings (often set during development and forgotten)
  2. Slow LCP caused by unoptimised hero images
  3. Orphan pages for important service or blog content
  4. Redirect chains from multiple rounds of URL restructuring
  5. XML sitemaps containing 404 or noindex URLs

Audit Pass/Fail Summary Table

Use this as a quick-reference scorecard after completing each section:

Area | Pass Criteria | Common Fail
Robots.txt | No critical pages blocked; CSS/JS accessible | Disallow: / left from dev
XML Sitemap | Submitted to GSC; contains only indexable pages | 404s or noindex pages in sitemap
HTTPS | All pages load over HTTPS; no mixed content | HTTP still active with no redirect
Core Web Vitals | LCP < 2.5s, CLS < 0.1, INP < 200ms | Uncompressed hero images; no image dimensions
Mobile Usability | No mobile errors; navigation usable on phone | Tap targets too close; unscaled viewport
Title Tags | Unique, 50–60 chars, keyword near start | Duplicate titles; "Home" or "Page" titles
Meta Descriptions | Unique, 150–160 chars | Missing or duplicated across site
Canonical Tags | Self-referencing canonical on every indexable page | Missing; pointing to 404 or noindex URL
Internal Links | No orphan pages; key pages within 3 clicks of homepage | Service pages with zero internal links
Structured Data | Valid markup; no GSC Enhancement errors | Invalid JSON-LD; deprecated schema types

RnkRocket's automated site audit runs these checks continuously and alerts you when new issues appear — no manual crawls required. Compare it to running this checklist yourself every week, automatically.

Key Takeaways

A technical SEO audit is a systematic process — not a one-time fix — that should run quarterly for any small business website. The highest-impact checks are crawlability (robots.txt, XML sitemap), indexation (noindex tags, canonical configuration), Core Web Vitals performance (LCP, CLS, INP), and internal linking structure. In our audits at RnkRocket, we find that fixing the top three technical issues on a site typically produces a 15–40% improvement in organic traffic within 60–90 days, without any new content or link-building activity.

About technical SEO auditing for small business websites: Technical SEO forms the foundation on which all other search optimisation work rests. A page cannot rank if Google cannot crawl it; a page will not hold rankings if it loads in over 4 seconds on mobile. BrightEdge research indicates that organic search drives 53% of all website traffic, meaning technical failures silently suppress more than half of a business's potential digital visibility. Core Web Vitals (LCP, CLS, and INP), made ranking signals by Google in 2021 and updated in March 2024 when INP replaced FID, are measurable via Google Search Console at no cost. For most small business sites running on WordPress or Shopify, the top five technical fixes — image compression, render-blocking script removal, canonical tag implementation, redirect chain resolution, and sitemap cleanup — can be completed in a single working day using the free tools listed in this guide.

Related Guides

  • The Schema Markup Handbook for Small Businesses (RnkRocket Team, April 18, 2026; 26 min read) — a practical, jargon-free guide to implementing structured data for small businesses, covering JSON-LD, essential schema types, step-by-step implementation, and testing your markup.
  • Accessibility and SEO Compliance: A Practical Guide (Sam Butcher, April 4, 2026; 20 min read) — web accessibility and SEO share more common ground than most businesses realise; this guide shows how meeting WCAG standards also improves your search rankings.
  • Understanding Google Search Console: The Complete Guide (Sam Butcher, March 7, 2026; 14 min read) — a practical walkthrough of Google Search Console for small business owners, from verification to fixing crawl errors and improving click-through rates.