SEO Audit Checklist: What to Fix, What to Ignore
An SEO audit is a structured review of every factor that affects how well your site ranks and how much organic traffic it earns. Done properly, it surfaces the specific technical, content, and authority problems holding your site back, so you can fix what matters instead of guessing.
Most audits fail not because the tools are wrong, but because the person running them doesn’t know which findings to act on. This checklist cuts through that. It covers the eight areas that consistently move the needle, with a clear view of what to prioritise and what to deprioritise when your time and budget are finite.
Key Takeaways
- An SEO audit is only useful if it produces a prioritised action list, not a raw report of 400 issues with no commercial weighting.
- Technical problems block ranking gains more often than content quality does. Fix crawlability and indexation before anything else.
- Thin content and keyword cannibalisation are the two most common issues on sites that have been publishing for more than three years.
- Backlink audits are about removing toxic risk and identifying gaps, not chasing raw link volume.
- Most SEO tools surface the same data differently. The audit framework matters more than the platform you run it on.
In This Article
- Why Most SEO Audits Produce Reports, Not Results
- Section 1: Crawlability and Indexation
- Section 2: Technical SEO Foundations
- Section 3: On-Page SEO Audit
- Section 4: Content Quality Audit
- Section 5: Backlink Profile Audit
- Section 6: Local SEO Audit (If Applicable)
- Section 7: User Experience and Engagement Signals
- Section 8: Measurement and Reporting Setup
- How to Prioritise Your Audit Findings
- What a Good Audit Output Looks Like
Why Most SEO Audits Produce Reports, Not Results
I’ve sat in enough agency review meetings to know what a bad audit looks like. It’s a 60-slide deck with a traffic-light system, hundreds of flagged issues, and a priority column that labels everything as “high.” The client leaves with a document they can’t action because nobody has told them what to fix first or why it matters commercially.
The problem is structural. SEO tools are designed to find issues, not to rank them by business impact. Screaming Frog will happily tell you that 340 of your pages have meta descriptions over 160 characters. That’s technically true. It’s also almost entirely irrelevant if your site has 12 pages blocked from Googlebot and a core section of your content returning 404s.
A good audit is a triage exercise. You’re looking for the smallest number of fixes that will produce the largest improvement in organic visibility and traffic. Everything else goes on a backlog or gets ignored entirely.
This checklist is structured around that principle. Work through each section in order. The sections at the top have the highest potential to unblock ranking gains quickly. The sections toward the end are important but rarely the reason a site is underperforming.
If you’re building a broader SEO strategy rather than just running a one-off audit, the Complete SEO Strategy hub covers the full picture, from keyword research and content planning through to link building and measurement.
Section 1: Crawlability and Indexation
Before Google can rank your content, it needs to find it, crawl it, and decide it’s worth indexing. Failures at this stage mean everything else you do is wasted effort. This is where every audit should start.
Robots.txt review
Open your robots.txt file at yourdomain.com/robots.txt. Check that you’re not accidentally blocking Googlebot from crawling important sections of your site. This happens more often than you’d think, particularly after site migrations or CMS changes. A single misplaced disallow rule can take entire subdirectories out of the index.
Check that your sitemap is referenced in the file. It should be. If it isn’t, add it.
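If you'd rather check this programmatically than eyeball the file, here's a minimal sketch using Python's built-in urllib.robotparser. The domain and the list of important URLs are placeholders; swap in your own key pages.

```python
from urllib.robotparser import RobotFileParser

# Placeholder domain and paths; swap in your own key URLs.
ROBOTS_URL = "https://www.example.com/robots.txt"
IMPORTANT_URLS = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
    "https://www.example.com/products/widget-a/",
]

parser = RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # fetches and parses the live robots.txt

for url in IMPORTANT_URLS:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{'OK' if allowed else 'BLOCKED'}: {url}")

# site_maps() returns the Sitemap: lines declared in robots.txt, or None if there are none
print("Sitemaps declared:", parser.site_maps())
```

If any of your important URLs print as BLOCKED, or the sitemap line comes back as None, that's a finding worth fixing before anything else in this checklist.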
XML sitemap audit
Your sitemap should contain only the pages you want indexed. Audit it for: pages returning non-200 status codes, noindexed pages that have been included, and pages that are canonicalised away to a different URL. All three are common and all three send mixed signals to Google.
Submit your sitemap in Google Search Console if you haven’t already. Check the coverage report to see which submitted URLs have been indexed and which haven’t. The gap between submitted and indexed is often where the real problems are hiding.
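A rough sketch of the status-code check in Python is below. It assumes the third-party requests library is installed, the sitemap URL is a placeholder, and the file is a flat urlset rather than a sitemap index (an index would need one extra loop over the child sitemaps).

```python
import xml.etree.ElementTree as ET

import requests  # third-party: pip install requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

resp = requests.get(SITEMAP_URL, timeout=10)
resp.raise_for_status()
urls = [loc.text.strip() for loc in ET.fromstring(resp.content).findall(".//sm:loc", NS)]

for url in urls:
    r = requests.head(url, allow_redirects=False, timeout=10)
    if r.status_code != 200:
        # Anything non-200 (redirects included) has no business being in the sitemap.
        print(f"{r.status_code}: {url}")
```

Some servers mishandle HEAD requests, so if you see unexpected 405s, switch the check to a GET.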
Index coverage report
In Google Search Console, go to Pages (previously Coverage). Review the “Not indexed” reasons carefully. “Crawled, currently not indexed” is the one that should concern you most. It means Google has seen the page and decided not to include it, which is usually a signal of thin content, low quality, or a page that doesn’t add enough to the index to justify inclusion.
For a detailed technical walkthrough of crawl issues specifically related to JavaScript rendering, the Moz guide to auditing JavaScript SEO is worth reading if your site relies heavily on client-side rendering.
Crawl budget considerations
Crawl budget matters most for large sites, typically those with more than 10,000 URLs. If you’re running an e-commerce site with faceted navigation or a news site with a large archive, Googlebot may not be crawling your most important pages frequently enough. Use your server logs to see which pages Google is actually visiting and how often. Most marketing teams never look at server logs. They should.
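If you've never looked at your logs, a first pass can be as simple as the sketch below. It assumes a standard combined log format and filters on the Googlebot user-agent string; for a rigorous audit you'd also verify hits by reverse DNS, since the user-agent is easily spoofed.

```python
import re
from collections import Counter

LOG_PATH = "access.log"  # placeholder path to your server log
request_re = re.compile(r'"(?:GET|HEAD) (\S+) HTTP')

hits = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as f:
    for line in f:
        if "Googlebot" not in line:
            continue
        match = request_re.search(line)
        if match:
            hits[match.group(1)] += 1

# The 20 paths Googlebot is actually spending its crawl budget on.
for path, count in hits.most_common(20):
    print(f"{count:>6}  {path}")
```

If your most commercially important pages aren't near the top of that list, you have a crawl budget problem worth investigating.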
Section 2: Technical SEO Foundations
Technical SEO covers the infrastructure your site runs on. These aren’t ranking factors in the direct sense, but they’re the conditions under which ranking factors operate. Get them wrong and you’re fighting with one hand tied behind your back.
Site speed and Core Web Vitals
Run your key pages through Google PageSpeed Insights and review your Core Web Vitals data in Search Console. Focus on Largest Contentful Paint, Interaction to Next Paint, and Cumulative Layout Shift. These are the three metrics Google uses in its page experience signals.
Don’t chase a perfect score. Chase a score that’s competitive with the pages currently ranking above you for your target terms. I’ve seen clients spend months on performance optimisation to go from 72 to 89 on PageSpeed Insights while their competitors are ranking comfortably at 65. The relative score matters more than the absolute one.
HTTPS and security
Confirm the entire site is served over HTTPS. Check for mixed content warnings, where secure pages load insecure resources. These are common after migrations and can cause browser warnings that damage trust and conversion rates, not just SEO.
Mobile usability
Google indexes the mobile version of your site first. Search Console no longer has a standalone mobile usability report (it was retired in late 2023), so test your key templates with Lighthouse or Chrome DevTools device emulation and fix anything flagged. Common problems include text that’s too small to read, clickable elements too close together, and content wider than the screen.
URL structure and parameter handling
URLs should be clean, descriptive, and consistent. Check for: session IDs being appended to URLs, tracking parameters being indexed, and inconsistent trailing slash usage. Any of these can create duplicate content issues or waste crawl budget on URLs that add no value.
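As a rough illustration of the kind of normalisation rule you want your platform (or your audit script) to enforce, here's a sketch using Python's standard urllib.parse. The tracking parameter list and the trailing-slash convention are assumptions; match them to your own stack.

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

# Illustrative list of parameters to strip; extend it for your own analytics stack.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "utm_term",
                   "utm_content", "gclid", "fbclid", "sessionid"}

def normalise(url: str) -> str:
    """Strip tracking parameters and enforce a trailing slash on extensionless paths."""
    parts = urlparse(url)
    query = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
             if k.lower() not in TRACKING_PARAMS]
    path = parts.path
    if not path.endswith("/") and "." not in path.rsplit("/", 1)[-1]:
        path += "/"
    return urlunparse(parts._replace(path=path, query=urlencode(query)))

print(normalise("https://www.example.com/shoes?utm_source=news&colour=red&sessionid=abc"))
# -> https://www.example.com/shoes/?colour=red
```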
The Semrush technical SEO audit guide covers parameter handling in more depth if you’re working with a complex URL structure or an e-commerce platform with dynamic filtering.
Redirect audit
Map all your redirects. Look for redirect chains (A redirects to B redirects to C), redirect loops, and 302 redirects that should be 301s. Every redirect in a chain bleeds a small amount of link equity. More importantly, long chains slow down crawling and can cause Googlebot to give up before reaching the final destination.
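A quick way to spot-check chains and status codes without a full crawler is sketched below. It assumes the requests library and a placeholder URL list; in practice you'd feed it the redirecting URLs from your crawl export.

```python
import requests  # third-party: pip install requests

URLS_TO_CHECK = [
    "http://example.com/old-page",
    "https://www.example.com/legacy/category/",
]  # placeholders

for url in URLS_TO_CHECK:
    try:
        resp = requests.get(url, allow_redirects=True, timeout=10)
    except requests.TooManyRedirects:
        print(f"LOOP (or a very long chain): {url}")
        continue
    hops = resp.history  # every intermediate redirect response
    if len(hops) > 1:
        chain = " -> ".join(f"{r.status_code} {r.url}" for r in hops)
        print(f"CHAIN, {len(hops)} hops: {chain} -> {resp.status_code} {resp.url}")
    for r in hops:
        if r.status_code == 302:
            print(f"302 that probably wants to be a 301: {r.url}")
```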
Structured data
Check whether you have schema markup implemented and whether it’s valid. Use Google’s Rich Results Test. Structured data doesn’t directly improve rankings, but it can improve how your pages appear in search results, which affects click-through rate. For content-heavy sites, Article, FAQ, and HowTo schema are the most commonly applicable types.
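If you're adding schema by hand or via a template, the sketch below shows the shape of minimal Article markup, generated here from a Python dict purely for illustration. Every field value is a placeholder, and the exact required and recommended properties for rich results should be checked against Google's structured data documentation.

```python
import json

# Minimal Article markup; every value here is a placeholder.
article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "SEO Audit Checklist: What to Fix, What to Ignore",
    "datePublished": "2026-01-15",
    "dateModified": "2026-02-01",
    "author": {"@type": "Person", "name": "Keith Lacy"},
}

# Embed the output inside a <script type="application/ld+json"> tag in the page <head>,
# then validate the live page with the Rich Results Test.
print(json.dumps(article_schema, indent=2))
```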
Section 3: On-Page SEO Audit
On-page SEO is the area most marketers feel most comfortable with, which is probably why it’s also the area where the most time gets wasted on low-impact work. Meta description length and keyword density are not the problems. Structural issues and misaligned intent are.
Title tags
Audit every page for: missing title tags, duplicate title tags, and title tags over roughly 60 characters (where they’re likely to be truncated in search results). More importantly, check that your title tags match the search intent of the queries you’re targeting. A title tag that reads like an internal naming convention rather than a user’s search query is a missed opportunity.
Google rewrites title tags it considers a poor match for the query. If you’re seeing your titles being rewritten frequently in Search Console, that’s a signal that your titles aren’t serving users as well as Google thinks they should be.
Header structure
Each page should have one H1 that clearly communicates what the page is about. H2s and H3s should create a logical hierarchy that helps both users and crawlers understand the content structure. Run a crawl and flag pages with: no H1, multiple H1s, or H1s that don’t include the primary keyword or a close variant.
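Any crawler will flag these for you, but if you want a quick script of your own, the sketch below uses requests and BeautifulSoup (both third-party) over a placeholder URL list. Checking whether the H1 contains the primary keyword is left out because that needs your keyword mapping as an input.

```python
import requests  # pip install requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

PAGES = [
    "https://www.example.com/",
    "https://www.example.com/blog/some-post/",
]  # placeholder list; feed in your crawl export in practice

for url in PAGES:
    html = requests.get(url, timeout=10).text
    h1s = [h.get_text(strip=True) for h in BeautifulSoup(html, "html.parser").find_all("h1")]
    if not h1s:
        print(f"NO H1: {url}")
    elif len(h1s) > 1:
        print(f"MULTIPLE H1s ({len(h1s)}): {url}")
```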
Keyword cannibalisation
This is one of the most common and most damaging issues on sites that have been publishing content for several years. Keyword cannibalisation happens when multiple pages on your site target the same or very similar queries. Google has to choose which page to rank, and it often gets it wrong, or ranks neither page as well as a single consolidated page would rank.
Run a site: search in Google for your core target terms (for example, site:yourdomain.com "target term"). If you see multiple pages from your own domain competing for the same query, you have a cannibalisation problem. The fix is usually to consolidate the weaker pages into the stronger one and redirect, or to clearly differentiate the intent each page serves.
I ran an audit for a B2B software client a few years ago and found 14 pages all loosely targeting variations of the same product category term. None of them ranked in the top 20. We consolidated them into three clearly differentiated pages. Within four months, two of those pages were ranking on page one. The content itself hadn’t changed significantly. The structure had.
Internal linking
Internal links distribute authority around your site and help Google understand which pages are most important. Audit your internal link structure for: orphaned pages (pages with no internal links pointing to them), pages with very few internal links, and anchor text that’s too generic (“click here”, “read more”) to pass contextual signals.
Your most important pages should have the most internal links pointing to them. If that’s not the case, your internal link structure is working against you.
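Finding orphans is a set difference: every URL you know about (from your sitemap or crawl) minus every URL that receives at least one internal link. A minimal sketch is below; the file names and column headings are assumptions about your crawler's export format, so adjust them to match.

```python
import csv

# Assumed inputs: links.csv with "source,target" rows from your crawler's internal
# link export, and pages.csv with one known URL per line (e.g. from the sitemap).
linked_to = set()
with open("links.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        linked_to.add(row["target"].strip())

with open("pages.csv", encoding="utf-8") as f:
    all_pages = {line.strip() for line in f if line.strip()}

for url in sorted(all_pages - linked_to):
    print(f"ORPHAN (no internal links pointing in): {url}")
```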
Image optimisation
Check that images have descriptive alt text, are served in modern formats (WebP where possible), and are appropriately sized. Large uncompressed images are one of the most common causes of slow page load times, particularly on mobile.
Section 4: Content Quality Audit
Content quality is harder to audit than technical issues because it requires judgement, not just a crawl report. But it’s often where the biggest gains are. Google has become increasingly good at distinguishing content that genuinely serves users from content that’s been written to rank.
Thin content
Thin content isn’t just about word count. A 2,000-word page can be thin if it says nothing of substance. But word count is a useful proxy. Flag any pages under 300 words that aren’t intentionally short (contact pages, landing pages with specific conversion goals). For blog content and informational pages, anything under 600 words deserves scrutiny.
Ask the honest question: if a user landed on this page from a Google search, would they find a satisfying answer? If the answer is no, the page either needs to be substantially improved or consolidated into a page that does answer the question properly.
Content freshness
Some queries are highly sensitive to freshness. “Best project management software” written in 2019 is unlikely to serve a user well in 2026. Audit your content for pages that rank for time-sensitive queries but haven’t been updated recently. Refreshing these pages (updating statistics, replacing outdated examples, adding new sections) can recover lost rankings without starting from scratch.
Duplicate content
Duplicate content dilutes your ranking signals. Run a crawl to identify pages with identical or near-identical content. Common sources include: product pages with multiple URL variants, print-friendly page versions, HTTP and HTTPS versions both being indexed, and boilerplate text that appears across large numbers of pages.
Canonical tags are the standard solution for most duplicate content issues. Make sure yours are implemented correctly and are pointing to the right pages. A canonical pointing to a noindexed page is a mistake I’ve seen more times than I care to count.
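That particular mistake is easy to test for. The sketch below (requests plus BeautifulSoup, placeholder URL) fetches a page's canonical target and flags it if the target isn't a clean, indexable 200. It assumes the canonical href is absolute; resolve relative hrefs with urljoin if your CMS emits them.

```python
import requests  # pip install requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

def check_canonical(url: str) -> None:
    """Flag canonicals that point at non-200 or noindexed targets."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    link = soup.find("link", rel="canonical")
    if link is None or not link.get("href"):
        print(f"NO CANONICAL: {url}")
        return
    target = link["href"]  # assumed absolute; use urllib.parse.urljoin for relative hrefs
    t_resp = requests.get(target, timeout=10)
    robots = BeautifulSoup(t_resp.text, "html.parser").find("meta", attrs={"name": "robots"})
    if t_resp.status_code != 200:
        print(f"CANONICAL TARGET RETURNS {t_resp.status_code}: {url} -> {target}")
    elif robots and "noindex" in robots.get("content", "").lower():
        print(f"CANONICAL POINTS AT A NOINDEXED PAGE: {url} -> {target}")

check_canonical("https://www.example.com/some-page/")  # placeholder
```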
Search intent alignment
For each page you’re trying to rank, search the target query in an incognito browser and look at what’s currently ranking. What format is it in? What depth does it go to? What questions does it answer? If the top-ranking content is all listicles and your page is a long-form essay, you’re misaligned with what Google has determined users want for that query. Format alignment matters as much as content quality.
Section 5: Backlink Profile Audit
Your backlink profile is a record of who has linked to your site and why. A strong profile accelerates ranking gains. A toxic one can trigger manual penalties or algorithmic suppression. Most sites sit somewhere in between, with a mix of genuinely valuable links, low-quality links that don’t help or hurt much, and a small number of links that create real risk.
Link quality assessment
Pull your backlink profile from Ahrefs, Semrush, or Majestic. Look at the referring domains rather than the raw link count. What’s the domain authority distribution? Are most of your links coming from low-quality directories, spun content sites, or irrelevant foreign-language domains? Or do you have a solid base of links from relevant, authoritative sources in your industry?
The Semrush off-page SEO checklist covers link quality assessment in detail, including how to identify link patterns that correlate with algorithmic risk.
Toxic link identification
Flag links from: link farms, private blog networks, sites with no discernible topical relevance, sites that have been penalised, and sites with manipulative anchor text patterns pointing to yours. The disavow tool exists for a reason, but use it carefully. Disavowing legitimate links is a mistake that’s hard to reverse.
Most sites don’t need to disavow anything. The links that look bad are usually just irrelevant rather than actively harmful. Save the disavow file for links that are clearly part of a manipulative scheme.
Anchor text distribution
An unnatural anchor text profile is a red flag. If 60% of your inbound links use the exact same keyword-rich anchor text, that’s a pattern that looks manipulated. A natural profile has a mix of branded anchors, generic anchors (“here”, “this article”), partial match anchors, and some exact match anchors. If yours skews heavily toward exact match, that’s worth noting even if it hasn’t triggered a penalty yet.
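The distribution is quick to calculate from any backlink export. The sketch below assumes a CSV with an "Anchor" column; the column name, the brand terms, and the target keywords are all placeholders to adjust, and the classification rules are deliberately crude.

```python
import csv
from collections import Counter

BRAND_TERMS = {"marketing juice"}      # your brand name variants
MONEY_TERMS = {"seo audit checklist"}  # your exact-match target keywords
GENERIC = {"here", "click here", "this article", "read more", "website", "link"}

def classify(anchor: str) -> str:
    a = anchor.lower().strip()
    if not a or a in GENERIC:
        return "generic"
    if any(b in a for b in BRAND_TERMS):
        return "branded"
    if a in MONEY_TERMS:
        return "exact match"
    if any(t in a for t in MONEY_TERMS):
        return "partial match"
    return "other"

counts = Counter()
with open("backlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        counts[classify(row["Anchor"])] += 1

total = sum(counts.values())
for bucket, n in counts.most_common():
    print(f"{bucket:>13}: {n:>5} ({n / total:.0%})")
```

If the exact match bucket dominates the output, that's the pattern worth noting in your audit even if no penalty has been triggered.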
Competitor link gap analysis
Look at who is linking to your top competitors but not to you. These are your highest-probability link acquisition targets because the sites have already demonstrated a willingness to link to content like yours. This is more productive than cold outreach to sites with no prior context.
Section 6: Local SEO Audit (If Applicable)
If your business has a physical location or serves customers in specific geographic areas, local SEO is a distinct set of signals that deserves its own audit section. Ignore it if it’s not relevant to your business model. Don’t ignore it if it is.
Google Business Profile
Claim and verify your Google Business Profile if you haven’t. Check that your name, address, and phone number are accurate and consistent with what appears on your website. Inconsistencies in NAP data across the web create confusion for both users and Google.
Review your category selections. Most businesses underuse secondary categories. Check that your business description includes relevant terms naturally. Look at your review volume and recency. Reviews are a significant local ranking signal.
Citation consistency
Citations are mentions of your business name, address, and phone number on other websites, typically directories like Yelp, Yell, or industry-specific listings. Audit your citations for consistency. Variations in how your address is formatted, whether you use a suite number, and how your business name is listed can all create signals that work against your local rankings.
Location pages
If you serve multiple locations, each location should have a dedicated page with unique content. Thin location pages that differ only in the city name are a common mistake and a waste of crawl budget. Each page should genuinely reflect what makes that location or service area distinct.
Section 7: User Experience and Engagement Signals
Google has never confirmed that it uses engagement metrics like bounce rate or time on page as direct ranking signals. But there’s a logical relationship between pages that users find genuinely useful and pages that rank well. The correlation is strong enough that it’s worth auditing.
Click-through rate from search
In Search Console, filter by query and look at your click-through rates. Pages that rank in positions 3 to 5 but have CTRs well below what you’d expect for those positions usually have a title tag or meta description problem. The page is visible but not compelling enough to click. This is often a quicker win than improving rankings, because the fix is a copy change, not a content overhaul.
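One way to surface those pages is to compare actual CTR against a rough benchmark for each position, as in the sketch below. The benchmark figures are illustrative only (real CTR curves vary enormously by query type and SERP features), and the column names assume a Search Console performance export, so rename them to match your file.

```python
import csv

# Illustrative CTR benchmarks by average position; treat these numbers as placeholders.
EXPECTED_CTR = {1: 0.28, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                6: 0.04, 7: 0.03, 8: 0.03, 9: 0.02, 10: 0.02}

with open("search_console_pages.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        position = round(float(row["Position"]))
        ctr = float(row["CTR"].rstrip("%")) / 100
        expected = EXPECTED_CTR.get(position)
        if expected and ctr < expected * 0.5:  # flag pages earning under half the benchmark
            print(f"Low CTR at position {position}: {row['Page']} "
                  f"({ctr:.1%} vs ~{expected:.0%} expected)")
```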
Bounce rate and engagement
High bounce rates on informational content aren’t necessarily a problem. If a user reads your article, gets the answer they needed, and leaves, that’s a successful interaction. The metric that matters more is whether users who arrive from organic search are engaging with your site in ways that indicate value: scrolling, clicking through to other pages, converting.
Look for pages with high organic traffic but unusually low engagement. These are candidates for content improvement or intent realignment.
Navigation and site architecture
A flat site architecture, where important pages are reachable within three clicks from the homepage, is generally better for both users and crawlers. Audit your site depth. Pages buried six or seven levels deep in your architecture are less likely to be crawled frequently and less likely to accumulate internal link equity.
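Click depth is a breadth-first search over your internal link graph, starting from the homepage. The sketch below uses a toy graph; in practice you'd build the LINKS dictionary from your crawler's internal link export.

```python
from collections import deque

# Toy internal link graph: page -> pages it links to.
LINKS = {
    "/": ["/blog/", "/products/"],
    "/blog/": ["/blog/post-a/", "/blog/post-b/"],
    "/products/": ["/products/widget/"],
    "/blog/post-a/": ["/products/widget/"],
    "/blog/post-b/": [],
    "/products/widget/": [],
    "/orphaned-landing-page/": [],
}

def click_depth(start: str = "/") -> dict:
    """Breadth-first search: depth = minimum number of clicks from the homepage."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in LINKS.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

depths = click_depth()
for page in LINKS:
    d = depths.get(page)
    label = "unreachable from the homepage" if d is None else f"{d} clicks from the homepage"
    print(f"{page}: {label}")
```

Anything reporting more than three or four clicks, or unreachable entirely, is a candidate for better internal linking.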
The HubSpot SEO audit overview covers site architecture considerations in the context of broader audit methodology, which is useful for teams running their first structured audit.
Section 8: Measurement and Reporting Setup
An audit that produces fixes without a measurement framework is incomplete. You need to know whether the changes you’ve made are working, and you need to be able to attribute improvements to specific actions rather than general trends.
Google Search Console configuration
Confirm that Search Console is set up correctly: the right property is verified and all domain variants are covered (www and non-www, HTTP and HTTPS), ideally with a single domain property. The old preferred domain setting no longer exists, so signal your preferred version with redirects and canonical tags instead. Check that email alerts are enabled for manual actions and indexing issues.
Analytics integration
Confirm that your analytics platform is correctly tracking organic traffic separately from other channels. Check that internal traffic is filtered out. Verify that goal or conversion tracking is configured for the outcomes that matter to your business, not just page views.
I’ve seen surprisingly large businesses with no conversion tracking on their organic channel. They know how much traffic SEO is driving. They have no idea how much revenue it’s generating. That’s not an analytics problem. It’s a business decision problem, and it’s one that consistently leads to SEO being underfunded because the commercial case can’t be made clearly.
Rank tracking setup
Set up rank tracking for your core target keywords before you start making changes. This gives you a baseline to measure against. Track at the keyword level, not just the aggregate traffic level, so you can see which specific changes are producing which specific results.
Be cautious about reading too much into short-term rank movements. Google tests rankings before settling on them. A page that drops three positions the week after you make changes may recover and improve over the following month. Weekly snapshots are useful for monitoring. Monthly trends are what you should be drawing conclusions from.
The Moz analysis of failed SEO tests is worth reading for any team that’s about to start making changes based on audit findings. Understanding why SEO tests fail is as useful as understanding what to test.
How to Prioritise Your Audit Findings
Once you’ve run through the checklist, you’ll have a list of issues. The question is what order to tackle them in. Here’s the framework I use.
Fix crawlability and indexation issues first. If Google can’t access your content, nothing else matters. Blocked pages, broken sitemaps, and incorrect noindex tags take priority over everything.
Address critical technical issues second. Redirect chains, mixed content warnings, and significant Core Web Vitals failures belong in this tier. They’re not as immediately blocking as crawlability issues, but they create drag on everything else.
Tackle content quality and cannibalisation third. These fixes take longer but often produce the most significant ranking improvements, particularly on sites with a substantial content archive. Consolidating cannibalised content and improving thin pages are consistently high-return activities.
Address on-page optimisation fourth. Title tags, header structure, and internal linking are worth improving, but they’re rarely the primary reason a site is underperforming. They’re more likely to be the difference between position 8 and position 4 than the difference between page 5 and page 1.
Backlink work runs in parallel. Link acquisition is a long-term activity that doesn’t fit neatly into a sequential audit workflow. Start the competitor gap analysis early, but don’t expect link building to produce results in weeks. It rarely does.
The Optimizely SEO checklist offers a useful cross-reference for teams that want to validate their prioritisation approach against an independent framework.
What a Good Audit Output Looks Like
A good audit output is not a 200-line spreadsheet of every issue the crawler found. It’s a prioritised action plan with three columns: the issue, the expected impact, and the effort required to fix it. High impact, low effort items go at the top. Low impact, high effort items go to the backlog or get dropped entirely.
Each action should have an owner, a deadline, and a success metric. “Fix thin content” is not an action. “Update the 12 product category pages flagged as thin in the crawl, adding at least 400 words of genuinely useful content to each, with a target of moving from ‘crawled, not indexed’ to indexed within 60 days” is an action.
Early in my agency career, I inherited a client who’d had three SEO audits from three different agencies in two years. None of them had produced meaningful improvements. When I looked at the audit reports, they were technically thorough and commercially useless. Pages of findings with no prioritisation and no accountability. The fourth audit, which we ran, was half the length and produced twice the results because we committed to fixing 12 specific things rather than cataloguing 200 possible problems.
That experience shaped how I think about audits. The deliverable is not the report. The deliverable is the improvement in organic performance that follows from acting on the report. Everything else is theatre.
If you’re using this checklist as part of a broader SEO programme rather than a standalone exercise, the Complete SEO Strategy hub connects audit findings to the keyword research, content, and link building work that turns audit insights into sustained organic growth.
About the Author
Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.
