SEO Auditing: What the Numbers Are Telling You

An SEO audit is a structured review of the factors that determine how well a website performs in organic search. Done properly, it identifies the technical, content, and authority issues that are suppressing rankings, and gives you a prioritised list of what to fix and in what order. Done poorly, it produces a 200-line spreadsheet that nobody acts on.

The difference between those two outcomes is almost never the tool you used. It is whether the person running the audit knows how to interpret what they are looking at, and whether they have the commercial judgment to separate urgent from cosmetic.

Key Takeaways

  • An SEO audit is only as useful as the prioritisation that follows it. A long list of issues without business context is just noise.
  • Most audit tools flag the same surface-level problems. The skill is in understanding which problems are actually costing you traffic and revenue.
  • Technical health, content quality, and link authority must be assessed together. Fixing one in isolation rarely moves the needle.
  • Crawl data and Google Search Console tell different stories. You need both, and you need to understand where they contradict each other.
  • A good audit ends with a ranked action plan tied to business impact, not a checklist of SEO hygiene tasks.

Why Most SEO Audits Produce No Change

I have been on the receiving end of SEO audits from some well-regarded agencies, and I have commissioned them as a client and delivered them as an agency CEO. The pattern that kills most of them is the same: the audit is treated as the deliverable, rather than the diagnosis. The agency hands over a document, the client files it, and nothing changes.

Part of this is structural. Audits are often sold as one-off projects, which means the agency has no stake in what happens after delivery. Part of it is that audit outputs are rarely translated into business language. A developer does not care that your crawl depth is too high. They care that Google cannot reach 40% of your product pages, which means those pages cannot rank, which means the paid search team is picking up the slack at significant cost per click.

The other problem is false precision. I have seen audits that score a site 67 out of 100 on some proprietary rubric, as if that number means anything. It does not. It is a tool vendor’s approximation dressed up as measurement. What you need from an audit is an honest read of where your site stands, what is genuinely broken, and what the realistic upside is if you fix it. That is harder to produce than a score, but it is the only version that drives action.

If you want the broader strategic context for where auditing fits into your organic search programme, the complete SEO strategy hub at The Marketing Juice covers the full picture, from technical foundations through to content and authority building.

The Three Layers Every Audit Must Cover

A complete SEO audit has three distinct layers, and they interact with each other. Treating them as separate exercises is one of the more common mistakes I see, particularly in audits run by technical specialists who are less comfortable with content, or content strategists who gloss over the crawl data.

Technical health is the foundation. It covers how well search engines can access, crawl, render, and index your content. Issues here include crawl errors, redirect chains, duplicate content, slow page speed, broken internal linking, and problems with how JavaScript is handled. None of these are glamorous, but they are the reason a site with strong content and decent links can still underperform. If Google cannot reliably crawl your site, everything else is academic.

Content quality is the second layer. This is not just about whether your pages are well-written. It is about whether each page has a clear purpose, whether it matches the intent behind the queries you are targeting, and whether there is unnecessary duplication or cannibalisation between pages competing for the same terms. I have worked with e-commerce clients who had hundreds of near-identical category pages, each slightly different in title and meta description, all competing with each other and none of them ranking well as a result.

Authority and link profile is the third layer. This covers the quality and relevance of external sites linking to you, the distribution of that link equity across your site, and whether you have any toxic or spammy links that could be suppressing performance. The role of links in ranking has been debated endlessly, but after managing hundreds of millions in ad spend across 30 industries and watching organic performance alongside paid, I can tell you that sites with strong, relevant link profiles consistently outperform those without them, all else being equal.

Technical Audit: What to Actually Look For

Start with crawl coverage. Use a tool like Screaming Frog or Sitebulb to crawl your site, then compare the pages discovered against your XML sitemap and against what Google Search Console reports as indexed. Discrepancies between these three sources tell you something important. If your crawler finds 3,000 pages but Google has only indexed 1,800, you have a crawl budget problem, a content quality problem, or both.
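That three-way comparison is just set arithmetic once you have exported the URL lists. A minimal sketch, assuming you have one-URL-per-line exports from your crawler, your sitemap, and Search Console (the file format and function names are illustrative, not from any specific tool):

```python
# Compare URL coverage across three sources: crawler export, XML sitemap,
# and Google Search Console's indexed pages.

def load_urls(path):
    """Read one URL per line, normalising whitespace and trailing slashes."""
    with open(path) as f:
        return {line.strip().rstrip("/") for line in f if line.strip()}

def coverage_report(crawled, sitemap, indexed):
    """Return the discrepancy sets that matter in an audit."""
    return {
        # Pages Google can reach but has not indexed: often a quality signal.
        "crawled_not_indexed": crawled - indexed,
        # Pages you declare important (sitemap) that the crawler never found:
        # likely orphaned or blocked internally.
        "in_sitemap_not_crawled": sitemap - crawled,
        # Indexed pages missing from the sitemap: the sitemap is stale.
        "indexed_not_in_sitemap": indexed - sitemap,
    }
```

Each of the three resulting sets points to a different class of problem, which is why the comparison is more useful than any single count.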

Redirect chains deserve more attention than they typically get. Every redirect adds latency and dilutes the link equity passing through it. A site that has been through multiple redesigns, CMS migrations, or domain changes often has redirect chains four or five hops long. That is not a theoretical problem. I have seen sites where a significant portion of inbound link equity was being lost to redirect chains that had never been cleaned up after a platform migration.
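Surfacing those chains from a crawler export is straightforward if you treat the redirects as a lookup table. A sketch, assuming you can export a simple {source URL: redirect target} mapping (the data shape is an assumption; most crawlers can produce something equivalent):

```python
def redirect_chain(url, redirects, max_hops=10):
    """Follow a URL through a {source: target} redirect map and return
    the full chain, stopping on a loop or after max_hops."""
    chain = [url]
    seen = {url}
    while chain[-1] in redirects and len(chain) <= max_hops:
        nxt = redirects[chain[-1]]
        if nxt in seen:  # redirect loop: report what we have
            break
        chain.append(nxt)
        seen.add(nxt)
    return chain

def long_chains(redirects, threshold=3):
    """Report entry-point URLs whose chains are `threshold` or more hops."""
    # Start only from URLs that are not themselves redirect targets,
    # so each chain is reported once from its true entry point.
    targets = set(redirects.values())
    return {
        url: redirect_chain(url, redirects)
        for url in redirects
        if url not in targets
        and len(redirect_chain(url, redirects)) - 1 >= threshold
    }
```

The fix is then mechanical: point every hop in a long chain directly at the final destination.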

Page speed matters, but not in the way most audit templates present it. A PageSpeed Insights score of 62 on mobile is not inherently a crisis. What matters is whether your Core Web Vitals are in the ranges that affect ranking, and whether your speed issues are concentrated on high-value pages or spread across the site. A slow homepage is less damaging than slow product or service pages where purchase intent is highest.

Internal linking is consistently undervalued in technical audits. How you link between pages signals to Google which pages you consider most important, and it distributes crawl budget and link equity across the site. Orphaned pages, pages with only one internal link, and pages buried four or five clicks from the homepage are all problems that are straightforward to fix and often have a meaningful impact on crawl coverage and rankings for deeper content.
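Click depth and orphan detection both fall out of a breadth-first search over the internal link graph. A sketch, assuming a hypothetical {page: [pages it links to]} mapping exported from your crawler:

```python
from collections import deque

def click_depths(links, home):
    """Breadth-first search from the homepage over internal links.
    Returns {url: clicks from home}; pages absent from the result are
    unreachable via internal links, i.e. orphaned."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

def audit_depth(links, all_pages, home, max_depth=3):
    """Split known pages into orphans and pages buried too deep."""
    depths = click_depths(links, home)
    orphans = set(all_pages) - set(depths)
    buried = {url for url, d in depths.items() if d > max_depth}
    return orphans, buried
```

The `max_depth` of three is a rule of thumb, not a Google threshold; the point is to flag pages that sit far from the homepage relative to their importance.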

On the topic of URLs, the structure matters less than many older SEO guides suggest. Google’s guidance on keywords in URLs, which Search Engine Land covered in detail via comments from Matt Cutts, has long been that URL keywords carry some weight but are a minor signal. Clean, logical URL structures are worth having for usability and for the small SEO benefit they provide, but they are not worth a major restructuring project unless your current URLs are actively problematic.

Content Audit: Separating Signal From Noise

A content audit is not a word count exercise. The question is not whether pages are long enough. The question is whether each page is earning its place in the index and serving a purpose that is distinct from every other page on the site.

Start by pulling your Google Search Console data and sorting pages by impressions and clicks over the last twelve months. This gives you a performance baseline. Then segment your pages into three buckets: pages that are performing well and should be protected, pages that are underperforming relative to their potential and should be improved, and pages that are generating almost no impressions and have no clear strategic purpose.
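The segmentation itself is simple once the Search Console export is in hand. A minimal sketch, assuming rows with per-page clicks and impressions; the thresholds are purely illustrative and should be set relative to your own site's numbers, not copied:

```python
def bucket_pages(pages, protect_clicks=100, improve_impressions=500):
    """Segment pages into protect / improve / review buckets.
    `pages` is a list of dicts with 'url', 'clicks', 'impressions'
    (e.g. a 12-month Search Console export)."""
    buckets = {"protect": [], "improve": [], "review": []}
    for p in pages:
        if p["clicks"] >= protect_clicks:
            buckets["protect"].append(p["url"])   # performing: do not break it
        elif p["impressions"] >= improve_impressions:
            buckets["improve"].append(p["url"])   # visible but under-clicked
        else:
            buckets["review"].append(p["url"])    # prune/merge candidate
    return buckets
```

The "review" bucket is deliberately not called "delete": each page in it still needs a human decision about strategic purpose before anything is removed.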

That third bucket is where most of the difficult decisions live. Thin content pages, outdated posts, near-duplicate category pages, and pages that were created for reasons that no longer apply to the business all sit here. The instinct is to keep everything, because removing content feels risky. But a large index of low-quality pages can suppress the overall perceived quality of a site in ways that affect your stronger pages too. Pruning is a legitimate strategy, not a last resort.

Cannibalisation is the other major content issue worth examining carefully. When two or more pages on your site are targeting the same or very similar queries, they compete with each other rather than reinforcing each other. Google has to choose which one to show, and it often chooses neither particularly well. I have worked on sites where consolidating three competing blog posts into one authoritative page produced a ranking improvement within weeks, simply because the signal was no longer being split.
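Cannibalisation candidates can be pulled directly from a query-plus-page Search Console export: any query where two or more distinct pages are earning impressions is worth a look. A sketch (the row shape is an assumption modelled on a typical export):

```python
from collections import defaultdict

def cannibalisation_candidates(rows, min_pages=2):
    """Group a Search Console query+page export and flag queries where
    `min_pages` or more distinct pages earn impressions for the same query.
    `rows` is a list of dicts with 'query', 'page', 'impressions'."""
    pages_by_query = defaultdict(set)
    for row in rows:
        if row["impressions"] > 0:
            pages_by_query[row["query"]].add(row["page"])
    return {
        query: sorted(pages)
        for query, pages in pages_by_query.items()
        if len(pages) >= min_pages
    }
```

Not every flagged query is true cannibalisation; two pages can legitimately serve different intents behind the same query, so the output is a review list, not a consolidation order.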

Getting budget approved to act on content audit findings is often harder than the audit itself. If you are making the case internally, Moz has a useful piece on getting SEO investment approved that covers how to frame the business case in terms that resonate with finance and leadership teams.

Link Audit: Quality Over Quantity

The link audit is where I see the most confusion between activity and quality. A site with 50,000 backlinks is not necessarily in better shape than one with 5,000. The questions that matter are: how many of those links come from genuinely relevant, authoritative sites? How many are from domains that appear to exist solely to sell links? And how is link equity distributed across the site, or is it all concentrated on the homepage?

Use Ahrefs or Semrush to pull your full backlink profile, then filter by referring domain rather than total links. One domain linking to you 400 times is not the same as 400 domains each linking once. Referring domain count is the more meaningful metric, and within that, you want to understand the distribution by domain authority and by topical relevance.
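The referring-domain rollup is a simple grouping exercise once you have the raw export. A sketch, assuming a flat list of linking URLs (both Ahrefs and Semrush exports include a source URL per link, though the exact column names vary):

```python
from collections import Counter
from urllib.parse import urlparse

def referring_domains(backlink_urls):
    """Collapse a raw list of linking URLs into a count per referring
    domain, so 400 links from one domain count as one referrer."""
    return Counter(
        urlparse(url).netloc.lower().removeprefix("www.")
        for url in backlink_urls
    )
```

The counts themselves are only the first pass; the domain authority and topical relevance assessment still has to be layered on top, which is where the tools' own metrics come in.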

Toxic link assessment has become somewhat overstated as a concern since Google has said it is quite good at ignoring low-quality links. But if you have a history of aggressive link building, a manual penalty, or a site that has changed hands, it is worth reviewing your disavow file and checking whether there are patterns of links that look manipulative. The disavow tool is not something to use casually, but it is there for situations where you have clear evidence of a problematic link profile.

Competitor link gap analysis is the most commercially useful part of a link audit. Pull the link profiles of your top three to five organic competitors, identify the domains linking to them but not to you, and assess whether those are realistic targets for your own link acquisition. This turns the audit from a backward-looking exercise into a forward-looking acquisition plan.

How to Use Google Search Console Properly in an Audit

Google Search Console is the most underused tool in most SEO audits, which is remarkable given that it is the one data source that comes directly from Google. Most people look at the Performance report, note their top queries and pages, and move on. That is leaving most of the value on the table.

The Coverage report is where the technical audit starts. It shows you which pages Google has indexed, which it has excluded and why, and which it has attempted to crawl but encountered errors on. The “Excluded” section is particularly revealing. Pages excluded because of a noindex tag are expected. Pages excluded because Google chose not to index them despite no directive telling it not to are a signal that Google does not consider that content worth indexing, which is a content quality issue, not a technical one.

The Core Web Vitals report shows you field data, meaning real user experience data collected from Chrome users, rather than the lab data you get from PageSpeed Insights. Field data and lab data often tell different stories. A page can score well in a lab test and still have poor field CWV if real users on slower connections or devices are having a worse experience. Always prioritise field data when making decisions about performance investment.

The Links report shows you which pages are receiving the most internal links and which external domains are linking to you most frequently. Cross-reference this with your crawl data. If your most important pages are not receiving proportionate internal links, that is a straightforward fix with a meaningful impact on how crawl budget is allocated.

One thing I always check in an audit is the query data for pages that are ranking in positions 5 through 20. These are pages that are close to generating meaningful traffic but are not quite there. Understanding which queries those pages are appearing for, and whether the content is genuinely addressing those queries or just tangentially related to them, often reveals quick wins that a purely technical audit would miss entirely.
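Filtering for those striking-distance rows is a one-pass scan over a query export. A sketch, where the field names mirror Search Console's reporting but the exact row shape and thresholds are assumptions to adapt:

```python
def striking_distance(rows, lo=5.0, hi=20.0, min_impressions=100):
    """Return query rows ranking in positions 5-20 with enough impressions
    to matter, sorted by opportunity: impressions that are not yet
    converting into clicks. `rows` is a list of dicts with 'query',
    'position', 'impressions', 'clicks'."""
    hits = [
        r for r in rows
        if lo <= r["position"] <= hi and r["impressions"] >= min_impressions
    ]
    return sorted(hits, key=lambda r: r["impressions"] - r["clicks"],
                  reverse=True)
```

The sort order matters: a page at position 8 with 10,000 impressions and almost no clicks is a far bigger opportunity than one at position 6 with 200 impressions.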

Prioritising Audit Findings: The Commercial Lens

This is the part that separates a useful audit from an expensive document. Every audit produces more issues than any team can realistically address in a reasonable timeframe. Prioritisation is not optional. It is the most important output of the whole exercise.

I use a simple framework when working through audit findings with clients. For each issue, I ask three questions. First, how many pages does this affect? An issue affecting 2,000 pages is structurally more important than one affecting 20. Second, what is the estimated traffic or revenue impact if this is fixed? This requires some judgment and honest approximation rather than precise calculation, but it forces you to think in business terms rather than SEO terms. Third, how complex is the fix? Some issues that affect many pages are actually straightforward to resolve at the template level. Others require significant development work and need to be weighed against competing priorities.
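The three questions translate into a rough scoring sketch. The formula and weights below are a judgment call, not a standard; the point is to force the breadth, value, and effort estimates onto the table where they can be argued about:

```python
def priority_score(pages_affected, est_monthly_value, effort_days):
    """Rank an audit finding: breadth of impact times estimated business
    value, discounted by implementation effort. Higher means more urgent.
    All three inputs are honest approximations, not measurements."""
    return (pages_affected * est_monthly_value) / max(effort_days, 1)

def rank_findings(findings):
    """`findings` is a list of dicts with 'issue', 'pages', 'value', 'days'."""
    return sorted(
        findings,
        key=lambda f: priority_score(f["pages"], f["value"], f["days"]),
        reverse=True,
    )
```

A template-level fix touching 2,000 pages will almost always outrank a hand-crafted fix touching 20, which is exactly the behaviour you want from the ranking.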

The output should be a tiered action plan: things to fix in the next 30 days, things to address in the next quarter, and things to monitor but not prioritise. That is a document a CEO or CFO can engage with. A 200-line spreadsheet of SEO issues with severity ratings of high, medium, and low is not.

When I was growing the agency from 20 to over 100 people, one of the things I learned about internal credibility is that it comes from being specific about what you expect to happen and why. Vague promises of “improved organic performance” do not survive contact with a quarterly business review. An audit that says “fixing these three redirect chains is estimated to restore crawl access to 400 product pages that are currently not indexed, which should improve organic revenue from those categories by a meaningful margin over the next six months” is a document that gets acted on.

The broader SEO strategy context matters here too. An audit finding only makes sense if you know where it fits in your overall organic programme. The complete SEO strategy framework on The Marketing Juice covers how technical health, content, and authority building connect into a coherent programme rather than a series of disconnected projects.

How Often Should You Run an SEO Audit?

The honest answer is that a full audit is not something most sites need to run on a fixed schedule. It is something you need when circumstances change: a site migration, a significant algorithm update, a meaningful drop in organic traffic, or a strategic shift in the business that changes which pages and queries matter most.

What you do need on a continuous basis is monitoring. Google Search Console alerts, crawl monitoring tools set to flag new errors, and regular checks on your Core Web Vitals data will catch most issues before they become significant problems. The full audit is the deep-dive that gives you the baseline and the strategic direction. Monitoring is what keeps you from drifting away from that baseline between audits.

For larger sites, particularly e-commerce or publishing sites with thousands of pages and frequent content changes, a quarterly review of key audit metrics makes sense. For smaller sites with more stable content, an annual audit combined with ongoing monitoring is usually sufficient. The frequency should be proportionate to the rate of change on the site and the competitive intensity of the market you are in.

One thing I would caution against is running audits as a response to every algorithm update. Google updates its algorithm constantly, and most updates do not require a full audit response. If your traffic drops sharply following a confirmed major update, that is worth investigating. But reflexively auditing the site after every search news cycle is a way to generate activity without generating progress.

The Measurement Problem in SEO Auditing

One of the things that frustrates me about how audits are typically presented is the false precision embedded in the scoring systems. Tool vendors have a commercial interest in making their diagnostic outputs look authoritative, so they produce scores, grades, and benchmarks that imply a level of certainty that the underlying data does not support.

Google does not tell you why your site ranks where it does. It does not confirm which of your technical issues it considers significant. The signals we measure are proxies for what Google is actually evaluating, and those proxies are imperfect. An honest audit acknowledges this. It says “we believe these issues are suppressing performance based on the patterns we can observe” rather than “your site scored 58 out of 100 on technical health.”

This is not an argument for not measuring. It is an argument for measuring honestly. The best practitioners I have worked with over two decades are the ones who can hold two things at once: a rigorous, data-driven approach to diagnosis, and genuine intellectual humility about the limits of what the data can tell them. That combination produces better decisions than either pure data faith or pure instinct.

When I judged the Effie Awards, the entries that stood out were not the ones with the most sophisticated measurement frameworks. They were the ones where the team clearly understood what they were measuring, why it mattered, and what its limitations were. The same principle applies to SEO auditing. Know what your tools are telling you, know what they cannot tell you, and make decisions accordingly.

About the Author

Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.

Frequently Asked Questions

What is an SEO audit and what does it cover?
An SEO audit is a structured review of the factors affecting a website’s organic search performance. A complete audit covers three areas: technical health (crawlability, indexation, page speed, internal linking), content quality (relevance, duplication, cannibalisation, intent alignment), and link authority (quality and distribution of external backlinks). Each layer affects the others, which is why auditing them in isolation tends to produce incomplete findings.
How long does an SEO audit take?
For a small to medium-sized site with under 1,000 pages, a thorough audit typically takes between one and two weeks, including crawl analysis, Search Console review, content assessment, and link profile analysis. Larger sites with tens of thousands of pages can take four to six weeks for a complete audit. The timeline depends heavily on how well-organised the site’s data is and how much historical context is available from previous audits or analytics configurations.
What tools are needed to run an SEO audit?
The core toolkit for a professional SEO audit includes a crawler (Screaming Frog or Sitebulb), a backlink analysis tool (Ahrefs or Semrush), and Google Search Console. Google Analytics or a comparable analytics platform provides the traffic and conversion context that turns technical findings into business-relevant priorities. PageSpeed Insights or the Chrome User Experience Report adds Core Web Vitals data. No single tool covers everything, and the skill is in reconciling what different tools report rather than relying on any one source.
How do you prioritise SEO audit findings?
Prioritisation should be based on three factors: the number of pages affected, the estimated business impact if the issue is resolved, and the complexity of the fix. Issues affecting hundreds of pages that can be resolved at the template level rank higher than issues affecting a handful of pages that require significant development work. The output should be a tiered action plan with clear timelines, not a flat list of issues sorted by a tool’s severity rating.
How often should an SEO audit be conducted?
A full audit is most valuable when triggered by a specific event: a site migration, a significant drop in organic traffic, a major algorithm update, or a strategic change in the business. Between audits, ongoing monitoring through Google Search Console and crawl alerting tools is sufficient to catch emerging issues. For large sites with frequent content changes, a quarterly review of key audit metrics makes sense. For smaller, more stable sites, an annual audit combined with continuous monitoring is typically adequate.
