Website Audit: What’s Actually Broken and How to Fix It

A website audit is a systematic examination of your site’s technical health, on-page optimisation, and content quality to identify what is preventing it from ranking, converting, or performing at the level it should. Done properly, it gives you a prioritised list of problems worth solving, not a 200-line spreadsheet of things that technically could be improved.

Most audits fail not because the tools are wrong, but because the person running them doesn’t know which findings matter commercially. This article breaks down how to run one that produces useful decisions, not just data.

Key Takeaways

  • A website audit is only as useful as the prioritisation that follows it. Fixing 200 low-impact issues while ignoring three critical ones is a common and expensive mistake.
  • Technical, on-page, and content audits are distinct workstreams. Running them together without separating outputs creates confusion about what to fix first.
  • Crawlability and indexability are not the same thing. A page can be crawlable but deliberately excluded from the index, and vice versa. Conflating the two leads to the wrong fixes.
  • Core Web Vitals are a ranking signal, but their commercial impact varies significantly by industry and audience. Treat them as context, not a universal emergency.
  • The most valuable output of any audit is a ranked action list tied to business impact, not a technical report that lives in a shared drive.

Early in my career, around 2000, I asked the MD of the agency I was working at for budget to rebuild our website. The answer was no. So I went away, taught myself to code, and built it myself over a few weekends. What that experience gave me, beyond a functional site, was a working understanding of how websites are actually constructed. That foundation has been more useful in client conversations and audits than any tool I’ve used since. You don’t need to be able to code to run a good audit. But you do need to understand what you’re looking at when the data comes back.

What Is a Website Audit and What Should It Actually Produce?

The term “website audit” gets used loosely. Some people mean a technical crawl. Others mean a content review. Some agencies use it as a sales tool, producing a long report full of amber warnings to justify a retainer. None of those things are inherently wrong, but they are different, and conflating them produces confusion.

A proper website audit has three distinct components: technical health, on-page optimisation, and content quality. Each feeds into the others, but each requires a different lens. Technical issues can prevent good content from ranking. Thin content can undermine an otherwise clean technical setup. On-page problems can dilute the signal even when the underlying content is strong.

The output that matters is not a report. It is a ranked action list. Which problems are costing you the most? Which fixes will have the greatest impact relative to the effort required? That prioritisation is where most audits fall down, and it is where commercial judgement matters more than technical knowledge.

If you are building or refining a broader SEO programme, the audit sits at the foundation. The Complete SEO Strategy Hub covers the full picture, from technical groundwork through to content and measurement. The audit is where you establish what you are working with before deciding what to build.

Technical Audit: The Infrastructure Layer

Technical SEO is the part of the audit most people start with, and for good reason. If your site cannot be crawled and indexed correctly, nothing else matters. But technical audits also have the lowest signal-to-noise ratio of any audit type: the tools will surface hundreds of issues, and most of them are not worth your time.

Crawlability and Indexability

These are not the same thing, and treating them as interchangeable is one of the more common mistakes I see. Crawlability refers to whether Googlebot can access and follow a page. Indexability refers to whether Google can include that page in its index. A page can be crawlable but blocked from the index via a noindex tag. A page can be indexable in theory but practically invisible if it has no internal links pointing to it.

Semrush’s breakdown of crawlability and indexability is a solid reference if you want the technical definitions in detail. For audit purposes, the practical check is: are the pages you want indexed actually being indexed, and are any pages you do not want indexed being accidentally crawled and wasting your crawl budget?

Start with Google Search Console. The Page indexing report (formerly Coverage) will show you which pages are indexed, which are excluded, and why. Cross-reference that against your sitemap. If pages you care about are sitting in the excluded category with reasons like “Crawled, currently not indexed” or “Discovered, currently not indexed”, that is a signal worth investigating before you touch anything else.
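
The cross-reference itself is simple set logic. A minimal sketch, using an illustrative inline sitemap and a hand-typed stand-in for a Search Console indexed-URL export (in practice you would parse your real sitemap.xml and a GSC export):

```python
# Sketch: cross-reference sitemap URLs against indexed URLs.
# SITEMAP_XML and `indexed` are illustrative stand-ins for real data.
import xml.etree.ElementTree as ET

SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/services</loc></url>
  <url><loc>https://example.com/blog/post-1</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.findall("sm:url/sm:loc", NS)}

def coverage_gaps(sitemap, indexed):
    """Pages you asked Google to index but it hasn't, and vice versa."""
    return sorted(sitemap - indexed), sorted(indexed - sitemap)

indexed = {"https://example.com/", "https://example.com/old-page"}
missing, unexpected = coverage_gaps(sitemap_urls(SITEMAP_XML), indexed)
print(missing)      # sitemap URLs not indexed -> investigate these first
print(unexpected)   # indexed URLs not in the sitemap -> check they're intentional
```

The first list is where you start: pages you have explicitly asked Google to index that it has declined to include.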

Site Architecture and Internal Linking

How your site is structured affects both crawlability and PageRank distribution. Pages buried four or five clicks from the homepage receive less crawl attention and less internal link equity. For most sites, nothing important should be more than three clicks from the homepage.

Run a crawl using Screaming Frog, Ahrefs Site Audit, or Semrush’s technical audit tool. Look at crawl depth distribution. If you have commercially important pages sitting at depth four or five, that is an architecture problem, not a content problem. The fix is restructuring your internal linking, not rewriting the page.

Orphan pages are a related issue. These are pages that exist on your site but have no internal links pointing to them. They may be indexed if they appear in your sitemap, but they receive no internal equity and are often forgotten content that is diluting your overall quality signal. Identify them, decide whether they are worth keeping, and either link to them properly or remove them.
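
Both crawl depth and orphan detection fall out of one breadth-first pass over your internal-link graph. A sketch using a toy graph (your crawl tool exports the real link data):

```python
# Sketch: compute click depth from the homepage and find orphan pages.
# The `links` graph is illustrative; a real audit would build it from a crawl export.
from collections import deque

links = {
    "/": ["/services", "/blog"],
    "/services": ["/services/audit"],
    "/blog": ["/blog/post-1"],
    "/blog/post-1": [],
    "/services/audit": [],
    "/old-landing-page": [],   # known to exist, but nothing links to it
}

def crawl_depths(graph, start="/"):
    depths, queue = {start: 0}, deque([start])
    while queue:
        page = queue.popleft()
        for target in graph.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

depths = crawl_depths(links)
orphans = sorted(set(links) - set(depths))          # unreachable from "/"
deep = sorted(p for p, d in depths.items() if d > 3)  # beyond three clicks
print(orphans)  # ['/old-landing-page']
print(deep)     # empty here; on a real site these are your architecture problems
```

Anything in `deep` that matters commercially needs internal links closer to the homepage; anything in `orphans` needs a link or a removal decision.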

Page Speed and Core Web Vitals

Core Web Vitals became a confirmed Google ranking signal in 2021. The three metrics are Largest Contentful Paint (loading performance), Interaction to Next Paint (responsiveness), and Cumulative Layout Shift (visual stability); Interaction to Next Paint replaced First Input Delay as the responsiveness metric in 2024. Google’s PageSpeed Insights and the Core Web Vitals report in Search Console are your primary data sources here.

A word of caution on prioritisation. I have worked with clients who spent months chasing Core Web Vitals improvements while their content and link profile were the actual limiting factors. For most sites, the ranking impact of moving from “Needs Improvement” to “Good” on CWV is real but modest. If your LCP is sitting at 4.5 seconds, fix it. If it is at 2.7 seconds and you are chasing 2.4, there are almost certainly better places to spend your time.
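
Google publishes fixed “Good / Needs Improvement / Poor” thresholds for each metric, which makes the triage mechanical. A small sketch encoding those published thresholds (the example values mirror the LCP scenario above):

```python
# Google's published Core Web Vitals thresholds:
# LCP in seconds, INP in milliseconds, CLS unitless.
THRESHOLDS = {
    "LCP": (2.5, 4.0),
    "INP": (200, 500),
    "CLS": (0.1, 0.25),
}

def rate(metric, value):
    good, needs_improvement = THRESHOLDS[metric]
    if value <= good:
        return "Good"
    if value <= needs_improvement:
        return "Needs Improvement"
    return "Poor"

print(rate("LCP", 4.5))  # Poor -> fix it
print(rate("LCP", 2.7))  # Needs Improvement -> probably not your best use of time
```

A “Poor” rating on a high-traffic page is worth acting on; nudging a page from 2.7 to 2.4 seconds rarely is.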

The Ahrefs advanced site audit webinar covers technical prioritisation in depth and is worth watching if you are running audits at scale.

JavaScript Rendering

If your site relies heavily on JavaScript to render content, your audit needs to account for how Googlebot handles that rendering. Google can render JavaScript, but it does so in a second wave, and there are known issues with certain frameworks and implementations. Content that requires JavaScript to display may not be indexed in the way you expect.

The Moz guide to auditing JavaScript SEO is the most thorough treatment of this I have seen. If your site is built on a JavaScript-heavy framework, this is worth reading before you draw conclusions from your crawl data.

HTTPS, Redirects, and Canonicalisation

These are table stakes. Your site should be fully on HTTPS with no mixed content warnings. Your redirect chains should be clean. Ideally, every redirect is a single hop, not a chain of three or four. And your canonical tags should be consistent with your intended URL structure.
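
Redirect chains are easy to surface from crawl data once you have a mapping of source URL to redirect target. A sketch with an illustrative three-hop chain (a real audit would build `redirects` from your crawl export or by following responses with an HTTP client):

```python
# Sketch: walk a redirect mapping to count hops and catch loops.
# The `redirects` dict is illustrative data, not a live crawl.
redirects = {
    "http://example.com/old": "https://example.com/old",
    "https://example.com/old": "https://example.com/old/",
    "https://example.com/old/": "https://example.com/new",
}

def redirect_chain(url, redirects, limit=10):
    chain = [url]
    while chain[-1] in redirects and len(chain) <= limit:
        nxt = redirects[chain[-1]]
        if nxt in chain:          # loop: bail out rather than spin forever
            return chain + [nxt]
        chain.append(nxt)
    return chain

chain = redirect_chain("http://example.com/old", redirects)
hops = len(chain) - 1
print(hops)  # 3 hops -> collapse to a single 301 straight to the final URL
```

Any chain longer than one hop is a candidate for collapsing: point the original URL directly at the final destination.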

Canonical tag issues are surprisingly common. I have audited sites where the canonical tag on a page pointed to a different URL than the one actually ranking, which creates confusion about which version Google should treat as authoritative. Check that your canonicals are self-referencing where appropriate, and that you are not accidentally canonicalising paginated content to the first page in a series.

On-Page Audit: The Signal Layer

Once you have established that your site can be crawled and indexed correctly, the on-page audit looks at whether the signals on each page are clearly communicating what that page is about and why it deserves to rank.

Title Tags and Meta Descriptions

Title tags remain one of the clearest on-page signals available to you. They should be unique, descriptive, and front-loaded with the primary keyword for that page. The common problems are: duplicate title tags across multiple pages, title tags that are too long and get truncated in SERPs, and title tags that do not reflect the actual search intent of the page.

Meta descriptions do not directly influence rankings, but they influence click-through rate, which has downstream effects. A crawl will surface missing or duplicate meta descriptions. The fix is straightforward, but write them to earn the click, not just to fill the field.

Heading Structure

One H1 per page. It should match or closely reflect the title tag and the primary search intent. H2s and H3s should structure the content logically, not just break up walls of text. Heading structure is both a user experience signal and a relevance signal. If your H2s are not telling Google what the subsections of your page are about, you are leaving context on the table.
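
Checking the heading outline does not need a full crawler; the standard library’s HTML parser is enough for a spot check. A sketch, assuming headings appear in source order and using an illustrative HTML snippet:

```python
# Sketch: extract h1-h3 tags so the page outline can be audited.
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    """Collects (tag, text) pairs for h1-h3 in document order."""
    def __init__(self):
        super().__init__()
        self.headings, self._open = [], None
    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3"):
            self._open = tag
    def handle_data(self, data):
        if self._open and data.strip():
            self.headings.append((self._open, data.strip()))
            self._open = None
    def handle_endtag(self, tag):
        if tag == self._open:
            self._open = None

html = "<h1>Website Audits</h1><h2>Technical</h2><h1>Second H1</h1>"
audit = HeadingAudit()
audit.feed(html)
h1_count = sum(1 for tag, _ in audit.headings if tag == "h1")
print(h1_count)  # 2 -> flag: there should be exactly one H1 per page
```

Anything other than exactly one H1, or H2s that read as decoration rather than a table of contents, goes on the fix list.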

The relationship between heading structure, content design, and SEO is covered well in HubSpot’s article on SEO and web design. The short version: structure is not decoration.

Keyword Alignment and Search Intent

This is where the on-page audit connects to your keyword strategy. For each important page, the question is: does this page match the search intent of the keyword it is targeting? A page targeting a transactional keyword should have a clear conversion path. A page targeting an informational keyword should answer the question comprehensively.

If you have not done thorough keyword research before running your on-page audit, the audit will surface symptoms without helping you understand the cause. Keyword research is the upstream work that makes on-page decisions coherent. Without it, you are guessing about intent.

Look at the pages that are ranking on page two or three for terms you care about. Often the issue is not technical. It is that the page is targeting a keyword but not satisfying the intent behind it. The user searching that term wants something the page is not delivering.

Duplicate Content

Duplicate content does not trigger a manual penalty in most cases, but it creates confusion about which version of a page Google should rank. Common sources include www versus non-www versions of URLs, HTTP versus HTTPS, trailing slash versus no trailing slash, and session parameters appended to URLs.
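
All four of those variant types collapse onto one canonical form with a normalisation function. A sketch, assuming you have standardised on https without www and without trailing slashes, and treating the listed tracking parameters as strippable (adjust both assumptions to your own URL policy):

```python
# Sketch: normalise common duplicate-URL variants onto one canonical form.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}

def normalise(url):
    scheme, netloc, path, query, _ = urlsplit(url)
    netloc = netloc.lower().removeprefix("www.")   # assumes non-www is canonical
    path = path.rstrip("/") or "/"                 # assumes no trailing slash
    kept = [(k, v) for k, v in parse_qsl(query) if k not in TRACKING_PARAMS]
    return urlunsplit(("https", netloc, path, urlencode(kept), ""))

variants = [
    "http://www.example.com/page/",
    "https://example.com/page",
    "https://example.com/page?utm_source=newsletter",
]
print(len({normalise(u) for u in variants}))  # 1 -> all three are the same page
```

Running your crawl’s URL list through this kind of function shows how many distinct pages you actually have versus how many URL variants Google might be seeing.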

Your crawl tool will surface internal duplication. For external duplication, where your content appears on other sites, tools like Copyscape can identify the issue. The fix depends on the source. If it is a syndication arrangement, canonical tags pointing back to your original are the standard approach. If someone has scraped your content without permission, that is a different conversation.

Image Optimisation

Unoptimised images slow pages down and forfeit relevance signals. Check for missing alt text, oversized image files, and images that are not served in modern formats like WebP. Alt text is both an accessibility requirement and a relevance signal: it should describe what the image shows, not stuff keywords into every field.

Content Audit: The Substance Layer

The content audit is the part most people skip or treat as an afterthought. It is also, in my experience, where the most significant ranking improvements come from. Technical issues tend to have a ceiling: fix them and you remove blockers. Content improvements have an upside: get them right and you actively earn ranking and traffic.

Content Inventory and Quality Assessment

Start by pulling a complete list of your indexed pages. For each page, record the primary keyword target, the current ranking position, the organic traffic it receives, and the last time it was updated. This inventory is the foundation of your content audit.

Then segment the content into three buckets. Pages that are performing well and need maintenance. Pages that are underperforming relative to their potential and need improvement. Pages that are providing no value and should be consolidated or removed.

That third bucket is where most sites have the most to gain. Thin content, outdated posts, and near-duplicate articles targeting the same keyword are a drag on your overall quality signal. Google’s quality assessment of your site is not purely page-by-page. A site with a high proportion of low-quality pages is evaluated differently from one where the majority of content is genuinely useful.
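
The three-bucket triage can be expressed as a simple rule over the inventory fields described above. A sketch with hypothetical thresholds (position in the top ten plus meaningful traffic means maintain; top thirty means improve; otherwise consolidate or remove — tune these to your own site):

```python
# Sketch: triage a content inventory into the three buckets.
# The thresholds and the inventory rows are illustrative.
def bucket(page):
    if page["position"] and page["position"] <= 10 and page["traffic"] >= 100:
        return "maintain"
    if page["position"] and page["position"] <= 30:
        return "improve"
    return "consolidate_or_remove"

inventory = [
    {"url": "/guide", "position": 4, "traffic": 850},
    {"url": "/blog/old-take", "position": 24, "traffic": 30},
    {"url": "/blog/thin-post", "position": None, "traffic": 0},
]
for page in inventory:
    print(page["url"], bucket(page))
```

The point is not the specific cut-offs; it is that the triage is rule-based and repeatable, so the quarterly re-run takes minutes rather than days.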

Keyword Cannibalisation

Keyword cannibalisation happens when multiple pages on your site are competing for the same search term. Google has to choose which page to rank, and it may not choose the one you want. The symptom is typically a fluctuating ranking position for a term you care about, with different pages appearing in SERPs at different times.

The fix is consolidation. Identify the strongest page for that keyword, redirect the weaker ones to it, and ensure the surviving page is comprehensively covering the topic. This is not about removing content for the sake of it. It is about concentrating your signal rather than diluting it.
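
Detection is straightforward from rank-tracking data: group ranking URLs by keyword and flag any keyword with more than one. A sketch over illustrative (keyword, URL) rows:

```python
# Sketch: find keywords where multiple URLs on the same site are ranking.
# The `rows` data is an illustrative stand-in for a rank-tracking export.
from collections import defaultdict

rows = [
    ("website audit", "/blog/audit-guide"),
    ("website audit", "/services/audits"),
    ("website audit checklist", "/blog/audit-checklist"),
]

def cannibalised(rows):
    urls_by_kw = defaultdict(set)
    for kw, url in rows:
        urls_by_kw[kw].add(url)
    return {kw: sorted(urls) for kw, urls in urls_by_kw.items() if len(urls) > 1}

print(cannibalised(rows))
```

Each flagged keyword then gets the consolidation treatment: pick the strongest page, redirect the rest into it.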

I worked with a financial services client a few years ago that had published 14 separate articles targeting variations of the same core keyword. Each one had a handful of links, some organic traffic, and its own thin treatment of the topic. We consolidated them into three well-structured pieces, redistributed the internal linking, and within four months two of those three pages had moved from page three to page one. The content had not fundamentally changed. The signal had been concentrated.

Content Freshness

For some topics, freshness is a significant ranking factor. News, current events, and rapidly evolving topics like regulation or technology benefit from regular updates. For evergreen topics, freshness matters less, but outdated information is a trust signal problem regardless of rankings.

Your content inventory should flag pages that have not been updated in more than 18 months and are targeting competitive or time-sensitive terms. A systematic refresh programme, updating statistics, adding new context, improving structure, is more efficient than producing new content when you have existing pages that already have some authority.

E-E-A-T Signals

Experience, Expertise, Authoritativeness, and Trustworthiness are not a checklist. They are a framework Google uses to assess whether content is produced by people who actually know what they are talking about. For your audit, the relevant questions are: Does the content demonstrate genuine expertise? Are authors identified and credible? Does the site carry clear trust signals such as contact information, a privacy policy, and about pages? Does the content cite credible external sources where appropriate?

This matters more in some verticals than others. Health, finance, and legal content are held to a higher standard because the consequences of bad information are more serious. But every site benefits from being explicit about who is behind the content and why they are qualified to produce it.

Backlink Audit: The Authority Layer

A full website audit should include a review of your backlink profile. Not because you need to obsess over every link, but because your link profile tells you things about your site’s authority and potential vulnerabilities that technical and content data cannot.

Link Profile Health

Pull your backlink data from Ahrefs or Semrush. Look at the distribution of linking domains, the quality of those domains, and the anchor text distribution. A healthy link profile has a natural mix of branded anchors, generic anchors, and keyword-rich anchors. An over-optimised anchor text profile, where a high proportion of links use exact-match keywords, is a risk signal.

Look for patterns in your referring domains. Are your links concentrated in a few low-quality directories? Are there spikes in link acquisition that coincide with previous ranking drops? These are not always problems, but they are worth understanding.

For sites that are actively building links, the audit should also assess whether your link acquisition strategy is producing links that actually move the needle. SEO outreach services can accelerate link acquisition, but the quality and relevance of those links matters far more than the volume.

Disavow File Review

If your site has an existing disavow file, review it as part of the audit. Disavow files can become outdated. Links that were disavowed years ago may now be from legitimate domains that have changed ownership. Conversely, new toxic links that have accumulated since the file was last updated may need to be added. This is not a monthly task, but it should be part of a thorough annual audit.

How Different Site Types Change the Audit Focus

A website audit is not a single fixed process. The priorities shift depending on what the site is trying to do and who it is trying to reach. Understanding that context before you start saves significant time.

B2B Sites

B2B sites typically have smaller organic traffic volumes but higher commercial value per visitor. The audit focus for a B2B site should weight content quality and search intent alignment heavily. A B2B buyer searching for a solution is not looking for a thin overview. They are looking for depth, credibility, and specificity.

If you are working with a B2B organisation and considering whether specialist SEO input is worth it, the B2B SEO consultant guide covers what to look for and what to expect from that engagement.

Local Service Businesses

For local businesses, the audit has an additional layer: local SEO signals. This includes Google Business Profile optimisation, NAP consistency across citations, local schema markup, and the presence of location-specific content on the site.

The technical and content fundamentals still apply, but for a local business, a perfectly optimised site that has not claimed and optimised its Google Business Profile is leaving significant visibility on the table. The local SEO framework for plumbers illustrates how these elements work together in a competitive local service market. The same principles apply across most local verticals.

For specialist professional practices, the audit needs to account for the trust signals specific to that sector. SEO for chiropractors is a useful case study in how technical, local, and content elements combine for a regulated health practice. The audit priorities for a chiropractic clinic differ from those for a plumber, even though both are local service businesses.

E-commerce Sites

E-commerce audits have their own specific challenges. Faceted navigation creates enormous numbers of URL variations that can overwhelm crawl budget and create duplicate content at scale. Product pages often have thin content because the product description is the same across multiple retailers. Category page optimisation is frequently neglected in favour of product-level work.

For e-commerce, the audit should specifically assess: crawl budget management, faceted navigation handling, structured data implementation for products and reviews, and the quality of category page content. These are the areas where e-commerce sites most commonly leave ranking potential unrealised.

Tools You Actually Need

There is no shortage of audit tools. The question is which ones are worth paying for and which ones are sufficient for the task at hand.

For a thorough technical audit, you need a crawl tool. Screaming Frog is the standard for most practitioners. The free version handles up to 500 URLs, which is sufficient for smaller sites. For larger sites, the paid version is worth it. Ahrefs Site Audit and Semrush’s technical audit tool are credible alternatives, particularly if you are already paying for those platforms.

Google Search Console is non-negotiable and free. It gives you data that no third-party tool can replicate because it comes directly from Google. The Page indexing report, Core Web Vitals data, and the Performance report showing which queries are driving impressions and clicks should be the starting point for every audit.

Google Analytics 4 (GA4) provides the user behaviour data that contextualises your technical findings. High bounce rates on specific pages, poor engagement metrics, and conversion funnel drop-offs are symptoms that the audit should explain, not just catalogue.

For backlink analysis, Ahrefs and Semrush are the two credible options. Their link databases differ, and neither is complete, but either gives you a working picture of your link profile. The Semrush technical audit guide is a useful reference for understanding how to interpret the outputs from their platform specifically.

For user experience and conversion signals, Crazy Egg’s guide to scoring your site’s SEO covers some of the qualitative dimensions that crawl tools miss. Heatmaps and session recordings can reveal usability problems that explain why technically sound pages are not converting.

How to Prioritise Audit Findings

This is where most audits fail. The tool produces a list. The list gets exported to a spreadsheet. The spreadsheet sits in a shared drive. Nothing happens.

When I ran the SEO division at iProspect, we managed audit processes across dozens of clients simultaneously. The teams that produced the most results were not the ones with the most thorough reports. They were the ones who could translate findings into a clear, ranked action list and then actually execute against it. The audit is a means to an end, not the end itself.

Prioritise findings using two axes: impact and effort. High impact, low effort fixes go first. These typically include fixing broken redirects, resolving indexation errors for important pages, updating outdated title tags on high-traffic pages, and consolidating cannibalising content.

High impact, high effort work, such as a site architecture restructure or a comprehensive content consolidation programme, needs to be sequenced into a roadmap with realistic timelines. It cannot all happen at once, and trying to do everything simultaneously produces nothing.

Low impact findings, and there will be many, should be documented and deprioritised. Not ignored permanently, but not allowed to consume time that should go to work that moves the needle. A missing meta description on a page that receives three visits a month is not where your energy should go.
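
The two-axis ranking reduces to a sort. A sketch using hypothetical findings with illustrative impact and effort scores on a 1-10 scale (the scores are the judgement call; the arithmetic is not):

```python
# Sketch: rank audit findings by impact relative to effort.
# The findings and their 1-10 scores are illustrative.
findings = [
    {"issue": "Indexation errors on 3 revenue pages", "impact": 9, "effort": 2},
    {"issue": "847 missing meta descriptions",        "impact": 2, "effort": 6},
    {"issue": "Site architecture restructure",        "impact": 8, "effort": 9},
]

ranked = sorted(findings, key=lambda f: f["impact"] / f["effort"], reverse=True)
for f in ranked:
    print(f["issue"])
```

Scoring impact honestly is the hard part, and it is where commercial judgement earns its keep; once the scores exist, the ordering is mechanical and defensible in front of a stakeholder.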

The Moz Whiteboard Friday on SEO auditing success addresses this prioritisation challenge directly and is worth watching before you present findings to a client or internal stakeholder.

How Often Should You Run an Audit?

A full website audit is not a monthly task. For most sites, a comprehensive audit once a year is appropriate, with lighter quarterly checks on the technical fundamentals and content performance.

There are specific triggers that should prompt an unscheduled audit. A significant unexplained drop in organic traffic. A major site migration or redesign. A Google algorithm update that appears to have affected your rankings. A change in your primary keyword targets. Any of these events warrants a focused audit to understand what has changed and why.

Site migrations in particular are an area where audits are consistently underinvested. I have seen migrations that destroyed years of accumulated ranking authority because the redirect mapping was incomplete or the new site had technical issues that were not caught before launch. A pre-migration audit and a post-migration audit are not optional. They are the minimum.

What a Good Audit Report Looks Like

If you are producing an audit for a client or presenting findings internally, the format matters. A 50-page technical document is not a deliverable. It is a liability. Nobody reads it, nobody acts on it, and it creates the impression that the work is done when it has barely started.

A useful audit report has four sections. An executive summary that answers the question “what is the most important thing we need to fix and why?” A prioritised action list with clear owners and timelines. A technical findings section with enough detail for the developer or technical SEO to act on. And a content and on-page section with specific page-level recommendations.

The executive summary should be written for someone who will not read the rest of the document. It should be clear, specific, and commercial. “Your site has 847 technical issues” is not an executive summary. “Three pages that collectively account for 60% of your organic revenue have indexation problems that can be resolved in under a day” is.

Understanding how Google’s search engine processes and evaluates sites gives useful context for how to frame audit findings for non-technical stakeholders. When you can explain why a finding matters in terms of how Google actually works, the recommendations land differently than when you present them as abstract technical issues.

The design and structure of your site also shapes how audit findings translate into action. HubSpot’s treatment of web design and SEO covers how design decisions affect search performance, which is useful context when your audit findings require design changes rather than purely technical fixes.

If you are working through a broader SEO programme and want to see how the audit connects to the rest of the strategy, the Complete SEO Strategy Hub covers each component in detail. The audit is the diagnostic layer. What you do with the findings determines whether it was worth doing.

About the Author

Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.

Frequently Asked Questions

How long does a website audit take?
For a small site of under 500 pages, a thorough audit covering technical, on-page, and content dimensions typically takes two to four days. For larger sites, particularly e-commerce sites with thousands of URLs, a comprehensive audit can take two to three weeks. The crawl itself is fast. The analysis, prioritisation, and report production are where the time goes.
What is the difference between a technical SEO audit and a full website audit?
A technical SEO audit focuses specifically on the infrastructure layer: crawlability, indexability, site speed, redirect chains, and structured data. A full website audit includes the technical layer but also covers on-page optimisation, content quality, keyword alignment, and backlink profile. For most sites, a technical audit alone will not identify the most significant ranking opportunities.
Can I run a website audit myself or do I need an agency?
You can run a meaningful audit yourself using Google Search Console, Screaming Frog, and either Ahrefs or Semrush. The tools are accessible and well-documented. The challenge is not running the crawl. It is knowing which findings matter commercially and how to prioritise them. If your site is large, technically complex, or if you have experienced a significant traffic drop, specialist input is worth it. For most small to medium sites, a structured self-audit with the right tools will surface the most important issues.
How do I know if my website audit findings are accurate?
Cross-reference findings across multiple tools. If Screaming Frog flags an issue that does not appear in Google Search Console, investigate before acting. Tools have different crawl behaviours and different ways of classifying issues. Google Search Console data should always take precedence for indexation and coverage questions because it reflects what Google actually sees. Third-party tools are estimates. Search Console is as close to ground truth as you can get.
What should I fix first after a website audit?
Start with anything that is preventing your most commercially important pages from being indexed and ranked correctly. Indexation errors, broken redirects on high-traffic URLs, and cannibalisation issues affecting your primary keyword targets are the highest priority. After those, work through on-page fixes for pages that are ranking on page two or three and have clear optimisation gaps. Save low-traffic, low-impact fixes for last. The order of operations matters more than the comprehensiveness of the list.
