Website Content Audit: What to Keep, Cut, and Fix
A website content audit is a structured review of every page on your site, assessed against performance data, business objectives, and audience relevance. Done properly, it tells you which content is earning its place, which is quietly dragging down your organic performance, and which gaps are costing you pipeline.
Most marketing teams treat the audit as a one-off spring clean. That framing undersells it. A content audit is one of the highest-leverage strategic activities you can run, because it forces you to make decisions based on evidence rather than editorial instinct or internal politics.
Key Takeaways
- A content audit is not a tidying exercise. It is a commercial decision-making process that should connect directly to traffic, conversion, and revenue goals.
- The four audit outcomes (keep, consolidate, update, remove) each require different criteria. Applying the wrong one to a page is as costly as ignoring the page entirely.
- Thin content and cannibalisation are the two most common issues found in audits of established sites, and both are fixable with the right process.
- Audit cadence matters as much as audit depth. A quarterly light-touch review outperforms an annual deep dive that produces a spreadsheet no one acts on.
- The audit is only as useful as the brief that precedes it. Without clear business objectives, you will optimise for the wrong signals.
In This Article
- Why Most Content Audits Produce a Spreadsheet, Not a Strategy
- What to Include in a Website Content Audit
- The Four Decisions: Keep, Consolidate, Update, Remove
- How to Prioritise What to Act On First
- The Role of Search Intent in Audit Decisions
- Cannibalisation: The Problem Most Sites Do Not Know They Have
- How Often Should You Run a Content Audit?
- Building the Audit Into Your Editorial Process
- The Metrics That Actually Matter in a Content Audit
Why Most Content Audits Produce a Spreadsheet, Not a Strategy
I have sat in enough agency review meetings to know the pattern. Someone exports a crawl from Screaming Frog, pulls traffic data from Google Analytics, pastes it into a shared spreadsheet, colour-codes the rows, and presents it as an audit. The spreadsheet is accurate. The problem is that it answers the wrong question.
Data without a decision framework is just inventory. What you need from a content audit is not a list of pages and their metrics. You need a set of prioritised actions tied to business outcomes. That distinction sounds obvious, but the majority of audits I have reviewed, both in-house and from agencies, stop at the inventory stage and call it done.
The reason this happens is usually one of two things. Either the audit was commissioned without a clear brief, so the person running it defaults to what is measurable rather than what is meaningful. Or the brief existed but was too vague, something like “we want to improve our content,” which gives you no basis for making hard calls about what stays and what goes.
Before you pull a single URL, you need to answer three questions. What are the commercial objectives this content is supposed to support? Which audience segments does it need to serve? And what does success look like in measurable terms? If you cannot answer those three questions with specificity, the audit will drift toward comfort rather than clarity.
The broader discipline of content strategy, including how audits connect to editorial planning, distribution, and measurement, is covered in the Content Strategy and Editorial hub on The Marketing Juice. That context is worth having before you start pulling data.
What to Include in a Website Content Audit
Scope is the first decision, and it is worth being deliberate about it. A full-site audit of a 2,000-page website is a different undertaking from a targeted audit of your blog archive or your product pages. Both are legitimate, but they serve different purposes and require different resource commitments.
For most B2B and mid-market sites, a full content audit should cover every indexable page: blog posts, landing pages, product and service pages, case studies, resource pages, and any ungated content that search engines can reach. Gated content, internal documents, and pages blocked by robots.txt can be reviewed separately if there is a specific reason to do so.
For each page, you want to capture a consistent set of data points. On the technical side: URL, title tag, meta description, word count, canonical status, index status, and any crawl errors. On the performance side: organic sessions over a defined period, rankings for primary target keywords, backlink count, and conversion events if tracked at page level. On the editorial side: topic, content type, target audience, funnel stage, and date of last update.
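If it helps to make the inventory concrete, the per-page record described above can be sketched as a simple data structure. This is a minimal illustration, not a standard schema; the field names and types are my own assumptions, and you would adapt them to whatever your crawl and analytics exports actually produce.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AuditRecord:
    """One row of the content audit inventory. Field names are illustrative."""
    # Technical data points
    url: str
    title_tag: str
    meta_description: str
    word_count: int
    canonical_ok: bool
    indexed: bool
    crawl_errors: list = field(default_factory=list)
    # Performance data points
    organic_sessions: int = 0                    # over the defined period
    primary_keyword_rank: Optional[int] = None   # None if not ranking
    backlinks: int = 0
    conversions: int = 0                         # page-level events, if tracked
    # Editorial data points
    topic: str = ""
    content_type: str = ""
    target_audience: str = ""
    funnel_stage: str = ""
    last_updated: str = ""                       # ISO date, e.g. "2023-06-01"
```

Holding every page to the same record shape is what makes the later decision stage possible: you can only compare pages consistently if they were captured consistently.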
The SEMrush team have published a thorough walkthrough of the technical process of running a content audit, including how to structure your data collection and which tools to use at each stage. It is a practical reference if you are building the process from scratch.
One thing I would add from experience: do not let the data collection phase expand to fill the available time. I have seen teams spend six weeks gathering data for an audit that needed two weeks of analysis and four weeks of action. Set a data collection deadline and hold to it. Imperfect data acted on quickly beats perfect data that arrives too late to influence anything.
The Four Decisions: Keep, Consolidate, Update, Remove
Every page in your audit should end up in one of four buckets. The decision criteria for each bucket are different, and conflating them is where most audits go wrong.
Keep
Pages that are performing well against their stated objectives and are current, accurate, and well-structured. These pages need monitoring, not intervention. The temptation to tinker with high-performing content is real, particularly in teams that measure activity rather than outcomes. Resist it. If a page is ranking, converting, and earning links, your job is to protect it, not redesign it because someone wants to refresh the brand voice.
Consolidate
Pages that cover overlapping topics and are cannibalising each other in search. This is one of the most common issues I encounter on sites that have been publishing content for three or more years without a formal content taxonomy. You end up with four blog posts targeting variations of the same keyword, none of which ranks particularly well because Google cannot determine which one to prioritise.
The fix is to merge the best elements of the overlapping pages into a single, authoritative piece, redirect the others to it, and update internal linking to point to the consolidated URL. Done properly, this typically produces a measurable improvement in rankings within two to three months. Done poorly, it produces a longer page that is still unclear about what it is trying to say.
Update
Pages with genuine search demand and a clear topic focus, but whose content is outdated, thin, or structurally weak. These are your highest-return opportunities. You are not starting from scratch; you are improving something that already has some equity. Prioritise pages with existing rankings in positions 5 to 20, where a meaningful content improvement has a realistic chance of moving the needle.
When updating, do not just add words. Revisit the search intent behind the target keyword. If the intent has shifted since the page was first written, the structure of the page may need to change, not just the content within it. Moz have written clearly about aligning content goals to measurable KPIs, which is a useful lens to apply when deciding how far to take an update.
Remove
Pages with no traffic, no backlinks, no ranking potential, and no strategic purpose. These pages are not neutral. Thin or low-quality content can dilute the overall quality signals of your domain, and a large volume of it can waste crawl budget on larger sites. Removing them, with appropriate redirects where relevant, is a legitimate optimisation.
The removal decision is the one that generates the most internal resistance, usually from whoever commissioned or wrote the content in question. My approach has always been to make the decision criteria explicit before the audit begins, so that the call to remove a page is a function of the agreed framework rather than a personal judgement. That removes most of the friction.
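Making the decision criteria explicit, as I recommend above, also means they can be written down as rules. The sketch below is one illustrative way to encode the four buckets; the thresholds (positions 5 to 20 for updates, zero equity for removal) follow the criteria described in this section, but they are defaults to tune against your own traffic profile, not benchmarks.

```python
def audit_decision(page, overlaps_with_another_page):
    """Assign one of the four audit buckets to a page.

    `page` is a dict with 'organic_sessions', 'backlinks', and
    'primary_keyword_rank' (None if not ranking). Thresholds are
    illustrative, not industry benchmarks.
    """
    # Consolidate: the page competes with another page for the same query.
    if overlaps_with_another_page:
        return "consolidate"

    # Remove: no traffic, no links, no rankings, no equity to protect.
    no_equity = (page["organic_sessions"] == 0
                 and page["backlinks"] == 0
                 and page["primary_keyword_rank"] is None)
    if no_equity:
        return "remove"

    # Update: striking-distance rankings where improvement can move the needle.
    rank = page["primary_keyword_rank"]
    if rank is not None and 5 <= rank <= 20:
        return "update"

    # Keep: performing against its objectives; monitor, do not tinker.
    return "keep"
```

A rule set like this does not replace editorial judgement, but it makes the first pass fast and, more importantly, makes the removal call a function of the agreed framework rather than a personal verdict.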
How to Prioritise What to Act On First
An audit of a mature site will typically surface more actions than any team can reasonably execute in a quarter. Prioritisation is not optional; it is where the strategic thinking happens.
I use a simple two-axis framework: commercial impact against execution effort. High impact, low effort actions go first. These are typically the consolidation and update decisions on pages that already have some ranking traction. Low impact, high effort actions go last or get deprioritised entirely, regardless of how interesting the content brief looks.
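The two-axis framework reduces to a simple sort once impact and effort are scored. The sketch below assumes a 1-to-5 score on each axis, which is my own convention for illustration; the point is only that high impact breaks ties first, then low effort.

```python
def prioritise(actions):
    """Rank audit actions: highest impact first, then lowest effort.

    Each action is a dict with 'impact' and 'effort' scored 1-5.
    The scoring scale is an illustrative assumption.
    """
    return sorted(actions, key=lambda a: (-a["impact"], a["effort"]))

# Example backlog (hypothetical pages)
backlog = [
    {"page": "/blog/old-guide", "impact": 2, "effort": 4},
    {"page": "/pricing", "impact": 5, "effort": 2},
    {"page": "/blog/ranking-post", "impact": 5, "effort": 1},
]
```

Running `prioritise(backlog)` puts the high-impact, low-effort consolidation and update work at the top, which is exactly where a resource-constrained team should start.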
Within that framework, I weight commercial proximity heavily. A page that sits one click from a conversion event gets more attention than a top-of-funnel awareness piece with similar traffic. That is not a universal rule, but it is a useful default for teams working under resource constraints, which is most teams.
Early in my agency career, I inherited a client content programme that had produced over 400 blog posts in three years without a single piece of structured measurement. The client was frustrated that organic traffic had plateaued. When we ran the audit, we found that 60% of the posts had received fewer than 50 sessions in the previous 12 months, and 30 posts were accounting for over 80% of total organic traffic. We consolidated and removed aggressively, redirected the equity, and rebuilt the editorial calendar around the topics that were actually driving commercial traffic. Within six months, organic sessions were up and the team was producing half the volume of content they had been before. Less content, better decisions, better results.
The Content Marketing Institute have a useful framework for thinking about measurement and content performance that is worth reviewing when you are setting your prioritisation criteria.
The Role of Search Intent in Audit Decisions
Traffic data alone will mislead you. A page with 2,000 monthly sessions from informational queries is not equivalent to a page with 200 sessions from high-intent commercial queries. The audit needs to account for intent, not just volume.
For each page you are evaluating, ask what the person searching for the primary keyword actually wants to do. Are they researching a topic? Comparing options? Ready to buy or enquire? A page targeting an informational query that converts at 0.1% is not underperforming; it is doing exactly what informational content should do. Judging it by the same conversion benchmark as a product page is a category error.
Where intent analysis becomes particularly useful in an audit is in identifying pages that are attracting the wrong audience. I have seen service pages ranking well for informational queries because the content was written to explain the category rather than to convert a buyer. The traffic looked healthy in the dashboard. The pipeline contribution was negligible. Rewriting those pages with a clearer commercial intent, and moving the explanatory content into a separate resource, produced a meaningful shift in conversion rate without any change to the technical setup.
Copyblogger have written a clear piece on the relationship between SEO and content marketing that is worth reading if you are working through how intent should shape your content decisions at a structural level.
Cannibalisation: The Problem Most Sites Do Not Know They Have
Keyword cannibalisation happens when multiple pages on your site compete for the same search query. Google has to choose which page to rank, and it often gets it wrong, or rotates between them, which produces unstable rankings and diluted authority.
The audit is your opportunity to map this systematically. For each target keyword in your content inventory, identify how many pages are targeting it or closely related variants. Where you find more than one, you have a decision to make: consolidate into a single authoritative page, or differentiate the intent clearly enough that Google treats them as distinct.
The consolidation route is usually cleaner. The differentiation route requires genuine editorial discipline to execute, because the natural tendency is to produce pages that look different on the surface but are fundamentally competing for the same query. If you cannot articulate in one sentence why these two pages serve different user needs, they probably need to be merged.
Moz have a good walkthrough of content strategy diversification that touches on how to structure your content so that different pieces serve genuinely different purposes rather than accidentally competing with each other.
How Often Should You Run a Content Audit?
The full-site audit is not a monthly activity. For most organisations, a comprehensive audit once a year is appropriate, with a lighter quarterly review of your highest-traffic and highest-value pages in between.
The quarterly review does not need to be a full crawl and data export. It can be a structured check of your top 20 to 30 pages against their target metrics, a scan for any pages that have dropped significantly in rankings, and a review of any new content published in the previous quarter to confirm it is indexed and performing as expected.
The trigger for an unscheduled audit is a meaningful drop in organic traffic that is not explained by a known algorithm update or seasonal pattern. If your traffic drops 20% and you cannot attribute it to an external cause, an audit of your top-performing pages is the right first step. You are looking for pages that have lost rankings, new cannibalisation issues from recently published content, or technical problems that have emerged since your last crawl.
One thing I have learned from running audits across a wide range of sectors is that the cadence matters less than the commitment to act on findings. I have seen organisations run quarterly audits and ignore 90% of the recommendations because the team was too stretched to execute. I have also seen organisations run a single well-structured annual audit and systematically work through every action item over 12 months. The second approach produces better outcomes, consistently.
Building the Audit Into Your Editorial Process
The most effective content teams I have worked with treat the audit as a standing input to their editorial planning, not a separate project that runs in parallel. Before commissioning new content, they check whether the topic is already covered, whether an existing page could be updated to address the need, and whether the new piece would risk cannibalising something that is already ranking.
That discipline requires a content inventory that is kept reasonably current, which means updating it whenever new content is published and flagging any pages that are removed or redirected. It is not a heavy administrative burden if it is built into the workflow from the start. It is a significant burden if you let the inventory drift and then try to reconcile it every 12 months.
The HubSpot resource on content distribution strategy is worth reviewing in this context, because distribution planning and audit findings should inform each other. If a piece of content is not performing, the problem is sometimes the content itself, and sometimes it is that the content was never adequately distributed or promoted after publication.
When I was scaling the content operation at iProspect, we introduced a simple rule: no new content brief could be approved without a check against the existing inventory. It sounds obvious. It was not standard practice. The result was fewer pieces of content produced, but a higher proportion of them performing against their objectives, because we were not constantly diluting our own authority by publishing overlapping material.
If you want a broader view of how content audits connect to editorial planning, measurement, and content operations, the Content Strategy and Editorial section of The Marketing Juice covers the full picture across strategy, execution, and commercial alignment.
The Metrics That Actually Matter in a Content Audit
Not all metrics deserve equal weight in an audit. The ones that matter most are the ones that connect most directly to your business objectives. For most organisations, that means organic sessions, keyword rankings, conversion events, and backlink equity. Everything else is context.
Engagement metrics like time on page and scroll depth have a role, but they are supporting evidence rather than primary decision signals. A page with high time on page and zero conversions is not necessarily a success. A page with low time on page and a high conversion rate is not necessarily a failure. Read the metrics in combination, not in isolation.
One metric that is consistently underused in content audits is assisted conversions. A page may not be the last touchpoint before a conversion, but if it consistently appears in the conversion path, removing or significantly altering it could have downstream effects that are not visible in last-click attribution. If your analytics setup allows you to see assisted conversion data at the page level, include it in your audit framework.
I judged the Effie Awards for several years, and one thing that consistently separated the stronger entries from the weaker ones was the ability to connect content and campaign activity to commercial outcomes rather than just engagement metrics. The same discipline applies to a content audit. If you cannot draw a line from a page to a business outcome, you need to either establish that line or question whether the page should exist at all.
The Content Marketing Institute’s content marketing resources include a range of measurement frameworks that can help you build a more commercially grounded approach to evaluating content performance.
About the Author
Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.
