Competition Analysis: A Real Example From Start to Output
A competition analysis example is worth more than any framework. Frameworks tell you what to do. A worked example shows you what you actually find, what it means, and what to do with it. This article walks through a real-world competitive analysis, from the initial brief to the strategic output, so you can see how the process works in practice rather than in theory.
The example draws on a situation I’ve encountered in various forms across two decades of agency work: a mid-sized brand entering a market where two or three established players already own most of the visible real estate. The names are composite, but the data patterns, the decisions, and the missteps are real.
Key Takeaways
- A competition analysis is only useful if it produces a decision. If you finish and don’t know what to do differently, you’ve produced a report, not an insight.
- Start with commercial context, not data collection. Knowing why you’re analysing competitors shapes what you look for and prevents you drowning in irrelevant information.
- Search share and organic visibility are often the most revealing competitive signals because they reflect sustained investment, not a single campaign decision.
- Gaps in competitor content are more actionable than their strengths. You can’t out-spend an incumbent, but you can out-position them in underserved areas.
- The most common failure in competitive analysis is confusing observation with conclusion. What a competitor does is not automatically what you should do.
In This Article
- What Was the Brief?
- Who Were the Competitors?
- What Did the Search Analysis Show?
- What Did the Paid Search Analysis Show?
- What Did the Content Audit Show?
- What Did the UX and Conversion Analysis Show?
- How Did We Turn the Analysis Into a Strategy?
- What Does This Example Tell You About Competitive Analysis in General?
What Was the Brief?
The brand in this example, call them Vantage, was a B2C financial services company preparing to expand into a new product category. They had strong brand recognition in their existing niche but limited visibility in the new space. The commercial question was simple: where should we compete, and what would it take to get traction within 12 months?
That question matters more than it sounds. I’ve seen competitive analysis projects that start without a clear commercial question, and they almost always produce the same output: a 40-slide deck full of screenshots and traffic estimates that nobody acts on. The brief has to define what a useful answer looks like before you collect a single data point.
In this case, the brief had three specific sub-questions. First, who are the dominant players in the new category and what are they doing well? Second, where are the gaps in their coverage that Vantage could plausibly own? Third, what would it cost in paid search to compete for the highest-intent terms, and is that cost commercially viable?
Those three questions shaped everything that followed. If you’re building your own competitive analysis, I’d encourage you to read more broadly about research methodology first. The Market Research and Competitive Intel hub covers the tools, frameworks, and strategic principles that sit behind the kind of worked example you’re reading here.
Who Were the Competitors?
The first step was building the competitor set. This sounds obvious, but it’s where most teams make their first mistake. They default to the brands they already know, which means they start with a list shaped by assumption rather than evidence.
We built the competitor set from three sources. Organic search results for the 20 highest-volume category terms. Paid search results for the same terms. And direct client intelligence, which in this case meant asking the sales team who prospects mentioned when they were shopping around.
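If you want to make that triangulation repeatable, it reduces to a simple tally: which brands appear in how many of the three sources. Here's a minimal sketch in Python, with hypothetical brand names standing in for the real exports:

```python
from collections import Counter

# Hypothetical inputs: the brands that appeared across the category terms
# in organic results, paid results, and sales-team mentions.
sources = {
    "organic": {"BrandA", "BrandB", "BrandC"},
    "paid": {"BrandA", "BrandB", "BrandD", "BrandE"},
    "sales": {"BrandA", "BrandB", "BrandC"},
}

# Count how many of the three sources each brand appears in.
coverage = Counter()
for brands in sources.values():
    coverage.update(brands)

# Brands in all three sources form the primary competitor set;
# paid-only brands are buying visibility without organic authority.
for brand, n in coverage.most_common():
    where = [s for s, brands in sources.items() if brand in brands]
    print(f"{brand}: appears in {n}/3 sources ({', '.join(where)})")
```

The value isn't in the code, it's in the discipline: every brand earns its place in the set through evidence, not assumption.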
The results were instructive. Three brands appeared consistently across all three sources. Two more appeared only in paid search, suggesting they were buying visibility without the organic authority to sustain it. One brand the client had assumed was a major competitor barely appeared at all, which told us something useful about how they’d positioned themselves in the market.
That last finding was worth more than a month of brand tracking. The brand Vantage had been worried about wasn’t actually competing in the same space. They’d moved upstream into enterprise. That changed the competitive map entirely.
What Did the Search Analysis Show?
Search is where I start almost every competitive analysis, because it’s the most honest signal available. Brands can say whatever they like in their positioning. But their search footprint reflects where they’ve actually invested time and money over months or years. You can’t fake organic authority, and you can’t sustain paid presence without commercial intent behind it.
For Vantage, we ran keyword gap analysis across the three primary competitors using Semrush. The output showed that Competitor A dominated informational terms, the “what is” and “how does” queries that sit at the top of the funnel. Competitor B owned most of the high-intent transactional terms. Competitor C had strong visibility in comparison content, the “X vs Y” and “best [product]” formats that capture people who are close to a decision but still evaluating options.
What none of them had done well was the middle layer. The queries that sit between education and transaction, where someone understands the category but is trying to figure out which type of solution fits their situation. These are often the highest-converting queries precisely because the user has done enough research to know what they want, but hasn’t yet been captured by a brand.
That gap became the first strategic recommendation. Vantage should build content that serves the evaluation stage, not the awareness stage. Not “what is this product” but “which version of this product is right for me and why.” That’s a different brief for the content team, and it came directly from the search analysis rather than from gut instinct.
I’ve seen this pattern across industries from retail to professional services. The incumbents tend to cluster around the same content formats because they’re following each other rather than following the user. That clustering creates gaps, and gaps are where a challenger brand can actually build a foothold without needing to outspend anyone.
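The mechanics behind a gap analysis are simpler than the tooling suggests. Stripped back, it's set arithmetic over exported keyword lists. A minimal sketch, assuming you've pulled each domain's ranking keywords (from Semrush or similar) into Python sets; the keywords here are placeholders, not real export data:

```python
# Keywords our domain ranks for, and each competitor's ranking keywords.
our_keywords = {"what is product x", "product x fees"}
competitor_keywords = {
    "competitor_a": {"what is product x", "how does product x work"},
    "competitor_b": {"buy product x", "product x pricing"},
    "competitor_c": {"product x vs product y", "best product x"},
}

# Keywords at least one competitor ranks for that we don't: the gap.
all_competitor_kws = set().union(*competitor_keywords.values())
gap = all_competitor_kws - our_keywords

# Keywords every competitor ranks for: the crowded ground to avoid
# unless you have genuine differentiation.
contested = set.intersection(*competitor_keywords.values())

print(f"Gap keywords to evaluate: {sorted(gap)}")
print(f"Keywords all competitors contest: {sorted(contested)}")
```

The tools add volume, difficulty, and intent data on top, but the underlying question is always this one: where do they show up that you don't, and where does everyone show up at once?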
What Did the Paid Search Analysis Show?
Paid search analysis serves a different purpose from organic. It tells you about commercial intent and short-term competitive pressure rather than long-term investment. It also gives you cost data, which is the thing most competitive analysis frameworks ignore entirely.
I spent several years managing large paid search budgets at iProspect, and the thing that always struck me when I took on a new client was how rarely they knew what their competitors were actually spending or bidding on. They’d have assumptions, often wildly inaccurate ones, and those assumptions were shaping their own bidding strategy.
For Vantage, we pulled estimated CPCs for the top 50 category terms and mapped them against search volume. The picture was stark. The ten highest-volume terms had CPCs that would make the economics unworkable at Vantage’s current conversion rate. Competing there would mean buying traffic at a cost that no reasonable customer lifetime value could justify.
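The viability test behind that judgement is simple arithmetic: divide the CPC by your conversion rate to get an implied cost per acquisition, then compare it against what your customer lifetime value can support. A sketch with illustrative numbers, not Vantage's actual figures:

```python
# Back-of-envelope CPC viability check. All numbers are illustrative.
conversion_rate = 0.02    # site conversion rate: 2%
customer_ltv = 400.0      # customer lifetime value
target_ltv_to_cac = 3.0   # want LTV at least 3x acquisition cost

max_cpa = customer_ltv / target_ltv_to_cac  # max affordable cost per acquisition
max_cpc = max_cpa * conversion_rate         # max affordable cost per click

for term, cpc in [("high-volume head term", 9.50),
                  ("mid-tier evaluation term", 2.10)]:
    implied_cpa = cpc / conversion_rate
    verdict = "viable" if cpc <= max_cpc else "unworkable"
    print(f"{term}: CPC ${cpc:.2f} -> implied CPA ${implied_cpa:.2f} "
          f"(max CPA ${max_cpa:.2f}) -> {verdict}")
```

With these numbers, the maximum affordable CPC is $2.67. The head term implies a $475 cost per acquisition against a $133 ceiling; the mid-tier term comes in at $105 and clears it comfortably. That's the shape of the gap we found.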
But the next tier down, terms with meaningful volume and much lower competition, told a completely different story. Several of these terms had CPCs that were commercially viable, and they aligned with the evaluation-stage content gap we’d already identified in the organic analysis. That alignment matters. When your organic strategy and your paid strategy point at the same audience segment, you’re building something coherent rather than running two disconnected programmes.
The paid analysis also revealed something about competitor behaviour. One of the three primary competitors was bidding aggressively on branded terms for the other two. That’s a signal. It means they’re confident enough in their conversion rate to buy traffic from users who are already in market and already considering alternatives. It also means their cost-per-acquisition on those terms is probably quite efficient, which tells you something about their margins.
What Did the Content Audit Show?
Content auditing is the part of competitive analysis that most teams either skip or do superficially. They look at competitor websites and note what topics they cover. That’s not an audit. An audit looks at what they publish, how it performs, what formats they use, and critically, what they don’t cover and why.
For each of the three primary competitors, we mapped their content against a topic matrix built from the keyword research. We were looking for three things: topics they covered well with strong organic performance, topics they’d attempted but where the content was thin or outdated, and topics they hadn’t touched at all.
The third category is the most interesting. When a well-resourced competitor hasn’t covered a topic, there are usually only a few explanations. Either the topic is too niche to justify the investment at their scale, or it doesn’t fit their brand positioning, or they’ve tried it and it didn’t perform. Each of those explanations has different implications for whether you should pursue it.
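To keep that classification honest, we scored coverage rather than relying on impressions from browsing. A minimal sketch of the bucketing logic, with hypothetical topics and scores; in practice the scores come from your keyword research plus a manual content review:

```python
# topic: {competitor: coverage score (0 = untouched, 1 = thin, 2 = strong)}
topic_scores = {
    "category basics": {"A": 2, "B": 1, "C": 2},
    "choosing between variants": {"A": 1, "B": 0, "C": 1},
    "niche use case": {"A": 0, "B": 0, "C": 0},
}

for topic, scores in topic_scores.items():
    if all(s == 0 for s in scores.values()):
        bucket = "untouched by everyone: investigate why before investing"
    elif max(scores.values()) < 2:
        bucket = "only thin coverage anywhere: strong opportunity"
    else:
        bucket = "well covered: compete only with genuine differentiation"
    print(f"{topic}: {bucket}")
```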
In Vantage’s case, there was a cluster of topics related to a specific use case that all three competitors had ignored. The search volume wasn’t enormous, but it was consistent, and the user intent was extremely high. People searching those terms were not browsing. They had a specific problem and were looking for a specific solution. Vantage had a product that addressed that problem directly. None of their competitors did, which is presumably why none of them had invested in the content.
That became the second strategic recommendation. Build a content cluster around this use case before anyone else does. Own the terms, build the authority, and use that position to capture a segment of the market that the incumbents have structurally ignored.
What Did the UX and Conversion Analysis Show?
Competitive analysis usually stops at traffic and content. That’s a mistake, because the conversion layer is where competitive advantage is often most visible and least discussed.
We ran a structured review of each competitor’s conversion experience, from landing page to sign-up or purchase, documenting the number of steps, the information they asked for, the trust signals they used, and the friction points we could identify. This isn’t an exact science. You can’t see their conversion rates. But you can make informed judgements about where their journeys are likely to lose people, and you can compare those judgements against your own data.
Competitor A had an experience that required more information upfront than was necessary. The form was long, the value proposition wasn’t reinforced at the point of commitment, and the mobile experience was noticeably worse than desktop. Competitor B had a cleaner experience but weak social proof. Competitor C had the best overall UX but a confusing pricing structure that introduced uncertainty at exactly the wrong moment.
None of these are catastrophic. These are well-run businesses with presumably acceptable conversion rates. But they represent specific places where Vantage could build an advantage without needing to compete on price or brand awareness. A shorter sign-up flow, stronger trust signals at the point of commitment, and a pricing structure that’s easier to understand. Those are execution decisions, not strategy decisions, but they’re informed by competitive analysis rather than internal preference.
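If you run this kind of review yourself, the value comes from documenting every journey against the same fields, so the comparisons hold up later. A minimal sketch of one way to structure the records, with illustrative values rather than the engagement's actual findings:

```python
from dataclasses import dataclass, field

# A simple record for documenting each competitor journey consistently.
@dataclass
class JourneyReview:
    competitor: str
    signup_steps: int
    fields_requested: int
    trust_signals: list = field(default_factory=list)
    friction_points: list = field(default_factory=list)

reviews = [
    JourneyReview("Competitor A", signup_steps=6, fields_requested=14,
                  trust_signals=["security badge"],
                  friction_points=["long form", "weak mobile experience"]),
    JourneyReview("Competitor B", signup_steps=4, fields_requested=8,
                  trust_signals=[],
                  friction_points=["weak social proof"]),
]

# Rank journeys by rough friction: more steps and fields, more drop-off risk.
for r in sorted(reviews, key=lambda r: r.signup_steps + r.fields_requested):
    print(f"{r.competitor}: {r.signup_steps} steps, "
          f"{r.fields_requested} fields, "
          f"friction: {', '.join(r.friction_points) or 'none noted'}")
```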
Understanding how users actually behave on sites like these is something tools like Hotjar can help with on your own properties, giving you behavioural data that you can then benchmark against what you observe in competitor journeys.
How Did We Turn the Analysis Into a Strategy?
This is the step that separates useful competitive analysis from expensive decoration. Everything up to this point is observation. Strategy is what you decide to do with it.
For Vantage, the analysis produced three clear strategic priorities. First, build content authority in the evaluation stage, where all three competitors were underinvested and where Vantage’s product had genuine differentiation to communicate. Second, run paid search against the mid-tier terms where CPCs were commercially viable, rather than chasing the high-volume terms where the economics didn’t work. Third, invest in the conversion experience before scaling traffic, because bringing more users into an experience that leaks at the same rate as the competitors’ isn’t a competitive advantage.
These priorities were sequenced. Content authority takes time to build, so that work starts immediately. Paid search runs in parallel but with a tightly controlled budget until the conversion experience is improved. Conversion optimisation is the gate that unlocks the paid scale-up.
I’ve found that sequencing is one of the most undervalued parts of strategy. Anyone can produce a list of things to do. Knowing which order to do them in, and why that order matters commercially, is what separates a strategy from a to-do list. Good content strategy thinking, including how to structure and sequence content investment, is something the team at Unbounce has explored thoughtfully in the context of conversion and content alignment.
The final output wasn’t a 40-slide deck. It was a two-page strategic brief with three priorities, a sequencing rationale, and a set of metrics that would tell us within 90 days whether the approach was working. That’s the format I’ve used for most of my career. Long documents get read once. Short ones get used repeatedly.
What Does This Example Tell You About Competitive Analysis in General?
A few things stand out when I look back at this example and compare it to the dozens of competitive analyses I’ve been involved in over the years.
The brief matters more than the tools. We used standard tools throughout this process. Nothing exotic, nothing expensive. The quality of the output came from asking the right questions at the start, not from having access to proprietary data.
Gaps are more valuable than benchmarks. Most competitive analysis is framed as benchmarking: how do we compare? That’s a useful question, but it’s a defensive question. The more useful question is: where are they not competing, and why? That’s where growth comes from.
Commercial viability has to be part of the analysis. I’ve seen competitive analyses that identify opportunities without ever asking whether pursuing those opportunities is economically sensible. Knowing that a competitor ranks for a term is useful. Knowing whether you can afford to compete for that term, and whether winning it would generate enough return to justify the cost, is what makes the analysis actionable.
And finally: don’t copy what competitors are doing. This sounds obvious, but it’s the most common failure mode I see. Teams run a competitive analysis, see what the market leader is doing, and then try to replicate it. That’s a strategy for being permanently second. The point of understanding what competitors are doing is to find where they’re not doing it well enough, and to be different there, not to be a slightly inferior version of the same thing.
When Forrester looked at the marketing skills gap, one of the recurring themes was the gap between analytical capability and strategic application. Marketers can collect data. The harder skill is knowing what to do with it. That’s what competitive analysis is really testing.
If this example has prompted you to think more carefully about how your own competitive intelligence programme is structured, the broader Market Research and Competitive Intel hub covers the full landscape, from which tools to use, to how to build a monitoring programme that actually surfaces useful signals rather than noise.
About the Author
Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.
