Forrester Martech: What the Data Says About Where Teams Are Going Wrong

Forrester’s martech research has consistently pointed to the same uncomfortable truth: most marketing teams are not getting meaningful returns from their technology investments. The tools are there, the budgets have been spent, but the operational reality inside most organisations is fragmented stacks, low adoption rates, and a growing gap between what the software promises and what it actually delivers.

That gap is not a technology problem. It is a strategy problem, and Forrester’s analysts have been saying so for years.

Key Takeaways

  • Forrester’s martech research consistently identifies low utilisation, not lack of tools, as the primary failure point for marketing teams.
  • The average marketing stack has grown significantly over the past decade, but capability and complexity are not the same thing.
  • Martech value is determined by how well tools connect to business outcomes, not by how sophisticated they appear on a vendor slide.
  • Organisational structure and internal skills gaps are more likely to limit martech ROI than the tools themselves.
  • The consolidation trend Forrester has tracked is a correction, not a crisis: teams that rationalise their stacks tend to perform better than those that keep accumulating.

What Forrester Actually Measures in Martech

Forrester approaches martech differently from most analyst firms. Where some focus heavily on vendor capability rankings, Forrester has built a body of research around how organisations buy, deploy, and fail to use marketing technology. Their work covers the full lifecycle: procurement decisions, integration challenges, skills requirements, and the business outcomes that may or may not follow.

Their Wave reports on marketing automation, customer data platforms, and digital experience platforms are widely used by procurement teams and CMOs evaluating vendors. But the more instructive research, in my view, is the operational work: the surveys and analyses that ask whether marketing technology is actually being used to its potential, and whether the investment is connecting to revenue.

The answer, repeatedly, is that it is not. Not because the tools are bad, but because the surrounding conditions are not in place. The skills are thin, the processes are undefined, and the integration between martech and the broader business is loose at best.

If you are thinking carefully about how your team operates around its tools, the broader marketing operations hub is worth working through. The Forrester findings connect directly to the operational decisions most teams are grappling with right now.

The Utilisation Problem Nobody Wants to Admit

There is a number that comes up repeatedly in Forrester’s martech research: the proportion of purchased martech capability that marketing teams actually use. The figure is consistently low, often dramatically so. Teams are paying for platforms they are using at a fraction of capacity, renewing contracts on tools that have become shelfware, and adding new software on top of existing software that was never properly embedded.

I have seen this pattern up close. When I was growing an agency from around 20 people to over 100, we went through a period where the answer to every operational problem seemed to be a new tool. Analytics platform not giving us what we needed? Add another layer. Campaign management getting complicated? Buy a solution. What we ended up with was a stack that required three people to maintain and that nobody fully understood. The irony was that the tools we used most effectively were the ones we had invested time in properly, not the ones with the longest feature lists.

Forrester’s research frames this as a maturity issue. Organisations at lower maturity levels tend to accumulate tools reactively, responding to vendor pitches and internal pressure rather than building from a clear operational model. Higher maturity organisations start from the process and work backwards to the technology.

The distinction sounds obvious. In practice, it is genuinely difficult to hold the line when a vendor is demonstrating a compelling product and your team is asking why you do not have it yet.

Where Forrester’s Vendor Assessments Get Interesting

Forrester’s Wave methodology scores vendors across a defined set of criteria: current offering, strategy, and market presence. For marketing teams, the Wave reports on categories like marketing automation platforms, B2B customer data platforms, and real-time interaction management are reference points during procurement.

What makes these reports useful is not the quadrant position. It is the criteria breakdown. Forrester tends to weight operational factors, integration capability, and customer success infrastructure heavily, which pushes back against the tendency to buy on feature count alone. A vendor with a strong current offering score but a weak strategy score is telling you something about where the product is heading. A strong strategy score with thin market presence tells you something else.

The limitation of any analyst report, Forrester included, is that it reflects a point in time and a generalised buyer profile. A mid-market B2B team has different requirements from an enterprise retail operation, and the Wave does not always surface those differences with enough granularity. I have seen teams make poor procurement decisions by treating a Forrester Wave as a buying guide rather than a starting point for their own evaluation.

The smarter approach is to use the Wave to understand the competitive landscape and the key differentiators, then do your own structured evaluation against your specific use cases. Forrester’s criteria are a good checklist. They are not a substitute for knowing what you actually need.

The Stack Consolidation Trend Forrester Has Been Tracking

Forrester has documented a clear shift in how marketing organisations are approaching their stacks. After a decade of expansion, many teams are now in consolidation mode: reducing vendor count, renegotiating contracts, and trying to get more out of fewer platforms.

This is partly a budget response. When marketing budgets come under pressure, the sprawling stack becomes a liability. Procurement teams start asking hard questions about what each tool is actually delivering, and the answers are often uncomfortable. Tools that were acquired during periods of growth and optimism look different when someone is running a line-by-line cost review.

But consolidation is also a maturity response. Teams that have been through the accumulation phase and come out the other side tend to have a clearer view of what they actually need. They have learned, usually the hard way, that integration complexity grows faster than the value of adding another tool. They have also learned that vendor relationships require ongoing investment: training, support, configuration, and change management. Every tool you add is a commitment, not just a purchase.

Forrester’s research suggests that consolidation, done well, does not mean capability reduction. Teams that rationalise thoughtfully, keeping the tools that are genuinely embedded and removing the ones that are not, tend to end up with better operational coherence. The stack gets smaller but more effective.

The risk is consolidation done badly: cutting tools because they are expensive without understanding what workflows depend on them, or consolidating onto a single vendor platform for commercial reasons rather than operational ones. Platform lock-in is a real cost that does not always show up in the initial business case.

What Forrester Says About AI in the Martech Stack

Forrester has published extensively on AI in marketing, and the picture is more nuanced than the vendor community would have you believe. The research acknowledges genuine capability improvements in areas like content generation, predictive analytics, and audience segmentation. It also identifies a persistent gap between AI capability and AI deployment: most marketing teams are not set up to use AI tools effectively, even when those tools are technically available to them.

The barriers Forrester identifies are familiar: data quality issues, skills gaps, unclear ownership, and the absence of a clear use case framework. Teams are being sold AI features as part of existing platform upgrades and are not sure what to do with them. The feature exists, but the process to use it does not.

I judged the Effie Awards, and one thing that experience reinforced was how rarely technology is the differentiating factor in effective marketing. The campaigns that stand out tend to be built on a clear understanding of the audience, a sharp creative idea, and disciplined execution. The technology is in service of that. When AI shows up in effective work, it is usually doing something specific and well-defined, not being deployed as a general solution to a vague problem.

Forrester’s framing on AI is broadly consistent with that observation. The value comes from identifying specific tasks where AI genuinely improves speed or quality, not from adopting AI as a strategic posture. The teams getting real returns from AI in their martech stacks tend to be the ones that have defined the use case precisely before buying the tool.

The Skills Gap That Forrester Keeps Coming Back To

Across multiple research streams, Forrester returns to the same structural problem: marketing technology has outpaced the skills available to deploy it. The platforms have become more sophisticated faster than marketing teams have developed the capability to use them.

This is not a new observation, but it is one that the industry has been slow to act on. There is a tendency to treat martech as a procurement problem when it is fundamentally a people problem. You can buy the best customer data platform on the market, but if nobody in your team understands data modelling, audience segmentation logic, or how to connect platform outputs to campaign decisions, the investment will underperform.

Early in my career, I asked for budget to build a new website and was told no. So I taught myself to code and built it. That is not a story about resourcefulness for its own sake. It is a story about the gap between what organisations think they need (a vendor) and what they actually need (someone who understands the problem well enough to solve it). The same logic applies to martech. The tool is not the answer. The capability to use it is.

Forrester’s research points to a few practical responses to the skills gap: investing in training rather than just procurement, building hybrid roles that combine marketing and technical skills, and being honest about what your team can realistically operate before committing to a platform. The structure of your marketing team has a direct bearing on which tools you can actually use effectively, and that conversation needs to happen before the vendor evaluation, not after.

How Forrester Thinks About Martech ROI

One of the more useful contributions Forrester has made to the martech conversation is pushing back on the idea that ROI from marketing technology is straightforward to measure. It is not. The value of a CRM, a marketing automation platform, or a customer data platform is distributed across multiple functions and time horizons. Attributing a specific revenue outcome to a specific tool is often not possible with any precision.

What Forrester recommends instead is a more honest accounting: what decisions does this tool enable that we could not make otherwise? What processes does it improve, and what is the commercial value of that improvement? What is the cost of not having it, rather than just the cost of having it?

This framing is more useful than trying to build a precise attribution model for a platform that touches dozens of workflows. It also forces a clearer conversation about what the tool is actually for. If you cannot articulate what decisions it enables, that is a signal worth taking seriously.

The budget planning implications are significant. If you are trying to build a defensible case for martech investment, the structure of your marketing budget needs to account for implementation, training, and ongoing optimisation, not just licence fees. Forrester’s research consistently shows that teams underestimate the total cost of ownership for martech platforms, which is part of why the ROI disappointment is so common.
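To make that total-cost-of-ownership point concrete, here is a minimal sketch of the arithmetic. All figures are hypothetical illustrations, not Forrester data, and the cost categories simply mirror the ones above: licence fees, implementation, training, and ongoing optimisation.

```python
# Hypothetical total-cost-of-ownership sketch for a martech platform.
# All figures are illustrative assumptions, not Forrester benchmarks.

def total_cost_of_ownership(licence, implementation, training,
                            optimisation_per_year, years):
    """Implementation is a one-off cost; the rest recur annually."""
    return implementation + years * (licence + training + optimisation_per_year)

# A platform with a 50k annual licence costs far more than the licence alone.
tco = total_cost_of_ownership(
    licence=50_000,                # annual licence fee
    implementation=30_000,         # one-off setup and integration
    training=10_000,               # annual team enablement
    optimisation_per_year=15_000,  # ongoing configuration and support
    years=3,
)
print(tco)  # 255000 over three years, versus 150000 in licence fees alone
```

The point of even a rough model like this is that the non-licence lines often rival or exceed the licence itself, which is exactly the gap Forrester says teams underestimate.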

What Forrester’s Research Means for How You Run Your Stack

If you take Forrester’s body of martech research seriously, a few practical implications follow.

First, audit what you have before you buy anything new. Not a superficial audit, but a genuine assessment of which tools are embedded in real workflows, which are being used at a fraction of their capacity, and which have become shelfware. The answer will probably be uncomfortable, but it is the only honest starting point.
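One way to make that audit concrete is a simple classification pass over your stack. The tool names, costs, and thresholds below are entirely hypothetical; the structure is the point, not the numbers.

```python
# Hypothetical stack-audit sketch: classify each tool by how embedded it is.
# Tool data and the utilisation threshold are illustrative assumptions.

tools = [
    {"name": "CDP",        "annual_cost": 80_000, "workflows": 6, "utilisation": 0.70},
    {"name": "Email tool", "annual_cost": 20_000, "workflows": 4, "utilisation": 0.55},
    {"name": "Heatmaps",   "annual_cost": 12_000, "workflows": 0, "utilisation": 0.05},
]

def classify(tool, min_utilisation=0.25):
    """A tool with no live workflows or negligible usage is a shelfware candidate."""
    if tool["workflows"] == 0 or tool["utilisation"] < min_utilisation:
        return "shelfware candidate"
    return "embedded"

for t in tools:
    print(f'{t["name"]}: {classify(t)} ({t["annual_cost"]:,} per year)')
```

Even a crude pass like this forces the uncomfortable conversation: a shelfware candidate with a five-figure annual cost is a renewal decision, not a line item.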

Second, treat skills as a constraint, not an afterthought. Before any procurement decision, ask whether your team has the capability to use the tool effectively. If the answer is no, the question is whether you can build that capability, not whether the tool is good.

Third, connect every tool to a specific business outcome. Not a vague capability improvement, but a defined outcome: faster campaign deployment, better audience segmentation, reduced cost per acquisition. If you cannot make that connection clearly, the business case is not ready.

Fourth, take integration seriously as a cost. Every tool you add creates integration complexity. Forrester’s research is clear that integration failure is one of the most common reasons martech investments underperform. Budget for it, plan for it, and be sceptical of vendor claims about how easy it will be.

When I was managing significant paid media budgets across multiple clients, the campaigns that worked best were rarely the ones with the most sophisticated technology. They were the ones where the team understood the process end to end, from audience insight through to measurement. Technology accelerated that process. It did not replace it. Forrester’s research, read carefully, points to the same conclusion. The inbound marketing process and the martech stack that supports it are only as good as the strategic clarity behind them.

For a broader view of how these operational decisions connect across the marketing function, the marketing operations section covers the territory in more depth, from team structure through to measurement and budget planning.

About the Author

Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.

Frequently Asked Questions

What does Forrester measure in its martech research?
Forrester’s martech research covers vendor capability assessments through its Wave methodology, as well as operational research into how organisations buy, deploy, and use marketing technology. The operational work, which examines utilisation rates, skills gaps, and business outcomes, tends to be more useful for marketing leaders than the vendor rankings alone.

Why do Forrester’s martech findings consistently show low ROI?
Forrester’s research points to a combination of low tool utilisation, skills gaps, poor integration between platforms, and a disconnect between technology investment and defined business outcomes. The problem is rarely the quality of the tools. It is the absence of the operational conditions needed to use them effectively.

How should marketing teams use Forrester Wave reports when evaluating martech vendors?
Forrester Wave reports are best used as a starting point for vendor evaluation, not a buying guide. The criteria breakdown is more useful than the quadrant position. Teams should use the Wave to understand competitive differentiation and key capability areas, then run their own structured evaluation against their specific use cases and operational context.

What does Forrester say about AI in martech?
Forrester acknowledges genuine AI capability improvements in areas like content generation, predictive analytics, and segmentation, but consistently identifies a gap between what AI tools can do and what marketing teams are set up to use. The research suggests that value comes from defining specific, well-scoped use cases rather than adopting AI as a broad strategic posture.

Is martech consolidation a sign that the market is contracting?
Not necessarily. Forrester frames consolidation as a maturity response rather than a market contraction. Teams that have been through a period of stack accumulation are rationalising to improve operational coherence and reduce integration complexity. Done thoughtfully, consolidation tends to improve martech effectiveness rather than reduce capability.
