Forrester CX Index: What It Measures and Why It Matters

The Forrester Customer Experience Index is an annual benchmarking study that scores how well large brands deliver experiences that are effective, easy, and emotionally resonant for their customers. It draws on direct consumer surveys across dozens of industries, making it one of the more rigorous external benchmarks available to senior marketers who want to understand how their brand’s experience compares to competitors, not just to its own historical performance.

What separates it from most CX measurement is the emphasis on emotion. Forrester’s framework treats emotional quality as the single strongest driver of customer loyalty, which puts it in direct tension with the efficiency metrics most operations teams actually manage against.

Key Takeaways

  • The Forrester CX Index scores brands on effectiveness, ease, and emotion, with emotional quality consistently proving the strongest predictor of loyalty in their framework.
  • Most brands cluster in the “OK” band of the Index, meaning differentiation through experience is genuinely achievable for companies willing to act on the data.
  • Forrester’s research consistently flags that companies over-invest in digital self-service and under-invest in the human interactions that actually build emotional connection.
  • The Index is a lagging indicator. By the time your score drops, the customer behaviour that caused it has already happened. You need leading signals alongside it.
  • CX scores and business outcomes only correlate when the organisation is structured to act on what the data reveals, not just report on it.

What Does the Forrester CX Index Actually Measure?

Forrester’s CX Index is built on a three-part framework. It asks whether an experience was effective (did the customer accomplish what they came to do), easy (how much effort did it require), and emotionally positive (how did the interaction make them feel). Those three dimensions combine into a single score, but the weighting matters. Emotion consistently accounts for a disproportionate share of the variance in loyalty outcomes in Forrester’s modelling.
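To make the weighting idea concrete, here is a minimal sketch of how three dimension scores might roll up into a single weighted score. The dimension names come from Forrester's framework, but the 0-100 scale and the specific weights are invented for illustration; Forrester's actual methodology and weights are proprietary and not reproduced here.

```python
# Hypothetical sketch of a three-dimension CX score.
# The weights and the 0-100 scale are illustrative assumptions,
# NOT Forrester's actual (proprietary) methodology.

def cx_score(effectiveness: float, ease: float, emotion: float) -> float:
    """Combine three 0-100 dimension scores into one weighted score."""
    weights = {"effectiveness": 0.3, "ease": 0.3, "emotion": 0.4}  # assumed
    score = (weights["effectiveness"] * effectiveness
             + weights["ease"] * ease
             + weights["emotion"] * emotion)
    return round(score, 1)

# A smooth-but-cold experience scores below a balanced one
# once emotion carries extra weight.
print(cx_score(80, 80, 50))  # efficient but emotionally flat -> 68.0
print(cx_score(70, 70, 70))  # balanced across all three -> 70.0
```

The point of the toy numbers: a brand that maxes out efficiency but leaves customers cold can still trail a merely adequate brand that gets the emotional dimension right.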

That framing is not accidental. Forrester has published work over the years arguing that emotional connection is something customer service organisations can deliberately build, including through technology choices. The implication is that emotion is not a soft, unmanageable output. It is something organisations can engineer, if they choose to.

The Index covers industries including banking, insurance, retail, airlines, hotels, and utilities. Within each sector, brands are ranked against each other. Scores are categorised broadly from poor through to excellent, though in practice most large brands land in the middle band. That clustering is worth noting because it means the gap between average and genuinely good CX is not as wide as most organisations assume. It is achievable. Most companies just do not have the internal structures to close it.

Why Emotion Scores Higher Than Efficiency

I spent years working with clients who were obsessed with reducing friction. Shorter forms, faster load times, fewer steps to checkout. All of that matters. But I watched companies optimise their way to perfectly frictionless experiences that customers still did not particularly like. The process was smooth. The feeling was cold.

Forrester’s framework captures something that pure efficiency metrics miss. A customer can complete a task quickly and still leave feeling like the company did not care about them. That emotional residue is what drives the decision to renew, recommend, or quietly switch. Ease is a hygiene factor. Emotion is a differentiator.

This is not an argument against operational efficiency. Reducing customer effort is genuinely important, and the right CX tools can surface where friction is costing you at scale. But efficiency without warmth produces experiences that are forgettable at best and alienating at worst. The brands that score consistently well on the Forrester Index tend to be the ones that have figured out how to be both fast and human.

If you want a broader grounding in how CX measurement fits together as a discipline, the Customer Experience hub on The Marketing Juice covers the full landscape, from frameworks to metrics to the commercial case for taking it seriously.

Where Most Brands Go Wrong With the Index

The most common mistake I see is treating the Forrester CX Index as a PR asset rather than a diagnostic tool. Brands celebrate when they move up a few points. They issue press releases. They put the ranking in their investor decks. And then they do almost nothing structurally different to sustain it.

The Index is a lagging indicator by design. It captures how customers felt about their experiences over the past year. By the time your score reflects a problem, the customers who had that problem have already made decisions about whether to stay or leave. If you are using the Index as your primary CX signal, you are managing in the rearview mirror.

The smarter approach is to use it as a calibration point alongside faster-moving internal signals. Your post-interaction surveys, your customer satisfaction tracking, your contact centre data, your repeat purchase patterns. The Forrester Index tells you where you stand in the market. Your internal data tells you why, and what to do about it.
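One way to operationalise a faster-moving signal is a simple trend check on internal survey data: compare the most recent window of scores against the preceding window and flag a material decline long before the annual external benchmark would surface it. The sketch below is illustrative; the 0-10 scale, the four-week window, and the 0.5-point threshold are all invented assumptions, not a prescribed standard.

```python
# Illustrative leading-signal check on weekly post-interaction
# survey averages (0-10 scale). Window size and threshold are
# invented for illustration and should be tuned to your own data.

def flag_decline(weekly_scores: list[float], window: int = 4,
                 drop_threshold: float = 0.5) -> bool:
    """Flag when the recent window's average falls below the
    preceding window's average by more than the threshold."""
    if len(weekly_scores) < 2 * window:
        return False  # not enough history to compare two windows
    recent = sum(weekly_scores[-window:]) / window
    baseline = sum(weekly_scores[-2 * window:-window]) / window
    return (baseline - recent) > drop_threshold

scores = [8.1, 8.0, 8.2, 8.1, 7.9, 7.4, 7.2, 7.0]
print(flag_decline(scores))  # True: the recent four weeks have slipped
```

A check like this does not replace the Index; it gives you an early warning months before the annual score would reflect the same deterioration.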

There is also a structural issue that the Index cannot fix. I have worked with businesses where the CX data was genuinely good, the team understood what it was saying, and nothing changed because the people with budget authority did not see customer experience as their problem. The Index can tell you that your emotional scores are dragging down your overall ranking. It cannot make your CFO care about that.

The Gap Between Insight and Action

When I was running an agency and we were working across thirty-plus industries, one pattern repeated itself regardless of sector. The organisations with the best CX data were not always the ones with the best customer experiences. The ones with the best experiences were the ones where someone senior was personally accountable for acting on what the data said.

Forrester has written about this. Their research consistently points to a gap between what companies measure and what they change. Most large organisations have more CX data than they know what to do with. The problem is not measurement. It is governance. Who owns the score? Who has the authority and the budget to fix the things that are dragging it down? In most businesses, that question does not have a clean answer.

A well-structured customer experience dashboard can help surface the right signals to the right people at the right time. But the dashboard is only as useful as the decision-making process it feeds into. If the data goes into a slide deck that gets presented quarterly to a leadership team that has no CX-specific OKRs, the data is decorative.

The organisations that move their Forrester scores meaningfully over time tend to share a few characteristics. CX is someone’s primary job, not a secondary responsibility. There is a clear feedback loop from customer data to product and service decisions. And the people closest to customers (front-line staff, contact centre agents, account managers) have both the tools and the authority to resolve issues without escalating everything.

What the Forrester Index Reveals About Industry Norms

One of the more useful things about the Index is that it lets you benchmark against your actual competitive set, not some abstract ideal. And when you look at the data by sector, the picture is often uncomfortable.

Industries like airlines and cable providers consistently score near the bottom. Not because they do not know their CX is poor, but because the competitive dynamics of those industries have historically made it possible to retain customers despite poor experiences. When switching costs are high and alternatives are limited, companies can survive mediocre CX. The Forrester Index documents this clearly. It does not fix it.

What the Index does do is create a reference point for what good looks like in your sector. If the best-in-class score in your industry is 72 and you are at 65, you know the ceiling. You also know that closing that gap is achievable because a competitor has already done it. That is a more honest conversation to have internally than “we need to improve customer experience,” which means nothing without a benchmark.

I have sat in enough agency pitches and client strategy sessions to know that most CX improvement programmes start with ambition and no baseline. The Forrester Index provides the baseline. Whether the organisation does anything with it is a different question.

Human Interaction Still Drives Emotional Scores

There is a consistent finding across Forrester’s CX research that most technology-led transformation programmes quietly ignore. Human interactions, when they happen, drive emotional scores more than digital interactions do. A well-handled phone call or a thoughtful email from a real person can move a customer’s perception of a brand more than a redesigned app ever will.

This does not mean digital experience does not matter. It matters enormously for effectiveness and ease. But the emotional dimension of the Forrester framework is often won or lost in human moments. The agent who actually listened. The representative who resolved the problem without making the customer repeat themselves three times. The follow-up that nobody asked for but someone sent anyway.

Tools like positive scripting in customer service are one way organisations try to systematise this, though there is a real risk that scripted warmth reads as scripted warmth. Customers are not naive. They can tell the difference between a company that has trained its agents to sound caring and a company that has built a culture where agents actually are. The Forrester scores reflect that distinction.

Video is an area where some organisations are experimenting with bringing more human quality into digital interactions. Personalised video in customer support is one approach that attempts to bridge the gap between digital efficiency and human connection. Whether it works depends entirely on execution. A video message that feels genuine can be powerful. One that feels like a marketing asset dressed up as support is worse than a plain text email.

Using the Forrester CX Index Commercially

If you are a senior marketer, the most useful thing you can do with the Forrester CX Index is connect it to commercial outcomes. Not as an abstract correlation, but as a specific business case for investment.

Forrester’s own analysis has consistently shown that brands with higher CX Index scores tend to outperform lower-scoring competitors on revenue growth and customer retention over time. That relationship is not mechanical, and it does not mean CX improvements automatically produce revenue. But it does mean there is a defensible commercial argument for treating CX as a growth lever rather than a cost centre.

The argument I used to make to clients, and still make now, is this: if a company genuinely delighted customers at every opportunity, it would drive growth more reliably than most marketing programmes. Marketing is often a blunt instrument used to prop up businesses with more fundamental problems. Acquisition spend can mask churn. Promotional activity can mask a product that people do not actually love. The Forrester Index is one of the cleaner ways to see through that noise and understand what customers actually think of the experience, separate from what your campaigns are promising them.

Pairing the Index with your own customer experience analytics gives you a more complete picture. The Index tells you how you compare to the market. Your internal analytics tell you which specific touchpoints are driving your score up or down, and where investment will have the most impact.

There is more on the commercial case for CX investment across the full Customer Experience section of The Marketing Juice, including how to build measurement frameworks that connect experience metrics to business outcomes rather than treating them as separate disciplines.

What a Good CX Programme Looks Like Alongside the Index

The Forrester CX Index is most valuable when it is one input among several, not the centrepiece of a CX programme. Here is what the organisations that use it well tend to have in place.

First, they have a clear owner. Not a committee, not a shared responsibility across three departments. One person or team who is accountable for the score and has the authority to influence the things that drive it.

Second, they have faster-moving internal signals that they track continuously. Post-interaction surveys, contact centre resolution rates, digital behaviour data, repeat purchase patterns. These move in near real-time and give the organisation early warning before the annual Forrester data reflects a problem.

Third, they have a structured process for turning insight into action. Not a slide deck. An actual decision-making process with budget, timelines, and accountability. I have seen too many organisations where the CX team produces excellent analysis and then watches it disappear into a leadership review with no follow-through.

Fourth, they treat front-line staff as a strategic asset. The people who interact with customers every day know things that no survey will ever capture. The best CX programmes have formal mechanisms for surfacing that knowledge and acting on it. The worst ones treat front-line staff as a cost to be managed and then wonder why their emotional scores are poor.

The Forrester CX Index is a rigorous, credible external benchmark. Used well, it sharpens internal conversations and provides a market-level reference point that is hard to argue with. Used poorly, it becomes another annual exercise in producing numbers that nobody changes their behaviour in response to. The difference is not the data. It is what the organisation is structured to do with it.

About the Author

Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.

Frequently Asked Questions

What is the Forrester Customer Experience Index?
The Forrester Customer Experience Index is an annual benchmarking study that measures how well large brands deliver experiences across three dimensions: effectiveness, ease, and emotional quality. It draws on direct consumer surveys across multiple industries and produces scores that allow brands to compare their CX performance against competitors in the same sector.
Why does Forrester weight emotion so heavily in its CX framework?
Forrester’s research consistently shows that how a customer feels during and after an interaction is a stronger predictor of loyalty behaviour than whether the interaction was fast or easy. A customer can complete a task efficiently and still leave with a negative emotional impression of the brand. Emotion drives repeat purchase, advocacy, and retention in ways that efficiency metrics alone do not capture.
How often is the Forrester CX Index published?
The Forrester CX Index is published annually. It reflects consumer perceptions gathered over the preceding year, which means it is a lagging indicator. Brands should use it alongside faster-moving internal metrics rather than relying on it as their primary signal for real-time CX performance.
Which industries does the Forrester CX Index cover?
The Index covers a broad range of consumer-facing industries including banking, insurance, retail, airlines, hotels, and utilities. Coverage can vary by edition, but the study is designed to allow within-industry comparison, so brands are benchmarked against direct competitors rather than across unrelated sectors.
How should marketers use the Forrester CX Index alongside internal metrics?
The Forrester CX Index provides an external market benchmark. Internal metrics such as post-interaction satisfaction scores, contact centre resolution rates, and customer retention data provide the operational detail behind the score. Used together, they give organisations both a competitive reference point and the diagnostic information needed to act on it. The Index tells you where you stand. Your internal data tells you why.