Customer Insight Strategy: Stop Researching, Start Deciding
A customer insight strategy is a structured approach to gathering, analysing, and acting on what you know about your customers, so that commercial decisions are grounded in reality rather than assumption. Done well, it connects what customers actually think, feel, and do to the choices your business makes about positioning, messaging, product, and growth.
Most companies do not have one. They have data. They have surveys. They have a CRM full of behavioural signals nobody has looked at in six months. That is not a strategy. That is a filing system with good intentions.
Key Takeaways
- Customer insight is only valuable when it informs a decision. Insight that sits in a deck and goes nowhere is expensive market research theatre.
- The most common failure is collecting too much data and acting on too little. Narrow your insight questions to the decisions you actually need to make.
- Qualitative and quantitative methods answer different questions. Using only one produces a distorted picture of your customer.
- Insight strategy requires an owner and a cadence. Without both, it becomes a periodic project rather than an operating discipline.
- The companies that grow consistently are the ones that genuinely understand why customers choose them, and build everything around that.
In This Article
- Why Most Insight Work Produces Nothing Useful
- What a Customer Insight Strategy Actually Contains
- The Difference Between Data and Insight
- Qualitative Methods That Actually Work
- Quantitative Methods Worth Using
- Segmentation: The Output That Changes the Most
- How Insight Connects to Positioning and Messaging
- The Organisational Problem Nobody Talks About
- Building an Insight Cadence That Holds
- The Quiet Competitive Advantage
Why Most Insight Work Produces Nothing Useful
Early in my career, I sat through a research debrief for a major brand. The agency had spent three months and a significant budget producing a report that was, by any measure, comprehensive. It covered demographics, attitudes, category behaviours, brand perceptions, and purchase drivers. It was 140 slides long. The marketing director flicked through it, said “interesting”, and went back to approving creative.
Nothing changed. The campaign that launched six weeks later could have been built without the research. Probably was.
This is the central problem with how most organisations approach customer insight. The research is commissioned to feel thorough, not to answer a specific question. When there is no specific question, there is no decision the insight is meant to inform. And when there is no decision attached to it, insight has nowhere to go except a shared drive.
The fix is not better research methodology. It is starting with the decision, not the data. Before any insight work begins, the question should be: what will we do differently depending on what we find? If the answer is “nothing much”, do not do the research. If the answer is clear, you have a brief worth executing.
What a Customer Insight Strategy Actually Contains
An insight strategy is not a research plan. A research plan tells you what methods you will use and when. An insight strategy tells you what questions matter most to the business, how you will answer them systematically, who owns the work, and how findings connect to commercial decisions.
In practice, it has four components.
The first is a prioritised question set. Not an exhaustive list of everything you want to know about your customer, but a short list of the questions that, if answered, would change how you operate. These typically cluster around three areas: why customers choose you over alternatives, what drives retention and churn, and where unmet needs exist that you could credibly address.
The second is a method mix. Qualitative methods, including customer interviews, ethnographic observation, and open-ended surveys, tell you what customers think and why. Quantitative methods, including structured surveys, behavioural analytics, and transaction data, tell you how many and how often. You need both. One without the other gives you either stories without scale or numbers without meaning.
The third is a cadence. Some insight work is continuous, like monitoring NPS trends or tracking search behaviour. Some is periodic, like running a customer segmentation refresh every 18 months. Some is triggered by events, like a product launch or a sudden churn spike. A strategy maps all three and assigns ownership so insight generation does not depend on someone remembering to commission it.
The fourth is an activation mechanism. This is the piece most organisations skip. Insight needs a defined route into decision-making. That might be a monthly review where findings are presented to the leadership team alongside a recommendation. It might be a standing item in the quarterly planning cycle. The format matters less than the discipline. Without it, insight stays in research land and never reaches the people making commercial decisions.
If you are building or refining your broader go-to-market approach, the Go-To-Market and Growth Strategy hub covers how insight connects to positioning, channel selection, and commercial planning across the full growth cycle.
The Difference Between Data and Insight
I have seen this confusion cause real damage in organisations. A business collects a lot of data, assumes it has insight, and makes decisions based on what the data appears to show rather than what it actually means.
Data is a fact. Insight is an interpretation that implies an action. “Our average customer is 38 years old” is data. “Our customers in the 35 to 45 bracket have a 40% higher lifetime value, but we are spending almost nothing acquiring them because our media targeting skews younger” is insight. The first is interesting. The second changes a budget decision.
The gap between data and insight is analysis, and analysis requires someone with enough commercial context to know what a finding means for the business. This is why insight functions that sit entirely within analytics teams, disconnected from marketing and commercial strategy, tend to produce reports rather than recommendations. The people closest to the data are not always the people who understand the business implications.
When I was running agencies, the best insight work I saw came from teams that combined a rigorous researcher with a strategist who understood the client’s commercial pressures. The researcher kept the methodology honest. The strategist kept the output useful. Separating those two functions produced worse work than integrating them.
Qualitative Methods That Actually Work
Customer interviews are the most underused tool in B2B and mid-market businesses. Not focus groups. Not surveys with open-text fields. Actual conversations with customers, conducted by someone who knows how to listen without leading.
The reason they are underused is that they feel slow and unscalable. You talk to twelve people and worry that it is not statistically representative. But qualitative research is not trying to be statistically representative. It is trying to surface the reasoning behind behaviour, the language customers use to describe their problems, and the factors they weigh when making decisions. That information does not come from a tick-box survey. It comes from someone saying “tell me more about that” at the right moment.
Twelve well-conducted customer interviews will typically surface three or four themes that appear consistently. Those themes are almost always more useful than a quantitative survey that tells you 67% of customers rate you 8 out of 10 on “overall satisfaction”. The number is clean. The interviews are messy. The interviews are more valuable.
Jobs-to-be-done framing is worth applying here. Rather than asking customers what they think of your product, ask them what problem they were trying to solve when they first looked for a solution. Ask what alternatives they considered. Ask what almost made them choose someone else. The answers reveal competitive positioning gaps that no brand tracking survey will find.
Quantitative Methods Worth Using
Behavioural data is the most honest form of customer insight available to most businesses, because it records what customers do rather than what they say they do. Those two things are frequently different.
Transaction data, website behaviour, email engagement, and product usage patterns all tell you something real about how customers interact with you. The challenge is that behavioural data tells you what happened but rarely tells you why. Someone cancelled their subscription. The data shows when. It does not show the reason. That is where qualitative methods come back in.
Structured surveys are useful when you need to quantify something you have already identified qualitatively. If your customer interviews suggest that speed of response is a major driver of satisfaction, a survey can tell you how many customers feel that way and how it compares to other factors. Using surveys to discover what matters, rather than to measure something you already know matters, tends to produce shallow findings.
Net Promoter Score has its uses, but it is frequently treated as an insight programme when it is actually a metric. A single score tells you the current state of customer sentiment. It does not tell you what is driving it or what to do about it. If you use NPS, pair it with follow-up questions or interviews that explain the score. Otherwise you are measuring the temperature without knowing what is causing the fever.
Understanding how customers behave across channels is part of the broader growth picture. Resources like Vidyard’s analysis of why go-to-market feels harder are useful context for why customer signals are becoming more fragmented and harder to read in a straightforward way.
Segmentation: The Output That Changes the Most
A well-executed customer segmentation is one of the highest-value outputs of an insight strategy, because it forces a business to be specific about who it is actually serving and who it should be serving more of.
Most segmentations I have seen in agency pitches and client briefs are demographic. Age, gender, income, geography. Occasionally psychographic. Rarely behavioural. Almost never need-state based.
Demographic segmentation tells you who your customers are. Need-state segmentation tells you what they want when they interact with you, which is a far more useful basis for marketing decisions. Two customers with identical demographics might have completely different needs at the point of purchase. One is buying on price because they are managing a tight budget. The other is buying on reliability because a previous supplier let them down. The same message will not work for both of them.
When I was growing the iProspect team from around 20 people to over 100, one of the things that changed how we approached client work was getting clearer on the different types of clients we served and what they actually needed from an agency relationship. Some needed performance and accountability. Some needed strategic counsel. Some needed both but at different stages of the engagement. Treating them as a homogenous group produced worse work and worse retention. Segmenting by need-state, even informally, changed how we structured teams and conversations.
The same logic applies to any business with a customer base of meaningful size. Segmentation is not a marketing exercise. It is a commercial one.
How Insight Connects to Positioning and Messaging
There is a direct line between customer insight and the words you put in front of people. Most businesses do not draw it clearly enough.
Positioning is a claim about why a customer should choose you over the alternatives available to them. That claim needs to be grounded in something customers actually care about, expressed in language that resonates with how they think about the problem. Both of those requirements demand insight. You cannot write positioning from inside the building.
One of the most reliable signals that a company’s positioning is disconnected from customer reality is when the language on its website is completely different from the language customers use to describe their own problems. The company talks about “integrated solutions” and “end-to-end capability”. The customer talks about “not having to chase three different suppliers” and “someone who actually picks up the phone”. Those are the same thing, expressed differently. The customer’s version converts better.
Customer interviews are particularly good at surfacing this language gap. When you ask a customer to describe what life was like before they started working with you, and then what changed, you get the raw material for messaging that actually connects. It is not copywriting. It is listening.
BCG’s work on aligning brand and go-to-market strategy makes a related point about the importance of internal and external coherence. Insight is part of what creates that coherence, because it ensures the story you tell externally is grounded in something real rather than something aspirational.
The Organisational Problem Nobody Talks About
Customer insight fails in most organisations not because the research is bad but because the organisation is not structured to use it.
Insight sits in marketing. Decisions get made in product, sales, and the C-suite. The connection between those two worlds is often weak, informal, and dependent on a single person who cares enough to carry the findings across the room. When that person leaves, the institutional knowledge goes with them.
I have seen this in agencies and in client organisations. A researcher produces genuinely useful work. It circulates briefly. It gets referenced in a strategy deck. Then the next campaign gets briefed and nobody checks what the insight said. The research sits in a folder. A new brief goes out. The cycle repeats.
Fixing this is an organisational question as much as a methodological one. Insight needs a home in the business planning process, not just in the marketing function. The simplest version is a quarterly review where customer insight is presented alongside commercial performance data, and the two are discussed together. What did customers tell us this quarter? What does that mean for what we are planning? Who is responsible for acting on it?
Without that structure, insight is a cost centre that produces reports. With it, insight becomes a commercial function that influences decisions. The difference in ROI between those two things is substantial.
This connects to a broader point about how growth-oriented companies organise themselves. The Go-To-Market and Growth Strategy hub covers how the best-performing businesses align insight, planning, and execution into a coherent operating model rather than treating them as separate disciplines.
Building an Insight Cadence That Holds
The practical question for most marketing leaders is not whether to do insight work but how to build a rhythm that survives competing priorities, budget pressure, and the general chaos of running a marketing function.
A workable cadence for a mid-sized business might look like this. Continuous monitoring covers the things you should always be watching: NPS trends, search query data, social listening, support ticket themes, and sales call notes. These do not require a research project. They require someone to look at them regularly and flag what is changing.
Quarterly work covers the structured pieces: a small batch of customer interviews, a review of churn data with the customer success team, and a synthesis of what the continuous signals have been saying. This feeds into quarterly planning.
Annual work covers the bigger questions: a full segmentation review, a competitive positioning audit, and a deeper dive into unmet needs or emerging category shifts. This feeds into annual strategy.
Event-triggered work covers the moments that demand a rapid insight response: a product launch, a significant competitor move, an unexpected churn spike, or a new market entry. These cannot wait for the quarterly cycle.
The cadence does not need to be expensive. A lot of the most valuable insight work I have seen was done with small budgets and genuine curiosity. Six customer interviews a quarter, conducted by a senior marketer who listens well, will produce more useful output than a £50,000 research project that asks the wrong questions.
For teams thinking about how insight connects to scaling decisions and agile operating models, BCG’s work on scaling agile is worth reading alongside this. The principle of building feedback loops into operating rhythms applies directly to how insight should function in a growing business.
The Quiet Competitive Advantage
I come back to something I have believed for most of my career: if a company genuinely understood its customers and built everything around that understanding, it would not need to shout as loudly or spend as much to grow. Most marketing spend is compensating for the gap between what a business thinks customers want and what they actually want. Insight narrows that gap. Narrowing that gap is more commercially valuable than most marketing tactics.
The businesses I have seen grow consistently, across different categories and different economic conditions, tend to share one characteristic. They know their customers with unusual clarity. Not just who they are demographically, but what they are trying to achieve, what frustrates them, and what would make them recommend the business to someone else. That knowledge shapes everything: product decisions, pricing, messaging, service design, retention strategy.
Companies that grow by acquiring customers faster than they lose them are doing something right at the acquisition end. Companies that grow by keeping customers longer and expanding revenue from existing relationships are doing something right at the insight end. The second type tends to be more profitable and more defensible over time.
The tools available for insight work have improved considerably. SEMrush’s overview of growth tools touches on some of the platforms that now make continuous customer signal monitoring more accessible for teams without large research budgets. The tools are not the strategy. But they reduce the friction between wanting to understand customers and actually doing it.
When I judged the Effie Awards, the entries that stood out were almost always the ones where you could trace a clear line from a specific customer truth to a specific creative or commercial decision. Not “we did research and it informed our thinking”. But “we discovered that our customers felt X, and that changed what we said and how we said it, and here is what happened as a result”. That clarity of connection between insight and outcome is what separates effective marketing from expensive activity.
About the Author
Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.
