Customer Journey Design: Stop Mapping and Start Engineering
Customer journey design is the process of intentionally shaping how customers move from first awareness through to purchase, loyalty, and advocacy, with each stage deliberately constructed rather than left to chance. Most brands have a customer journey. Very few have designed one.
The difference matters more than most marketing teams want to admit. A journey that happens organically is full of gaps, contradictions, and moments where customers quietly decide to leave. A designed one closes those gaps before they cost you revenue.
Key Takeaways
- Most customer journeys are described after the fact, not designed before it. Mapping what exists is not the same as engineering what should exist.
- The biggest revenue losses in most businesses happen between stages, not within them. Fixing handoffs between touchpoints often outperforms optimising individual channels.
- Customer experience design is a commercial discipline, not a creative exercise. Every stage should have a measurable outcome attached to it.
- Technology enables experience design but does not replace it. Automation built on a broken experience just accelerates the failure.
- The brands that grow without heavy marketing spend tend to have one thing in common: their post-purchase experience is as deliberate as their acquisition strategy.
In This Article
- Why Most Journey Maps End Up on a Wall and Nowhere Else
- The Commercial Case for Engineering Over Mapping
- What a Designed Journey Actually Looks Like
- Where Technology Fits and Where It Does Not
- The Post-Purchase Stage Most Brands Underinvest In
- How to Run a Journey Design Sprint That Produces Something Useful
- Measuring Journey Performance Without Drowning in Data
Why Most Journey Maps End Up on a Wall and Nowhere Else
I have been in a lot of workshops where a team spends two days producing a beautiful customer journey map. It gets printed, laminated, pinned to a wall in the marketing department, and then systematically ignored. Within six months, nobody references it. Within a year, it is quietly taken down to make room for something else.
The problem is not the map. The problem is that mapping is treated as the output rather than the starting point. Teams document the journey as it currently exists, feel good about the process, and then return to running campaigns that do not reflect what they just learned. The exercise becomes theatre.
Genuine journey design works differently. It starts with the same mapping process but treats the current state as a problem to be solved, not a picture to be framed. Every gap, every inconsistency, every moment where a customer has to work harder than they should, becomes an engineering brief. The map is the diagnosis. The design is the treatment.
If you want a grounded starting point for thinking about what good journey design looks like in practice, Crazy Egg’s breakdown of customer experience fundamentals covers the structural basics clearly without overcomplicating them.
The Commercial Case for Engineering Over Mapping
Early in my agency career, I worked with a retailer that was spending heavily on paid search to drive traffic to its site. Conversion rates were poor and the instinct, as it usually is, was to optimise the ads. We dug into the actual customer journey instead. The journey from ad click to purchase had eleven steps, three of which required customers to re-enter information they had already provided. The abandonment was not a paid search problem. It was a journey design problem.
Fixing the journey, not the ads, moved the needle. We did not change the targeting, the creative, or the bids. We changed what happened after the click. Revenue improved within weeks. The lesson stayed with me: marketing spend is often being used to compensate for a broken journey downstream.
This is why journey design has to be treated as a commercial discipline. Every stage of the journey should have a clear business objective, a measurable outcome, and an owner. Without those three things, you have a conceptual framework, not an operational system. Mailchimp’s guide to end-to-end customer journeys makes a useful point about thinking in terms of the full arc rather than individual touchpoints in isolation, which is where most teams get stuck.
The broader discipline of customer experience strategy, including how journey design sits within it, is something I write about across the Customer Experience hub here at The Marketing Juice. Journey design is one piece of a larger picture, and it works best when it is connected to the wider CX infrastructure rather than treated as a standalone project.
What a Designed Journey Actually Looks Like
A well-designed customer journey has five characteristics that distinguish it from a documented one.
It is built around customer intent, not internal process. Most journey maps reflect how a company is organised internally: marketing owns awareness, sales owns conversion, customer service owns retention. Customers do not experience your org chart. They experience a sequence of moments, and those moments need to be coherent regardless of which internal team owns them. A designed journey is built from the outside in, starting with what the customer is trying to do at each stage and working backwards to what the business needs to provide.
It has explicit transition logic between stages. The most common place customers are lost is not within a stage but between them. The move from awareness to consideration, from consideration to purchase, from purchase to repeat purchase: each of these transitions is a moment of potential failure. A designed journey specifies exactly what needs to happen to move a customer from one stage to the next, who is responsible for making it happen, and what signal indicates the transition has occurred.
It accounts for customers who are not ready. Most journey design focuses on the ideal path: customer becomes aware, considers options, purchases, becomes loyal. Real customers rarely follow that path without deviation. Some need more time. Some need more information. Some will drop out and come back. A designed journey has explicit re-engagement logic built in, not bolted on as an afterthought. SMS as a re-engagement channel is one example of how brands are building that logic into their journey architecture, particularly for customers who have gone quiet after initial interest.
It connects emotional and functional needs at each stage. Customers are not just trying to complete a transaction. They are also managing anxiety, building confidence, and forming opinions about whether they trust you. A designed journey acknowledges both dimensions. The functional need at the consideration stage might be product information. The emotional need is reassurance. A journey that only addresses the functional need will lose customers who have the information but not the confidence.
It is owned, not just described. Every stage of the journey has a named owner who is accountable for its performance. Not a team, not a department: a person. This is where most journey design efforts collapse. The map gets produced, the insights are agreed, and then accountability diffuses across the organisation until nobody is responsible for anything specific. Engineering a journey means assigning ownership with the same rigour you would apply to any other business process.
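The transition logic and ownership described above can be written down as data rather than prose, so that every handoff has an explicit signal and a named person attached. A minimal sketch, with hypothetical stage names, signals, and owners that you would replace with your own:

```python
from dataclasses import dataclass

@dataclass
class Transition:
    """One handoff between journey stages: the signal that marks it, the person who owns it."""
    from_stage: str
    to_stage: str
    signal: str  # observable event that proves the transition happened
    owner: str   # a named person, not a team or department

# Hypothetical journey spec -- the stages, signals, and names are illustrative only.
JOURNEY = [
    Transition("awareness", "consideration", "returns to site within 14 days", "J. Smith"),
    Transition("consideration", "purchase", "completes checkout", "A. Patel"),
    Transition("purchase", "repeat purchase", "places second order within 90 days", "M. Chen"),
]

def unowned(journey):
    """Surface transitions with no accountable owner -- the usual failure mode."""
    return [t for t in journey if not t.owner.strip()]
```

The point of the exercise is not the code itself but the constraint it enforces: a transition with an empty `owner` or a vague `signal` is immediately visible, whereas in a laminated map it simply goes unnoticed.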
Where Technology Fits and Where It Does Not
The marketing technology industry has done a thorough job of convincing brands that journey design is primarily a technology problem. Buy the right platform, connect your data sources, build your automation flows, and the journey will take care of itself. I have managed enough technology implementations to know that this is not how it works.
Technology is an enabler of journey design, not a substitute for it. Automation built on top of a poorly designed journey does not fix the journey. It just executes the broken version faster and at greater scale. I have seen brands spend six figures on marketing automation platforms and then wonder why their customer experience scores have not improved. The platform was fine. The journey it was automating was not.
The sequence matters. Design the journey first. Identify the moments that need to be engineered. Define the logic that should govern each transition. Then ask which parts of that logic can be automated and which require human judgment. Technology answers the “how do we execute this at scale” question. It cannot answer the “what should we be doing” question.
Optimizely’s thinking on digital optimisation across the full customer journey is worth reading for teams that are at the point of connecting journey design to their digital infrastructure. The framing around optimising the whole arc rather than individual channels in isolation aligns with how I think about this.
There is also a useful role for AI tools in journey analysis. Moz’s Whiteboard Friday on using ChatGPT for customer journey thinking is a practical demonstration of how AI can help teams interrogate their journey assumptions and identify gaps they might not have spotted manually. It is not a replacement for the design work, but it is a genuinely useful analytical layer.
The Post-Purchase Stage Most Brands Underinvest In
When I was running agency teams, I noticed a consistent pattern across clients in different sectors. The pre-purchase stages of the journey received the most attention, the most budget, and the most creative energy. The post-purchase stages received almost none. Onboarding was functional at best. Follow-up communication was generic. Loyalty programmes were bolted on rather than built in.
The commercial logic of this is hard to defend. Acquiring a new customer costs significantly more than retaining an existing one. The customers most likely to become advocates are those who have already purchased and had a good experience. The post-purchase stage is where the relationship either deepens or quietly dissolves, and most brands are designing it with a fraction of the care they put into the acquisition stage.
This is where the connection between journey design and genuine business performance becomes most visible. If a company truly delighted customers at every opportunity after purchase, that alone would drive growth through retention, referral, and repeat purchase. Marketing spend would be doing less heavy lifting because the product and experience would be generating their own momentum. Most marketing budgets are, at least in part, compensating for post-purchase experiences that do not deliver on the promise made during acquisition.
Designing the post-purchase stage properly means treating it with the same rigour as the acquisition stage. What does the customer need to know immediately after purchase to feel confident in their decision? What is the first meaningful value moment, and how quickly can you deliver it? What does the re-engagement trigger look like for customers who have gone quiet? These are engineering questions, not afterthoughts.
Building a feedback culture into the post-purchase stage is also worth the investment. HubSpot’s thinking on customer feedback culture is useful here, particularly the argument that feedback should be structural rather than episodic. Brands that only ask for feedback when something goes wrong are designing a reactive system. Brands that build feedback into the journey at regular intervals are designing a learning system.
How to Run a Journey Design Sprint That Produces Something Useful
The workshop format I have found most effective for journey design is not a two-day offsite with sticky notes. It is a series of shorter, more focused sessions, each with a specific output and a clear decision that needs to be made by the end of it.
Session one is diagnostic. The goal is to map the current journey from the customer’s perspective, not the company’s. This means using actual customer data: support tickets, session recordings, survey responses, complaint logs. Not assumptions about what customers experience, but evidence of what they actually do. The output is a current-state map with the friction points identified and quantified where possible.
Session two is prioritisation. Not every friction point can be fixed simultaneously, and not every fix will have the same commercial impact. The output of this session is a ranked list of the five to ten moments in the journey that, if improved, would have the greatest effect on the metrics that matter: conversion, retention, lifetime value, referral rate.
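Session two's ranking can be as simple as sorting candidate fixes by estimated commercial impact. A back-of-envelope sketch, where the friction points and the monthly impact figures are invented purely for illustration:

```python
# Hypothetical friction points with rough estimates of monthly revenue at stake.
friction_points = [
    {"moment": "re-entering address at checkout", "est_monthly_impact": 18000},
    {"moment": "no post-purchase confirmation email", "est_monthly_impact": 6000},
    {"moment": "pricing page hides shipping costs", "est_monthly_impact": 12000},
]

# Rank by estimated impact and keep the top N as candidates for design briefs.
ranked = sorted(friction_points, key=lambda f: f["est_monthly_impact"], reverse=True)
shortlist = ranked[:2]

for f in shortlist:
    print(f'{f["moment"]}: ~£{f["est_monthly_impact"]:,}/month at stake')
```

The estimates will be rough, and that is fine. The discipline being practised is forcing every friction point to carry a number, so the session produces a defensible ranking rather than a debate about whose pet issue feels most urgent.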
Session three is design. For each priority moment, the team defines what the ideal experience looks like, what needs to change to deliver it, who owns the change, and how success will be measured. This is where the journey moves from a documented state to an engineered one. The output is a set of design briefs, not a new map.
Session four is implementation planning. Each design brief gets a timeline, a budget estimate, and an owner. The journey design becomes a project plan. This is the step most teams skip, which is why most journey design work never translates into anything that changes the customer’s experience.
The customer service data that surfaces during this process often reveals more about journey failures than any analytics platform will. HubSpot’s customer service research consistently shows that the gap between what customers expect and what they receive is largest at the moments brands have paid the least attention to designing. That gap is not a customer service problem. It is a journey design problem.
Measuring Journey Performance Without Drowning in Data
One of the more common traps in journey design is building a measurement framework so complex that nobody uses it. I have seen dashboards with forty metrics tracking every micro-moment of the customer journey, and I have seen the same teams unable to answer the basic question of whether their journey is performing better or worse than it was six months ago.
Journey measurement should start with three questions. First, are customers moving through the journey at the rate we expect? This is a flow question, answered by conversion rates at each stage transition. Second, are customers who complete the journey behaving in the ways we want? This is an outcome question, answered by retention rates, repeat purchase rates, and referral rates. Third, are customers telling us the experience felt good? This is an experience question, answered by satisfaction scores at key moments, not just at the end.
Those three questions, tracked consistently over time, will tell you more about journey performance than forty metrics tracked inconsistently. The goal is honest approximation, not false precision. If your stage-two to stage-three conversion rate is improving quarter on quarter, you are making progress. You do not need to know exactly why to know that the direction is right.
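The flow question in particular reduces to a handful of ratios. A minimal sketch, using made-up stage counts to show the calculation:

```python
# Hypothetical monthly counts of customers reaching each stage, in journey order.
stage_counts = {
    "awareness": 10000,
    "consideration": 2500,
    "purchase": 400,
    "repeat_purchase": 120,
}

def transition_rates(counts):
    """Conversion rate at each stage-to-stage handoff (the 'flow' question)."""
    stages = list(counts)  # dicts preserve insertion order in Python 3.7+
    return {
        f"{a} -> {b}": counts[b] / counts[a]
        for a, b in zip(stages, stages[1:])
    }

rates = transition_rates(stage_counts)
# e.g. the consideration -> purchase handoff converts at roughly 16% here
```

Tracked quarter on quarter, three numbers like these answer the flow question directly, and a drop in any one of them tells you exactly which handoff to investigate.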
When the metrics do flag a problem, the investigation should start with the journey design, not the marketing channels. A drop in conversion between consideration and purchase is more likely to be a journey friction issue than a targeting issue. A drop in retention after first purchase is more likely to be an onboarding issue than a loyalty programme issue. The instinct to solve journey problems with more marketing spend is understandable, but it is usually expensive and temporary.
If you want to go deeper on the full scope of customer experience strategy and how journey design connects to broader commercial outcomes, the Customer Experience hub covers the wider discipline in detail. Journey design does not exist in isolation, and understanding its relationship to measurement, team structure, and channel strategy is what separates brands that improve their CX from those that just talk about improving it.
About the Author
Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.
