Strategy Planning Examples That Changed How I Think About Growth
Strategy planning examples are most useful when they show the thinking behind the decision, not just the outcome. The frameworks matter less than the judgment calls that sit underneath them: what problem are we actually solving, who are we trying to reach, and what has to be true for this to work.
After 20 years running agencies, managing hundreds of millions in ad spend across 30 industries, and watching strategies succeed and fail in roughly equal measure, I’ve come to believe that most strategy planning fails not because of poor execution but because the framing was wrong from the start.
Key Takeaways
- Good strategy planning starts with the right problem, not the right framework. Most plans fail because they answer the wrong question with confidence.
- Lower-funnel performance metrics often overstate their own contribution. Demand creation requires reaching people who weren’t already looking for you.
- The best strategy examples share one trait: they made a clear choice about who they were not targeting, not just who they were.
- Planning cycles that skip the commercial context (margin, LTV, payback period) produce activity plans, not growth strategies.
- Strategy is most valuable when it creates alignment across teams, not just a document that gets filed after the offsite.
In This Article
- Why Most Strategy Planning Examples Miss the Point
- Example 1: The Agency That Was Winning the Wrong Clients
- Example 2: The Performance Marketing Trap
- Example 3: The Go-To-Market Plan That Started With the Wrong Assumption
- Example 4: The Whiteboard Moment and What It Taught Me About Preparation
- Example 5: Scaling a Team Without Losing Strategic Clarity
- What Good Strategy Planning Actually Looks Like in Practice
- The Planning Mistakes I See Most Often
- How to Build a Strategy Planning Process That Holds Up
Why Most Strategy Planning Examples Miss the Point
The internet is full of strategy planning examples that show you the finished artifact: a neat 2×2, a prioritised roadmap, a channel mix with percentages allocated. What they rarely show you is the room where those decisions were made, the assumptions that were challenged, and the options that were discarded.
I’ve sat in hundreds of strategy planning sessions. The ones that produced something useful shared a quality that had nothing to do with the template used. Someone in the room was willing to say “we’re not actually sure this is the right problem.” That moment of intellectual honesty, often uncomfortable, is where real strategy begins.
The ones that produced polished decks and poor results had a different quality: everyone knew what the answer was supposed to be before the session started. The planning process was theatre. The strategy was already written; it just needed to be validated.
If you want to build strategies that hold up under commercial pressure, the go-to-market and growth strategy thinking on this site covers the underlying principles in more depth. But let’s get into the examples.
Example 1: The Agency That Was Winning the Wrong Clients
When I joined iProspect as managing director, the agency was technically growing. New business was coming in. Headcount was rising. But the P&L told a different story. Margins were thin, the team was stretched, and a handful of clients were consuming resources that made no commercial sense.
The strategy planning process that followed wasn’t about finding new growth levers. It was about getting honest about which clients we should be pursuing in the first place. We mapped every client against two dimensions: revenue contribution and margin contribution. They were not the same list. Some of our highest-revenue clients were our least profitable once you accounted for service hours, revisions, and the cost of keeping difficult relationships intact.
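The two-dimension mapping described above can be sketched as a small analysis. This is a minimal illustration, not the actual exercise we ran: the client names, hours, and the assumed fully loaded hourly cost are all hypothetical placeholders.

```python
# Hypothetical client data; every figure here is illustrative only.
HOURLY_COST = 75  # assumed fully loaded cost per service hour

clients = [
    # (name, annual_revenue, service_hours)
    ("Client A", 900_000, 11_000),
    ("Client B", 400_000, 2_500),
    ("Client C", 650_000, 7_800),
]

def margin(revenue, hours, hourly_cost=HOURLY_COST):
    """Margin contribution after accounting for service-hour costs."""
    return revenue - hours * hourly_cost

by_revenue = sorted(clients, key=lambda c: c[1], reverse=True)
by_margin = sorted(clients, key=lambda c: margin(c[1], c[2]), reverse=True)

# The point of the exercise: the two rankings are rarely the same list.
print([c[0] for c in by_revenue])  # ['Client A', 'Client C', 'Client B']
print([c[0] for c in by_margin])   # ['Client B', 'Client A', 'Client C']
```

In this toy data, the highest-revenue client is only the second-most profitable once service hours are costed in, which is exactly the gap the mapping is meant to expose.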
The output of that planning process was an ideal client profile that was genuinely restrictive. We defined the sectors, deal sizes, and commercial structures where we could win and deliver well. We stopped pitching outside those boundaries. New business conversion improved because we were pitching less and winning more. Margin improved because we weren’t discounting to win work we shouldn’t have been chasing.
This is a strategy planning example that looks simple on paper. In practice, it required turning down pitches, having honest conversations with the team about why we weren’t going after certain opportunities, and trusting that focus would compound over time. It did. Within two years, the agency moved from loss-making to one of the top-performing agencies in the network.
The lesson: strategy planning that doesn’t include a clear articulation of what you’re not doing is usually just a wish list.
Example 2: The Performance Marketing Trap
Earlier in my career, I overvalued lower-funnel performance. I was good at it. Clients loved the attribution dashboards. Every pound spent had a reported return. It looked like accountability.
The problem was that much of what performance marketing was being credited for was going to happen anyway. Someone who searches for your brand name was probably going to buy from you. Someone who clicks a retargeting ad for a product they already put in their basket was already in the purchase funnel. The paid channel got the last click. The conversion happened. The credit was assigned. The strategy looked like it was working.
What wasn’t happening was growth in the addressable audience. We were getting very efficient at capturing the demand that already existed. We weren’t creating new demand. Think about a clothes shop: someone who tries something on is far more likely to buy than someone walking past the window. But if you only ever optimise for people already in the changing room, you stop thinking about how to get people through the door in the first place.
The strategy planning shift that made the biggest difference was introducing a framework that separated demand capture from demand creation, and allocating budget deliberately across both. This isn’t a new idea. Forrester’s intelligent growth model has been making a version of this argument for years. But it required a genuine change in how we structured planning conversations, because performance teams naturally gravitate toward what they can measure, and what they can measure is mostly downstream.
The planning example here: map your current budget against where in the purchase experience it operates. If more than 70% of your spend is capturing existing intent, your strategy is optimised for efficiency, not growth. Those are different objectives and they require different plans.
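The budget-mapping exercise above can be expressed as a simple check. The channel names, amounts, and role labels below are hypothetical; the 70% threshold is the one from the text.

```python
# Hypothetical budget allocation, tagged by funnel role; figures are illustrative.
budget = {
    "brand_search":       ("capture", 300_000),
    "retargeting":        ("capture", 250_000),
    "shopping":           ("capture", 200_000),
    "social_prospecting": ("create",  150_000),
    "video":              ("create",  100_000),
}

total = sum(amount for _, amount in budget.values())
capture = sum(amount for role, amount in budget.values() if role == "capture")
capture_share = capture / total

# Above the 70% threshold, the plan is optimised for efficiency, not growth.
if capture_share > 0.70:
    print(f"{capture_share:.0%} of spend captures existing intent: efficiency plan")
else:
    print(f"{capture_share:.0%} capture share: balanced toward demand creation")
```

With these illustrative numbers the capture share lands at 75%, which is precisely the situation the text warns about: very efficient at harvesting existing demand, with little invested in creating new demand.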
Example 3: The Go-To-Market Plan That Started With the Wrong Assumption
I worked with a financial services client who had built a detailed go-to-market plan for a new product. The plan was thorough. The channel strategy was well-reasoned. The media budget was sized appropriately. The problem was buried in the first line of the brief: “our target audience is existing customers who are ready to upgrade.”
That assumption had never been tested. It was inherited from the previous product launch, which had a different product, a different competitive context, and a different customer base. When we ran a proper audience analysis, the highest-value opportunity was actually a segment that had no relationship with the brand at all. The existing customer base was smaller than assumed, more price-sensitive than assumed, and already being targeted by three competing products from the same parent company.
The go-to-market plan had to be rebuilt from the audience insight outward, not from the product brief inward. BCG’s work on financial services go-to-market strategy makes a similar point about the danger of assuming you know who your customer is before you’ve looked at the data properly.
The practical output was a segmented launch plan: a retention play for existing customers (smaller budget, higher conversion rate, shorter sales cycle) and an acquisition play for the new segment (larger budget, longer payback period, different channel mix). Both were tracked separately against different success metrics. The plan was more complex, but it was honest about the different jobs it was doing.
This is a strategy planning example worth holding onto: the most dangerous assumption in any plan is the one that was never written down because everyone already agreed on it.
Example 4: The Whiteboard Moment and What It Taught Me About Preparation
Early in my career at Cybercom, I was in a brainstorm for Guinness. The founder had to leave mid-session for a client meeting and handed me the whiteboard pen on his way out. My internal reaction was not confidence. It was something closer to controlled panic. But I did it anyway.
What I remember most clearly is that the session only produced anything useful because I’d been in enough planning conversations to know the questions worth asking. Not the answers. The questions. What does this brand stand for that no other brand can credibly own? Who is the most valuable person we’re not currently reaching? What would have to be true for this idea to work at scale?
Strategy planning, at its best, is a structured way of asking better questions. The whiteboard, the framework, the template: these are just tools for organising the conversation. The quality of the output depends entirely on the quality of the thinking that goes into it.
That experience shaped how I run planning sessions now. I always start with the problem statement and ask whether everyone in the room genuinely agrees on what it is. They usually don’t. That disagreement, surfaced early, is more valuable than any framework applied later.
Example 5: Scaling a Team Without Losing Strategic Clarity
Growing an agency team from 20 to 100 people is a strategy planning challenge as much as an operational one. The risk isn’t that you hire badly. The risk is that the strategy becomes diluted as more people interpret it differently, and no one notices until the culture has drifted and the client work has followed.
The planning approach that worked was treating strategic clarity as an operational requirement, not a one-time offsite output. Every quarter, we ran a session that asked the same three questions: what are we best at, who is our best client, and what would we stop doing if we had to choose. The answers changed as the business evolved, and that was fine. What mattered was that everyone in a leadership position could answer those questions consistently.
BCG’s research on scaling agile organisations points to strategic alignment as one of the factors most commonly underestimated during growth phases. In my experience, that’s accurate. The planning infrastructure that works for a 20-person team breaks down at 60 people not because the strategy was wrong but because the communication of it hadn’t scaled.
The practical example: if you’re scaling a team and you don’t have a documented, regularly reviewed strategic framework that every senior hire is onboarded into, you’re not scaling strategy. You’re scaling activity and hoping it stays coherent.
What Good Strategy Planning Actually Looks Like in Practice
Having judged the Effie Awards, I’ve reviewed a large number of marketing strategies that produced measurable business results. The ones that held up under scrutiny shared structural characteristics worth noting.
First, they were clear about the commercial objective. Not the marketing objective. The business objective. Revenue, margin, market share, customer lifetime value. The marketing strategy was built to serve a commercial outcome, not the other way around.
Second, they had made explicit choices about audience. Not “adults 25-54” but a specific description of the person most likely to generate the required return, and why that person was reachable through the proposed channels.
Third, they had a theory of change. Not just “we will run these activities and measure these metrics” but a causal argument: if we reach this audience with this message through these channels, the following thing will happen in their behaviour, which will produce the following commercial outcome. That causal chain can be wrong, and it often is. But having it written down means you can test it, learn from it, and improve it.
Many teams skip the causal chain because it’s uncomfortable to commit to. If you don’t write it down, you can’t be wrong. But you also can’t learn. The increasing complexity of go-to-market execution makes this kind of rigour more important, not less, because there are more variables to manage and more places for the logic to break down.
The Planning Mistakes I See Most Often
Across the industries and organisations I’ve worked with, the same planning failures recur with enough regularity that they’re worth naming directly.
Confusing a budget plan with a strategy. A spreadsheet that allocates spend across channels is not a strategy. It’s a budget. Strategy is the reasoning that justifies those allocations. Many organisations produce the budget and skip the reasoning.
Planning in silos. Brand teams plan separately from performance teams. Digital plans separately from CRM. Each team optimises for its own metrics and the overall commercial outcome is nobody’s explicit responsibility. I’ve seen this in organisations spending tens of millions on marketing, where no single person could tell you how the combined plan was expected to produce growth.
Treating the plan as the output. The plan is not the output. The business result is the output. Plans that aren’t reviewed, stress-tested, and updated as conditions change are just documents. The discipline of returning to the plan, asking what we assumed and whether those assumptions are holding, is where strategy planning creates real value.
Overcomplicating the framework. I’ve seen planning processes that take three months, involve 15 workshops, and produce a 90-slide deck that no one reads after the board presentation. The best strategic plans I’ve worked with fit on two pages. Not because the thinking was shallow, but because the discipline of simplification forced clarity about what actually mattered.
If you want a broader view of how these planning principles connect to execution across growth stages, the go-to-market and growth strategy hub covers the full territory, from audience development through to channel strategy and commercial measurement.
How to Build a Strategy Planning Process That Holds Up
Based on what I’ve seen work across agency and client-side environments, here is a planning structure that produces usable strategy rather than impressive documentation.
Start with the commercial context. What does the business need to achieve in the next 12 months? Not what marketing wants to do. What the business needs. Revenue targets, margin targets, customer acquisition targets. These are your constraints and your brief.
Define the audience with specificity. Who is the highest-value person you’re trying to reach? What do they currently believe? What would have to change in their thinking or behaviour for the commercial objective to be met? This is harder than it sounds and most teams skip it in favour of broad demographic targeting.
Map your current position honestly. Where are you strong? Where are you weak? What are competitors doing that’s working? What’s the gap between where you are and where you need to be? Looking at how other brands have approached growth challenges can provide useful reference points here, though the goal is to understand the principle, not copy the tactic.
Make the strategic choices explicit. Which audience segments are you prioritising and why? Which channels will you invest in and which will you deprioritise? What is the message hierarchy? These choices should be written down and agreed before anyone starts building campaign plans.
Write the causal chain. If we do X, Y will happen, which will produce Z. Keep it simple. Three or four links in the chain is enough. If you can’t articulate it simply, the strategy isn’t clear enough yet.
Define success in advance. What metrics will tell you the strategy is working? What metrics will tell you it isn’t? Set the thresholds before you start, not after you see the results. This is where most measurement frameworks fall down: the goalposts move after the fact to accommodate whatever happened.
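One way to keep the goalposts from moving is to write the thresholds down as data before launch and evaluate results against them mechanically. A minimal sketch, with entirely hypothetical metric names and numbers:

```python
# Hypothetical thresholds, agreed and written down before launch.
thresholds = {
    "new_customer_cac": {"target": 120.0, "abort_above": 180.0},  # £ per customer
    "payback_months":   {"target": 9,     "abort_above": 14},
}

def evaluate(metric, observed):
    """Compare an observed result against the pre-agreed thresholds."""
    t = thresholds[metric]
    if observed <= t["target"]:
        return "working"
    if observed <= t["abort_above"]:
        return "watch"
    return "not working"

print(evaluate("new_customer_cac", 150.0))  # prints "watch"
```

The value is not in the code, which is trivial, but in the commitment: because the thresholds were fixed in advance, a result of £150 per customer reads as "watch" no matter how tempting it is to reinterpret it after the fact.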
Build in review points. Quarterly at minimum. Return to the plan, test the assumptions, update what needs updating. Strategy is not a one-time event. It’s an ongoing discipline.
About the Author
Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.
