Digital Strategy Starts With a Business Problem, Not a Channel
Developing digital strategy means deciding how digital channels, data, and technology will help a business achieve specific commercial outcomes. It is not a channel plan, a content calendar, or a technology roadmap. It is the logic that connects what a business needs to grow with the digital levers most likely to move it.
Most businesses skip that logic entirely. They start with the channel (“we need to be on TikTok”) or the tool (“we’re investing in a CDP”) and work backwards to justify it. That is not strategy. That is procurement dressed up as planning.
Key Takeaways
- Digital strategy is the logic connecting business problems to digital solutions, not a list of channels or tools.
- Starting with audience behaviour and commercial goals before selecting channels produces more durable strategies than starting with platforms.
- Most digital strategies fail at execution because they lack clear ownership, realistic resource allocation, and measurable milestones.
- The gap between digital strategy and business results is almost always a measurement problem, not a creative or channel problem.
- A strategy that cannot be explained in plain English to a non-marketer is not finished yet.
In This Article
- What Does a Digital Strategy Actually Contain?
- How Do You Start Building a Digital Strategy From Scratch?
- Which Channels Should a Digital Strategy Include?
- How Do You Prioritise When Resources Are Limited?
- What Role Does Data Play in Digital Strategy Development?
- How Do You Build a Measurement Framework That Actually Works?
- What Makes Digital Strategies Fail at the Execution Stage?
- How Do You Keep a Digital Strategy Connected to the Business Over Time?
I have been in rooms where agencies presented 40-slide digital strategies that were, on closer inspection, media plans with a mission statement bolted to the front. I have also seen one-page strategy documents that were sharper than anything produced at twice the cost. The quality of a digital strategy has almost nothing to do with its length and everything to do with whether the people writing it started with the right questions.
What Does a Digital Strategy Actually Contain?
A properly constructed digital strategy contains four things: a clear statement of the business problem being solved, an account of how target audiences currently behave digitally, a set of channel and content choices grounded in that behaviour, and a measurement framework that connects activity to commercial outcomes.
That sounds simple. It rarely is in practice. The business problem is usually murkier than anyone admits. “We want to grow” is not a business problem. “We are losing market share among 35-to-50-year-old buyers in the South East because our consideration scores have dropped 12 points in two years” is a business problem. The second gives you something to solve. The first gives you nothing to work with.
Audience behaviour is where most strategies go thin. Teams spend time on demographic profiles and almost no time on what those people actually do online, where they go when they are in-market, what content they consume before making a decision, and where they drop out of the funnel. Without that, channel selection is guesswork with a budget attached.
If you want a broader frame for where digital strategy sits within commercial planning, the Go-To-Market and Growth Strategy hub covers the full picture, from positioning and market entry through to performance and retention.
How Do You Start Building a Digital Strategy From Scratch?
Start with the business, not the brief. Before any channel is discussed, you need three things on the table: what commercial outcome the business is trying to achieve, what is currently stopping it from achieving that outcome, and what role digital can realistically play in closing that gap.
When I took over as CEO at iProspect UK, the agency had been running at a loss. One of the first things I did was go back to first principles with the team: what are clients actually trying to achieve, and are we measuring our work against those outcomes or against proxy metrics that make us look busy? The answer, in too many cases, was the latter. Impressions, clicks, and rankings were being reported as success when the underlying business problem, whether that was revenue, leads, or customer acquisition cost, had not moved.
That experience shaped how I think about digital strategy development. The diagnostic phase is not optional. You cannot write a credible strategy without understanding the commercial context, and you cannot understand the commercial context without asking questions that most marketing teams are reluctant to ask because the answers are uncomfortable.
A practical starting framework looks like this:
- Business objective: What specific commercial outcome are we trying to achieve, and by when?
- Current baseline: Where are we starting from? What does performance look like today across revenue, acquisition, retention, and margin?
- Audience behaviour: How do our target customers currently behave digitally? Where do they discover, research, compare, and buy?
- Competitive landscape: What are competitors doing digitally, where are they investing, and where are the gaps?
- Resource reality: What budget, team capacity, and technology do we actually have to work with?
- Measurement framework: How will we know if this is working, and what are the milestones that tell us we are on track?
None of that is revolutionary. But the discipline of working through it rigorously, rather than jumping to channel selection because that is the comfortable part, separates strategies that hold up from ones that fall apart by Q2.
Which Channels Should a Digital Strategy Include?
The channels that belong in a digital strategy are the ones where your specific audience is active and where the economics of reaching them make commercial sense. That is the only honest answer. Everything else is opinion dressed up as best practice.
I spent time early in my career at lastminute.com, which was a genuinely fast-moving digital environment. One of the clearest lessons from that period was how quickly you could validate or invalidate a channel assumption if you were willing to test with real money and real data. We launched a paid search campaign for a music festival and saw six figures of revenue in roughly a day from a relatively straightforward setup. Not because paid search was inherently brilliant, but because the audience intent was there, the offer matched it, and the conversion path was clean. The channel worked because the conditions were right, not because someone had declared it a priority.
That experience stuck with me. Channel selection should follow audience behaviour and commercial logic, not industry trend reports. Looking at how high-growth businesses have approached channel strategy can be instructive, but the patterns only matter if they reflect how your audience actually behaves, not how someone else’s does.
A useful way to think about channel selection is across three roles: channels that build demand, channels that capture demand, and channels that retain and grow existing customers. Most digital strategies over-index on demand capture (paid search, retargeting, conversion-focused activity) because it is the easiest to measure and the quickest to show results. That is fine in the short term. Over time, it creates a business that is entirely dependent on in-market demand it did not generate, which is a fragile position to be in.
Demand-building channels, typically content, SEO, social, and brand-led activity, take longer to show returns and are harder to attribute cleanly. That does not make them less valuable. It makes them harder to defend in a quarterly review, which is a different problem.
How Do You Prioritise When Resources Are Limited?
Prioritisation is where most digital strategies quietly collapse. The strategy document says “we will invest in SEO, paid search, email, social, and content marketing.” The budget and team capacity say that is not realistic. Rather than make a hard call, the team spreads resources across everything and does none of it particularly well.
I saw this pattern repeatedly when running agencies. Clients would brief us on a full-funnel digital strategy, we would scope it properly, and then the actual budget would arrive at roughly 40% of what the scope required. The instinct on both sides was to try to do everything at reduced quality. The better answer, which was always a harder conversation to have, was to do fewer things properly.
A simple prioritisation test: for each channel or initiative in the strategy, ask what the expected return is, how long it will take to see that return, and what it will cost in budget and team time to execute it properly. Then rank them by the combination of impact, speed, and feasibility. The things that score well on all three get resourced first. The rest either wait or get cut.
This is not a sophisticated framework. It is common sense applied with discipline. The reason it gets skipped is that cutting things from a strategy feels like failure. It is not. It is the job. A strategy that tries to do everything is not a strategy; it is a wish list.
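The ranking test above is simple enough to sketch in a few lines of code. This is a minimal illustration, not a tool the article prescribes: the initiative names, the 1-to-5 scoring scale, and the equal weighting of the three factors are all assumptions for the sake of the example.

```python
from dataclasses import dataclass

@dataclass
class Initiative:
    name: str
    impact: int       # expected return, scored 1-5 (illustrative scale)
    speed: int        # how quickly the return arrives, 1-5 (5 = fastest)
    feasibility: int  # budget and team capacity to execute properly, 1-5

def prioritise(initiatives):
    """Rank initiatives by the combination of impact, speed, and feasibility.

    Equal weighting is an assumption; a team might weight impact more heavily.
    """
    return sorted(
        initiatives,
        key=lambda i: i.impact + i.speed + i.feasibility,
        reverse=True,
    )

# Hypothetical backlog for illustration only.
backlog = [
    Initiative("SEO content programme", impact=4, speed=2, feasibility=3),
    Initiative("Paid search expansion", impact=3, speed=5, feasibility=4),
    Initiative("Marketing automation rebuild", impact=4, speed=1, feasibility=2),
]

for item in prioritise(backlog):
    print(item.name, item.impact + item.speed + item.feasibility)
```

The things at the top of the ranked list get resourced first; the rest wait or get cut, exactly as the test describes.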
Forrester’s intelligent growth model makes a similar point: sustainable growth requires focus, not breadth. Spreading investment too thin across too many channels is one of the most common reasons digital strategies underperform against their stated objectives.
What Role Does Data Play in Digital Strategy Development?
Data informs strategy. It does not write it. That distinction matters more than most teams acknowledge.
When I started in marketing around 2000, data was scarce and the instinct was to get more of it. Now data is abundant and the instinct is still to get more of it, which suggests the problem was never really about data volume. The problem is interpretation: knowing what the numbers mean, what they do not mean, and what questions they cannot answer.
Analytics tools give you a perspective on reality. They are not reality itself. Last-click attribution tells you which channel got credit for a conversion. It does not tell you which channels influenced the decision to convert. Customer acquisition cost tells you what you paid to acquire a customer. It does not tell you whether that customer was worth acquiring. These distinctions sound pedantic until you make a significant budget decision based on a metric that was measuring the wrong thing.
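The CAC point can be made concrete with a back-of-envelope calculation. The figures and the simplified lifetime-value formula below are illustrative assumptions, not data from the article: they show how a channel can look acceptable on acquisition cost alone while acquiring customers who are not worth acquiring.

```python
def cac(total_acquisition_spend, customers_acquired):
    """Customer acquisition cost: what you paid per customer."""
    return total_acquisition_spend / customers_acquired

def ltv(avg_order_value, orders_per_year, gross_margin, retention_years):
    """A deliberately simple lifetime-value estimate:
    gross margin generated over the retention period."""
    return avg_order_value * orders_per_year * gross_margin * retention_years

# Hypothetical numbers for one channel.
acquisition_cost = cac(total_acquisition_spend=50_000, customers_acquired=500)
lifetime_value = ltv(avg_order_value=60, orders_per_year=2,
                     gross_margin=0.3, retention_years=2)

# A ratio below 1 means each customer costs more to acquire
# than the margin they ever generate.
print(lifetime_value / acquisition_cost)
```

Here CAC is 100 and the lifetime value is 72, so the channel is destroying value even though a CAC-only report would show nothing wrong. That is the budget decision made on a metric measuring the wrong thing.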
The data layer of a digital strategy should answer three questions: what is happening, why it is happening, and what we should do about it. Most analytics setups are good at the first, patchy on the second, and almost useless on the third. Closing that gap requires combining quantitative data with qualitative insight, which means talking to customers, running user research, and not treating a dashboard as a substitute for thinking.
Qualitative feedback tools can surface the kind of context that quantitative data misses entirely, particularly around why users behave the way they do on-site and where the friction points in a conversion experience actually sit.
How Do You Build a Measurement Framework That Actually Works?
A measurement framework that works connects digital activity to commercial outcomes through a clear chain of logic. It does not start with the metrics that are easiest to track. It starts with the outcome the business cares about and works backwards to identify which leading indicators reliably predict that outcome.
When I judged the Effie Awards, one of the things that separated the entries that won from the ones that did not was the quality of their measurement thinking. The winners could demonstrate a clear line between what they did and what changed commercially. The losers had impressive-looking results, but when you pressed on the logic, the connection between activity and outcome was assumed rather than demonstrated. A lot of digital marketing reporting has the same problem. The numbers look good. The business case is thin.
A functional measurement framework has four layers. First, business KPIs: revenue, customer acquisition cost, lifetime value, market share. These are the outcomes the strategy exists to move. Second, channel KPIs: the metrics that indicate whether each channel is performing its intended role, whether that is driving awareness, generating qualified leads, or converting in-market buyers. Third, diagnostic metrics: the data points that help you understand why channel KPIs are moving in a particular direction, things like quality score, engagement rate, or on-site conversion rate. Fourth, operational metrics: the activity and efficiency measures that tell you whether execution is on track.
The most common failure mode is treating diagnostic or operational metrics as business KPIs. When a marketing team reports that organic traffic is up 40%, that is a diagnostic metric. It becomes meaningful only when it is connected to something that matters commercially. On its own, it is interesting but not decisive.
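The four-layer separation can be sketched as a simple metric-to-layer mapping with a guard that keeps diagnostic and operational numbers out of commercial reporting. The metric names and values are illustrative assumptions, not a prescribed taxonomy.

```python
# Illustrative mapping of metrics to the four layers described above.
METRIC_LAYERS = {
    "revenue": "business",
    "customer_acquisition_cost": "business",
    "lifetime_value": "business",
    "qualified_leads": "channel",
    "organic_traffic": "diagnostic",
    "engagement_rate": "diagnostic",
    "campaigns_launched": "operational",
}

def board_report(metrics):
    """Keep only business KPIs for commercial reporting.

    Everything else is context for diagnosing why the KPIs moved,
    not an outcome in its own right.
    """
    return {name: value for name, value in metrics.items()
            if METRIC_LAYERS.get(name) == "business"}

# Hypothetical monthly snapshot.
snapshot = {"revenue": 1_200_000, "organic_traffic": 84_000,
            "engagement_rate": 0.041}
print(board_report(snapshot))
```

Only revenue survives the filter; the 40% traffic rise stays in the diagnostic layer, where it explains movement rather than claiming to be the result.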
What Makes Digital Strategies Fail at the Execution Stage?
Most digital strategies that fail do not fail because the strategy was wrong. They fail because the conditions required to execute it were never in place. That includes clear ownership of each workstream, realistic timelines, sufficient budget, and the organisational alignment to make decisions quickly when things do not go to plan.
I grew iProspect UK from around 20 people to over 100 during my time there. One of the consistent challenges at every stage of that growth was the gap between what we planned and what we could actually deliver with the team and systems we had. The strategies were sound. The execution capacity was the constraint. Learning to build strategies that matched execution reality, rather than theoretical best practice, was one of the more useful things I did as a leader.
There are a few specific failure patterns worth naming. The first is the strategy that never gets operationalised. It sits in a deck, gets presented to the board, and then everyone goes back to doing what they were doing before. The second is the strategy that gets operationalised but never reviewed. Markets change, audience behaviour shifts, and a strategy that was right in January may need significant adjustment by June. The third is the strategy that is reviewed but never actually changed, because the organisation has too much invested in the original plan to admit it needs updating.
Agile principles applied to marketing strategy are worth understanding here, not as a methodology to follow rigidly, but as a reminder that strategy should be a living document, not a fixed plan. The businesses that execute digital strategy well tend to be the ones that review it regularly, make adjustments based on evidence, and treat the original plan as a starting point rather than a commitment.
Execution also requires the right tools in the right places. Knowing which tools support which parts of a digital strategy matters, but tool selection should follow strategic need, not the other way around. Buying a platform before you have clarity on what problem it is solving is one of the more expensive mistakes a marketing team can make.
How Do You Keep a Digital Strategy Connected to the Business Over Time?
The connection between digital strategy and business performance tends to erode over time, not because the strategy stops being relevant, but because the review cadence gets deprioritised, the team gets absorbed in execution, and the original strategic logic gets forgotten. Keeping that connection alive requires deliberate effort.
A quarterly strategy review is the minimum. It should cover whether the business objectives the strategy was built around have changed, whether the audience behaviour assumptions still hold, whether the channel mix is delivering against its intended role, and whether the measurement framework is still capturing the right things. That is a two-hour conversation, not a two-day offsite. The discipline is in doing it consistently, not elaborately.
The other thing that keeps a strategy grounded is maintaining a direct line between the marketing team and commercial performance data. Not just marketing metrics, but revenue, margin, customer lifetime value, and churn. When marketing teams operate in isolation from those numbers, the strategy drifts towards optimising for things that are easy to measure rather than things that matter commercially. That drift is slow and almost invisible until it becomes a problem.
Growth-focused marketing thinking has its uses here, particularly the emphasis on testing, iteration, and connecting activity to commercial outcomes. The risk is treating it as a set of tactics rather than a mindset. The mindset is what matters: stay close to the data, test assumptions, and be willing to change course when the evidence points in a different direction.
For teams working through how digital strategy connects to broader go-to-market planning, the full range of frameworks and thinking around growth strategy and market positioning is worth exploring alongside the channel-specific work.
About the Author
Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.
