ROAS-Driven Creative Strategy Is Killing Your Brand

ROAS-driven creative strategy is the dominant framework in performance agencies right now, and it is producing a generation of ads that perform brilliantly in the short term and accomplish almost nothing over time. When every creative decision is filtered through what the last campaign’s return on ad spend data says, you stop making advertising and start making optimised content that slowly erodes the thing that makes advertising work: distinctiveness.

The problem is not that ROAS is a bad metric. The problem is that agencies have let it become a creative brief. Those are not the same thing, and conflating them is costing clients more than most performance dashboards will ever show.

Key Takeaways

  • ROAS measures what happened after a creative decision. It cannot tell you what creative to make next. Agencies that use it as a brief are working backwards.
  • Short-term performance data systematically favours demand capture over demand creation, which means ROAS-optimised creative tends to harvest existing intent rather than build new audiences.
  • When creative strategy is owned entirely by performance teams, brand distinctiveness erodes quietly. The decline rarely shows up in weekly ROAS reports until it is already a serious problem.
  • The agencies doing this well are not ignoring performance data. They are using it to answer execution questions, not strategic ones. The distinction matters enormously.
  • Clients share responsibility here. If you brief an agency on ROAS and nothing else, you will get ROAS and nothing else. The brief shapes the strategy whether you intend it to or not.

Why Agencies Default to ROAS as a Creative Framework

It is worth being honest about why this happens before criticising it. ROAS is clean. It is reportable. It gives account teams something to show in a monthly review that looks like evidence of competence. When I was running agencies, I understood the commercial pressure behind it. Clients want numbers. Numbers need to go up. ROAS goes up when you optimise creative against what is already converting, so you optimise creative against what is already converting. The logic is internally consistent even if the outcome is strategically damaging.

The deeper issue is structural. Most performance agencies are not set up to think about brand. Their planning functions, if they exist at all, are oriented around media efficiency. Their creative teams, if they have them, are producing assets at volume rather than working through a strategic brief. And their reporting cycles are weekly or monthly, which is exactly the wrong cadence for measuring anything that brand advertising does. So ROAS fills the vacuum. It is available, it is legible, and it creates the appearance of rigour even when the underlying creative thinking is absent.

I have sat in enough agency new business pitches on both sides of the table to know that this is rarely a cynical choice. Most performance agencies genuinely believe that optimising for conversion metrics is the same as optimising for business outcomes. It is not, but the distinction is subtle enough that it gets lost in the pressure of day-to-day account management.

What ROAS Can and Cannot Tell You About Creative

ROAS tells you the ratio of revenue to ad spend over a given period. That is genuinely useful information. It tells you that a campaign generated returns above or below a threshold. What it cannot tell you is why, which creative elements drove that outcome, whether the same approach will work at higher spend, whether it is building or eroding brand equity, or what you should make next.
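The point is easy to see when the metric is written down. Here is a minimal sketch, with entirely hypothetical figures, of how little the ratio itself contains:

```python
# Illustrative sketch only - all figures are hypothetical.
# ROAS is a backward-looking ratio: revenue attributed to a campaign
# divided by what the campaign cost. Nothing more is encoded in it.

def roas(revenue: float, ad_spend: float) -> float:
    """Return on ad spend over a given period."""
    if ad_spend <= 0:
        raise ValueError("ad spend must be positive")
    return revenue / ad_spend

# Two hypothetical campaigns with identical ROAS...
campaign_a = roas(revenue=50_000, ad_spend=10_000)    # 5.0
campaign_b = roas(revenue=500_000, ad_spend=100_000)  # 5.0

# ...and the number cannot distinguish which creative elements drove
# the result, whether either holds at higher spend, or what to make next.
print(campaign_a, campaign_b)  # 5.0 5.0
```

Two campaigns can share a ROAS of 5.0 and have nothing else in common, which is exactly why the metric cannot function as a creative brief.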

A few years ago I was reviewing a proposal from a large network agency that was very excited about an AI-driven personalised creative solution. The pitch was that their system had delivered a 90% reduction in CPA and a significant uplift in conversion rate for a retail client. The numbers were presented as proof that AI-driven creative was the future. When I pushed on the baseline, it turned out the client had been running the same static banner creative for eighteen months with no refresh. The AI system had replaced genuinely stale, poorly constructed ads with something marginally more relevant. Of course performance improved. That is not AI success. That is a low baseline doing most of the work, and dressing it up as a technology breakthrough is the kind of thing that gets agencies into trouble with clients who eventually work out what actually happened.

The same logic applies to ROAS optimisation. If your creative has been poor and you tighten the feedback loop against conversion data, performance will improve. That improvement will be attributed to the ROAS-driven approach. But you have not discovered a better creative strategy. You have corrected for a previous failure, and the correction has a ceiling that the data will not warn you about until you hit it.
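The low-baseline effect is simple arithmetic. A sketch, with hypothetical numbers, of why a headline percentage reduction says more about the starting point than the method:

```python
# Hypothetical illustration of the low-baseline effect described above.
# A 90% CPA reduction sounds like a breakthrough, but the relative figure
# depends entirely on how bad the baseline was.

def cpa_reduction(old_cpa: float, new_cpa: float) -> float:
    """Percentage reduction in cost per acquisition."""
    return (old_cpa - new_cpa) / old_cpa * 100

# Creative left unrefreshed for eighteen months (hypothetical figures):
stale_baseline = cpa_reduction(old_cpa=200.0, new_cpa=20.0)   # 90.0%
# Reaching the same end point from a healthy baseline:
healthy_baseline = cpa_reduction(old_cpa=25.0, new_cpa=20.0)  # 20.0%

# Same final CPA in both cases. The dramatic number is the baseline's
# doing, not the optimisation method's.
```

Both scenarios land on the same cost per acquisition; only the story they let you tell differs.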

If you are building a broader content and editorial framework around your performance activity, the Content Strategy and Editorial hub covers how these decisions connect across channels and planning horizons.

The Demand Capture Problem

There is a structural bias built into ROAS optimisation that almost nobody talks about openly. When you optimise creative against conversion data, you are by definition optimising for the people who were already close to buying. You are capturing demand that existed. You are not creating it.

This is not a new observation. The tension between brand building and performance marketing has been a live debate for years, and there is a reasonable body of thinking, from practitioners like Les Binet and Peter Field among others, around how the two work differently across time horizons. What is new is the degree to which ROAS has become the single lens through which creative decisions are made at agencies that are managing significant budgets for major clients.

The practical consequence is that ROAS-optimised creative tends to converge on a narrow set of formats, messages, and audience segments that are already in-market. It works extremely well for the bottom of the funnel. It does almost nothing for the top. And because the top of the funnel is where brand equity, future demand, and pricing power are built, the long-run cost of ignoring it is substantial. It just does not show up in this quarter’s numbers, which is why it keeps happening.

When I grew the iProspect team from around 20 people to over 100 and moved the business from loss-making to one of the top five agencies in its category, one of the things that made that possible was being honest with clients about what performance media could and could not do. Clients who understood that distinction made better decisions about budget allocation. Clients who were told that ROAS was the whole answer tended to cut brand spend, see short-term gains, and then wonder why new customer acquisition was getting progressively more expensive. The pattern was consistent enough that it stopped being a surprise.

How Creative Strategy Gets Corrupted by Performance Data

The corruption happens gradually and it is almost always well-intentioned. A campaign runs. Some ads perform better than others. The agency pulls the data, identifies the top performers, and recommends that future creative follows those patterns. This is presented as evidence-based creative strategy. It is actually creative strategy constrained by a sample of past behaviour, which is a very different thing.

The problem is that past performance data cannot tell you about creative territory you have not explored. If you have only ever run direct response ads with promotional messaging, your data will tell you which promotional messages convert best. It will not tell you that a different creative approach, something more distinctive, more emotionally resonant, more brand-building, might outperform all of them. The data has no visibility on what you have not tested, and agencies that are optimising against existing data are systematically excluding the creative options they know least about.

I have judged the Effie Awards, which are specifically designed to recognise marketing effectiveness rather than creative craft for its own sake. What strikes you when you read through the case studies is how often the work that drives the largest commercial outcomes is work that took a creative risk the data would not have supported in advance. The brands that win Effies are not the ones that ran the most ROAS-optimised campaigns. They are the ones that had a clear strategic point of view and executed it with enough consistency and distinctiveness that it built something durable. Those are not the same creative decisions, and the Effie process is useful precisely because it forces you to think about outcomes over a meaningful time horizon rather than a reporting cycle.

A well-structured content strategy, including how editorial decisions connect to performance objectives, is covered in more depth across The Marketing Juice content strategy section. The principles that apply to editorial also apply to paid creative, more often than either side of the industry tends to acknowledge.

What a Better Framework Actually Looks Like

The agencies doing this well are not ignoring ROAS. They are using it to answer a specific and limited set of questions: which executions are converting, at what cost, against which audiences. Those are execution questions. They are useful. They should inform media decisions, bidding strategy, and audience targeting. They should not be the primary input into creative strategy.

Creative strategy belongs upstream of performance data. It should be driven by a clear understanding of the audience, the brand’s position in the market, the competitive landscape, and the specific commercial problem being solved. Performance data then tests executions of that strategy, not the strategy itself. The distinction is between using data to validate creative thinking and using data to replace it.

In practice, this means having a creative strategy that is written down and agreed before the campaign goes live, with clear hypotheses about what it is trying to achieve and how success will be measured across different time horizons. It means having someone in the room, whether that is a strategist, a planner, or a senior creative, who is responsible for the integrity of the creative thinking and who has the standing to push back when performance optimisation is pulling creative in a direction that undermines the strategy. And it means having honest conversations with clients about what the data can and cannot tell you.

That last part is harder than it sounds. Clients are under pressure. They want numbers. Telling a client that their ROAS is strong but their brand health is declining requires a level of trust and a quality of relationship that takes time to build. It also requires the agency to have the commercial confidence to say something the client does not necessarily want to hear. Not every agency has that, and not every client relationship makes it possible.

The Brief Is Where This Gets Fixed or Made Worse

Clients carry more responsibility here than the industry conversation usually acknowledges. If you brief an agency on ROAS targets and nothing else, you will get a ROAS-driven creative strategy. The brief shapes the work. If the brief contains no strategic direction, no brand context, no audience insight, and no indication of what success looks like beyond a conversion metric, the agency will fill those gaps with whatever data they have available. That data is almost always performance data. So the brief becomes the problem.

A better brief specifies the business problem, not just the performance target. It provides context about the audience beyond demographic and behavioural data. It gives the agency a clear sense of what the brand stands for and what creative territory is consistent with that. And it sets expectations about measurement that go beyond weekly ROAS, including some acknowledgement that brand-building activity operates on a different time horizon and needs to be evaluated differently.

This is not complicated in principle. It is difficult in practice because it requires clients to have done strategic work before they brief the agency, and it requires agencies to push back when they receive a brief that is too narrow to produce good creative strategy. Both of those things require a different kind of relationship than the transactional model that most performance agency engagements operate on.

The MarketingProfs framework for B2B content strategy is one of the clearer articulations I have seen of how to connect strategic objectives to execution decisions across a campaign. The principle applies equally well to paid creative briefs, particularly the discipline of starting with the audience problem rather than the channel or format.

The Measurement Problem That Nobody Wants to Solve

Part of why ROAS dominates creative strategy is that it is easy to measure. Brand equity, creative distinctiveness, and long-run demand creation are harder to measure and operate on longer time horizons. Agencies are typically evaluated on monthly or quarterly performance, which means the metrics that are easiest to report within that cycle get the most attention. This is a measurement problem as much as it is a creative problem.

The honest answer is that marketing does not need perfect measurement. It needs honest approximation. You can track brand search volume as a proxy for brand health. You can run periodic brand tracking studies. You can monitor new customer acquisition costs over time as an indicator of whether you are building or depleting the pool of potential buyers. None of these are perfect. All of them are more useful than pretending that ROAS captures everything that matters.
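One of those approximations is trivial to automate. A minimal sketch, with hypothetical data and a hypothetical threshold, of tracking new customer acquisition cost for sustained drift:

```python
# A minimal sketch of the "honest approximation" idea: watch new-customer
# acquisition cost over time and flag sustained upward drift, which can
# indicate the pool of future buyers is being depleted.
# The series and the interpretation threshold are hypothetical.

quarterly_cac = [42.0, 43.5, 47.0, 52.5, 58.0]  # CAC per new customer, by quarter

def cac_drift(series: list[float]) -> float:
    """Average quarter-on-quarter change in CAC."""
    deltas = [b - a for a, b in zip(series, series[1:])]
    return sum(deltas) / len(deltas)

drift = cac_drift(quarterly_cac)  # 4.0 per quarter on this data
if drift > 0:
    print(f"CAC rising by {drift:.1f} per quarter on average - "
          "worth reviewing brand investment alongside ROAS")
```

It is a crude indicator, which is the point: a crude indicator of something ROAS cannot see beats a precise measure of something it can.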

The agencies that I have seen do this well are the ones that have a measurement framework that explicitly acknowledges what each metric is and is not measuring. They use ROAS for what it is good at. They use other indicators for what ROAS cannot see. And they are honest with clients when the data is ambiguous rather than constructing a narrative that makes the numbers look cleaner than they are.

Wistia’s thinking on targeting a niche audience in brand content strategy is a useful counterpoint to the broad-reach optimisation that ROAS-driven campaigns tend to default to. The argument for specificity in audience and message has implications for creative strategy that go well beyond video content.

For teams thinking about how to structure content decisions alongside paid creative, the Moz guide on pillar pages is worth reading for how it frames the relationship between strategic architecture and execution. The planning discipline it describes translates reasonably well to campaign creative hierarchies.

What Clients Should Ask Their Agencies

If you are a client and you suspect your agency’s creative strategy is more ROAS-driven than it should be, there are a few questions worth asking. Not to catch anyone out, but to open a conversation that probably needs to happen.

  • Ask where the creative brief comes from. If the answer is primarily performance data from the previous campaign, that is worth exploring.
  • Ask what the strategy is trying to achieve beyond conversion in the current period. If there is no clear answer, that is informative.
  • Ask how the agency is thinking about creative distinctiveness, and whether there is any process for testing creative territory that the data would not have suggested.
  • Ask what the measurement framework looks like beyond ROAS, and whether any indicators of brand health are being tracked alongside conversion metrics.

These are not hostile questions. They are the questions that a commercially serious client should be asking of any agency managing significant creative investment. The answers will tell you a great deal about whether you have a creative strategy or a conversion optimisation programme dressed up as one.

The Semrush overview of AI in content strategy is a reasonable primer on where data-driven tools are genuinely useful in creative and content decisions, and where the limitations are. The distinction it draws between using AI for efficiency and using it for strategy is one that applies directly to the ROAS question.

The Content Marketing Institute’s approach to content strategy is also worth understanding as a reference point for how strategic thinking about content is supposed to work before execution decisions are made. The sequencing matters, and it is the sequencing that ROAS-driven approaches tend to collapse.

About the Author

Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.

Frequently Asked Questions

What is ROAS-driven creative strategy?
ROAS-driven creative strategy is an approach where creative decisions, including formats, messages, and audience targeting, are primarily shaped by return on ad spend data from previous campaigns. Rather than starting with a strategic brief, the creative brief is effectively reverse-engineered from performance metrics. It is common in performance agencies and produces reliable short-term conversion results, but tends to underinvest in brand building and long-run demand creation.
Why does ROAS-driven creative hurt brand building?
ROAS optimisation systematically favours creative that converts existing demand. It rewards direct response formats, promotional messaging, and audiences that are already close to purchasing. Brand building works on a different mechanism, creating distinctiveness, emotional associations, and future demand among people who are not yet in-market. ROAS data has no visibility on these audiences, so creative optimised against it tends to ignore them entirely. Over time, this depletes the pool of future buyers and makes new customer acquisition progressively more expensive.
How should agencies use performance data in creative strategy?
Performance data is most useful as an execution tool rather than a strategic one. It should inform decisions about which creative executions are working, at what cost, and against which audiences. It should not be the primary input into what the creative strategy is or what the brand should say. Creative strategy belongs upstream of performance data, driven by audience insight, brand positioning, and a clear articulation of the commercial problem being solved. Performance data then tests executions of that strategy rather than replacing the strategic thinking.
What should a client brief include to avoid ROAS-only creative?
A brief that produces better creative strategy should specify the business problem, not just the performance target. It should include audience context beyond behavioural and demographic data, a clear articulation of the brand’s position and what creative territory is consistent with it, and a measurement framework that acknowledges different time horizons. Clients who brief agencies on ROAS targets and nothing else will receive ROAS-optimised creative and nothing else. The brief shapes the strategy whether that is the intention or not.
How can you measure brand health alongside ROAS?
Several indicators can sit alongside ROAS to give a more complete picture of what a campaign is doing. Brand search volume is a reasonable proxy for brand awareness and consideration. New customer acquisition costs tracked over time indicate whether the brand is building or depleting future demand. Periodic brand tracking studies, even simple ones, capture awareness, consideration, and association metrics that ROAS cannot see. None of these are perfect measures, but together they provide an honest approximation of brand health that is more useful than relying on conversion data alone.
