Adtech Systems Are Eating Your Budget From the Inside

Adtech systems are the infrastructure layer of modern paid media: the DSPs, SSPs, DMPs, CDPs, ad servers, and measurement platforms that sit between your budget and your audience. When they work well, they create speed, scale, and targeting precision that no manual operation could match. When they don’t, they quietly consume a significant portion of every pound or dollar you spend before a single impression reaches a real person.

Most senior marketers understand the individual components. Fewer understand how they interact, where the value leaks, and which parts of the stack are genuinely earning their place versus adding cost and complexity in exchange for the appearance of sophistication.

Key Takeaways

  • Adtech stacks compound cost at every layer: fees, data costs, and intermediary margins routinely consume 40-60% of programmatic budgets before a single impression is bought.
  • Most adtech complexity is sold to marketers, not built for them. Vendors benefit from opacity; your job is to pressure-test every component against actual business outcomes.
  • The measurement problem in adtech is structural, not technical. More tools do not produce more truth. They produce more data that requires more interpretation.
  • Lower-funnel adtech optimises for captured demand, not created demand. If your stack is entirely oriented around intent signals, you are fishing in a shrinking pond.
  • Stack rationalisation is a commercial exercise, not a technical one. The question is not which tools are best in class, but which combination produces the best return on total investment.

What Is an Adtech System, and Why Does the Architecture Matter?

An adtech system is the connected set of technology platforms that plan, buy, deliver, track, and optimise paid media. In a typical mid-to-large organisation, this includes a demand-side platform for programmatic buying, an ad server for trafficking and delivery, a data management or customer data platform for audience segmentation, a clean room or identity solution for privacy-compliant matching, and a measurement layer that might include attribution tools, media mix modelling, or both.

The architecture matters because each connection point is also a cost point, a data loss point, and a potential misalignment point. When I was running agency operations and overseeing hundreds of millions in ad spend across multiple verticals, one of the most consistent findings in any account audit was that clients had accumulated technology over time without ever rationalising it. They had three attribution tools producing three different answers, a DMP that was barely integrated with their DSP, and a first-party data asset that nobody had properly activated. The stack had grown in layers, each layer added for a legitimate reason at the time, but nobody had ever looked at the whole thing as a system.

If you are working through a broader growth strategy review, the Go-To-Market and Growth Strategy hub covers the commercial frameworks that should sit above any technology decision. Adtech is an enabler. It should never be the starting point.

Where the Money Goes Before It Reaches Your Audience

Programmatic advertising introduced extraordinary efficiency in some areas and extraordinary opacity in others. The supply chain between an advertiser’s budget and a publisher’s inventory passes through multiple intermediaries, each taking a margin. DSP fees, SSP fees, data costs, verification costs, ad serving costs, and brand safety tools all sit in that chain. In many programmatic campaigns, working media (the amount that actually buys impressions) is a minority of the total spend.

This is not a conspiracy. It is a structural feature of a system built by vendors who benefit from complexity. The answer is not to abandon programmatic, which at scale genuinely delivers reach and targeting that direct buys cannot match. The answer is to understand where your specific stack is extracting value versus delivering it.
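The compounding effect is easy to underestimate because each fee is taken from what remains after the previous one, not from the original budget. A rough sketch, using entirely hypothetical fee percentages (your real supply path will differ), shows how quickly layered costs erode working media:

```python
# Illustrative fee layers only; the percentages are hypothetical, not
# benchmarks. Each intermediary takes its cut from what remains after
# the previous one, so costs compound multiplicatively.
FEE_LAYERS = [
    ("DSP fee",              0.15),
    ("SSP fee",              0.15),
    ("Third-party data",     0.20),
    ("Verification",         0.03),
    ("Ad serving",           0.02),
    ("Brand safety tooling", 0.03),
]

def working_media_share(layers):
    """Fraction of the total budget left to buy actual impressions."""
    remaining = 1.0
    for _, fee in layers:
        remaining *= (1.0 - fee)
    return remaining

share = working_media_share(FEE_LAYERS)
print(f"Working media: {share:.0%} of total budget")
```

With these assumed percentages, only around half the budget survives as working media, which is why auditing each fee in isolation understates the problem: the layers multiply.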

Before you can do that, you need a clear picture of your current digital marketing infrastructure and how it maps to commercial outcomes. The digital marketing due diligence framework is a useful starting point for that kind of audit, particularly if you are inheriting a stack or evaluating one as part of a commercial review.

A few specific areas where budget tends to leak without obvious visibility:

  • Third-party data costs: Audience segments bought through a DMP or data marketplace are often priced at CPMs that significantly inflate effective media cost. Many of these segments perform no better than first-party or contextual alternatives.
  • Frequency mismanagement: Without a unified identity layer, the same user sees the same ad repeatedly across devices and environments, burning impressions and damaging brand perception simultaneously.
  • MFA inventory: Made-for-advertising sites continue to absorb programmatic spend through low-quality supply paths that pass basic brand safety checks but deliver zero commercial value.
  • Attribution inflation: Last-click and even multi-touch models tend to over-credit lower-funnel touchpoints, making retargeting and branded search look more efficient than they are and starving upper-funnel investment of budget it deserves.

The Performance Marketing Trap Inside Your Adtech Stack

Earlier in my career, I was as guilty as anyone of over-indexing on lower-funnel performance metrics. The numbers looked clean. ROAS was strong. Cost per acquisition was tracking well. It felt like control.

What I came to understand over time is that a significant portion of what lower-funnel adtech gets credited for was going to happen anyway. Someone who was already in-market, already aware of your brand, already close to a purchase decision, converts after seeing a retargeting ad. The ad server records a conversion. The attribution model awards credit. The ROAS number improves. But you did not create that demand. You captured it, and you paid for the privilege of capturing something that was already moving in your direction.

The clothes shop analogy has always stayed with me: someone who tries something on is many times more likely to buy than someone browsing the rail. That is not the adtech working. That is the natural purchase funnel. The question is whether your adtech is helping you reach the person at the rail, or just standing at the fitting room door collecting credit.

This matters for how you configure your stack. If your DSP is optimising purely against conversion signals, it will systematically deprioritise upper-funnel inventory in favour of retargeting pools and high-intent segments. Over time, your addressable audience shrinks, your CPMs on that audience rise, and growth flatlines. Market penetration requires reaching people who do not yet know they need you, and most adtech stacks are not set up to do that well.

This dynamic is particularly acute in sectors with longer sales cycles, like B2B technology or financial services, where the gap between brand exposure and measurable conversion can span months. If your measurement window is 30 days and your sales cycle is 90, your adtech is structurally incapable of telling you the truth about what is working.

First-Party Data and the Identity Problem

The deprecation of third-party cookies has been discussed so extensively that most marketers have become numb to it. But the underlying shift is real and the adtech implications are significant. The industry spent a decade building targeting infrastructure on top of a data layer that is now structurally weakening. The response has been a proliferation of identity solutions, clean rooms, cohort-based targeting, and contextual alternatives, each with its own vendor, its own integration requirements, and its own set of limitations.

The marketers who are handling this well are not the ones who have adopted the most sophisticated identity solution. They are the ones who invested early in first-party data collection, consent management, and CRM integration. They have something real to work with. Everyone else is buying data from someone who may or may not have collected it properly, through an intermediary who may or may not be passing it cleanly, into a system that may or may not be matching it accurately.

For B2B marketers in particular, this is an area where the gap between best practice and common practice is wide. If you are running paid media in financial services or professional services, the quality of your first-party data is likely to be a more significant competitive advantage than any DSP feature set. The B2B financial services marketing piece covers some of the specific constraints and opportunities in that sector, including how data strategy intersects with compliance requirements.

A practical step that most organisations underinvest in is a proper audit of what first-party data they actually have, how it is structured, and whether it is genuinely usable for media activation. Before evaluating any new adtech, run a company website analysis to understand what behavioural signals you are already collecting and whether your current stack is using them effectively. Most organisations are sitting on data they have not activated.

Measurement: More Tools, Not More Truth

When I judged the Effie Awards, one of the things that struck me most consistently was how few entries could demonstrate a credible connection between their media investment and business outcomes. Not because the work was not effective, but because the measurement frameworks were not built to show it. Brands were measuring clicks and conversions and viewability scores and reach, but the causal chain between those metrics and revenue was largely assumed rather than demonstrated.

Adtech has made this worse in some ways by multiplying the number of metrics available while making it harder to know which ones matter. Every platform produces its own attribution report. Every report tells a slightly different story. The temptation is to add another measurement tool to reconcile the discrepancy. That tool produces a third story.

The more commercially useful approach is to accept that measurement in advertising is always an approximation and to build a framework that is honest about its limitations rather than one that produces false precision. Media mix modelling gives you a macro view of channel contribution with a significant lag. Incrementality testing gives you cleaner causal evidence for specific decisions but requires volume and patience. Platform attribution gives you operational signal but systematically over-credits the platforms running it. Used together, with appropriate scepticism, they produce something workable. Used in isolation, any one of them will mislead you.

This is also relevant when evaluating performance-based commercial models. Pay per appointment lead generation arrangements, for example, require clear measurement of what constitutes a qualified outcome and who gets credit for it. The adtech stack underpinning those arrangements needs to be set up to answer that question cleanly, not to maximise the metric that benefits the vendor.

Endemic and Contextual Advertising: The Stack Configuration Most Teams Get Wrong

One of the areas where I see adtech stacks consistently underperform is contextual and endemic advertising. The industry spent years treating contextual as a fallback for when audience targeting was unavailable. That framing was always wrong, and it is increasingly untenable now that audience-based targeting is becoming less reliable.

Contextual advertising, placing ads in environments that are directly relevant to your product or service category, works for a different reason than audience targeting. It reaches people in the right mindset, not just people who match a demographic or behavioural profile. Endemic advertising takes this further by operating within category-specific media environments where the audience and the content are both aligned with your offer. The adtech configuration for this is fundamentally different from a broad programmatic buy: you are prioritising environment quality over audience scale, and you need different bidding logic, different creative specifications, and different measurement approaches to reflect that.

Most DSPs can handle contextual targeting, but the default optimisation settings are built for audience-based buying. If you plug a contextual strategy into a stack configured for retargeting and conversion optimisation, the algorithm will deprioritise your best placements in favour of cheaper inventory that scores better on its training data. You have to configure explicitly for the strategy you are running, not assume the platform will figure it out.

Rationalising the Stack: A Commercial, Not Technical, Exercise

I have been through stack rationalisation exercises at both the agency and client side, and the single biggest mistake is treating them as IT projects. Technology teams assess integration complexity, security compliance, and vendor support. Those things matter, but they are not the primary question. The primary question is: which combination of tools produces the best return on total investment, including the cost of the tools themselves, the cost of the people needed to operate them, and the opportunity cost of the complexity they introduce?

That framing changes the conversation significantly. A best-in-class DSP that requires three additional integrations, a dedicated trader, and a six-month onboarding process may produce worse total returns than a simpler platform that your existing team can operate effectively from day one. The marginal targeting capability of the more sophisticated tool is only valuable if you have the data quality, the creative production capacity, and the measurement infrastructure to take advantage of it.

For organisations operating across multiple business units, this complexity multiplies. A corporate adtech stack that tries to serve both brand and performance objectives across multiple product lines often ends up serving neither well. The corporate and business unit marketing framework for B2B tech companies addresses how to structure marketing operations at scale, including the technology governance questions that determine whether your stack is an asset or a liability.

A useful heuristic: if you cannot explain in plain language what each tool in your stack does and what it costs in total (licence, integration, and people), you are probably carrying dead weight. I have seen organisations still paying for DMPs they had stopped actively using eighteen months earlier, because nobody had formally reviewed the contract. The stack had become infrastructure by inertia rather than by design.
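The heuristic can be made mechanical. The figures below are invented, but the structure is the point: total cost of ownership per tool is licence plus integration plus people, and most organisations only ever look at the first of the three:

```python
# Hypothetical annual costs per tool (all figures invented). The point
# is the structure: TCO = licence + integration + people, not licence alone.
stack = {
    "DSP":         {"licence": 120_000, "integration": 20_000, "people": 90_000},
    "DMP":         {"licence":  80_000, "integration": 15_000, "people": 45_000},
    "Attribution": {"licence":  40_000, "integration": 10_000, "people": 30_000},
}

def total_cost(tool_costs):
    """Annual total cost of ownership for one tool."""
    return sum(tool_costs.values())

for name, costs in stack.items():
    print(f"{name}: {total_cost(costs):,} per year "
          f"(licence alone: {costs['licence']:,})")

print(f"Stack total: {sum(total_cost(c) for c in stack.values()):,}")
```

In this sketch the DSP's licence fee is barely half its true annual cost. If the plain-language justification for a tool cannot cover the full number, that tool is a rationalisation candidate.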

What Good Adtech Governance Looks Like

The early days at Cybercom taught me something that has stayed with me across every subsequent role. I was handed a whiteboard pen mid-brainstorm, in front of a room full of people expecting someone more senior, and the only real option was to think clearly and say something useful. Adtech governance requires the same disposition: clarity over sophistication, honesty about what you know and what you don’t, and a willingness to say when the emperor has no clothes.

Good governance means having a named owner for each platform in the stack, a defined review cadence, and a clear set of commercial criteria against which each tool is evaluated annually. It means your measurement framework is documented and agreed across marketing, finance, and commercial leadership, not just understood by the analytics team. It means your agency or trading desk partners can explain exactly where your money goes and what margin they are taking, not just what your ROAS was last month.

It also means being honest about what adtech can and cannot do. It can create efficiency at scale. It can improve targeting precision within the bounds of available data. It can automate decisions that would otherwise require significant manual resource. What it cannot do is replace strategic thinking about which audiences matter, what messages resonate, and whether you are building genuine demand or just harvesting existing intent. Those are human decisions, and no amount of machine learning changes that.

The reason go-to-market feels harder for many organisations right now is not that the tools have got worse. It is that the easy gains from audience targeting have compressed, the cost of data has increased, and the measurement environment has become more complex. The organisations that are pulling ahead are the ones treating adtech as one component of a coherent commercial strategy, not as the strategy itself.

If you want to think through the broader strategic context for any of these decisions, the Go-To-Market and Growth Strategy hub is where the commercial frameworks that sit above technology choices are covered in more depth. Adtech without strategy is just expensive plumbing.

About the Author

Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.

Frequently Asked Questions

What is an adtech system and what does it include?
An adtech system is the connected set of technology platforms used to plan, buy, deliver, track, and optimise paid media. It typically includes a demand-side platform (DSP) for programmatic buying, an ad server for campaign trafficking, a data management platform (DMP) or customer data platform (CDP) for audience segmentation, identity and clean room solutions for privacy-compliant matching, and a measurement layer covering attribution and media mix modelling. The architecture and integration quality of these components determine how much of your budget reaches working media versus being consumed by fees, data costs, and intermediary margins.
How much of a programmatic budget is typically lost to adtech fees?
The proportion varies significantly depending on the supply path, the data sources used, and the number of intermediaries involved. In complex programmatic setups, DSP fees, SSP fees, third-party data costs, ad verification, and brand safety tools can collectively consume a substantial share of total spend before a single impression is served. The most reliable way to understand your specific situation is to request a transparent cost breakdown from your DSP and trading partners, including all data and tech fees, and to compare working media as a percentage of total investment against industry benchmarks.
What is the difference between a DMP and a CDP in an adtech stack?
A data management platform (DMP) was designed primarily for anonymous, cookie-based audience segmentation used in programmatic advertising. It aggregates first, second, and third-party data to build targetable segments but typically does not store persistent individual-level records. A customer data platform (CDP) is built around persistent, identified customer profiles, integrating data from CRM, website behaviour, email, and other sources to create a unified view of individual customers. As third-party cookies decline, CDPs are increasingly central to adtech stacks because they enable first-party data activation at scale. The two platforms serve different purposes and many organisations need both, though the balance is shifting toward CDP-led architectures.
How should you measure the effectiveness of your adtech stack?
Effective measurement requires a combination of approaches rather than reliance on any single tool. Platform attribution provides operational signal but systematically over-credits the channels running the reports. Media mix modelling gives a macro view of channel contribution with a meaningful time lag. Incrementality testing through holdout experiments provides cleaner causal evidence for specific investment decisions. The most commercially honest approach is to triangulate across these methods, document the assumptions and limitations of each, and align on a measurement framework with finance and commercial leadership, not just the marketing analytics team. Adding more measurement tools does not produce more truth; it produces more data that requires more interpretation.
When should a business rationalise its adtech stack?
Stack rationalisation is worth undertaking when you have more than one tool performing overlapping functions, when your measurement outputs are inconsistent across platforms, when the total cost of the stack (including licence fees, integration resource, and people costs) is difficult to justify against demonstrable commercial outcomes, or when a change in strategy (such as a shift toward first-party data or a new market entry) means your existing configuration is no longer fit for purpose. The exercise should be led commercially, not technically. The question is not which tools are best in class in isolation, but which combination produces the best return on total investment given your specific objectives, data assets, and team capabilities.
