Account-Based Marketing Campaigns That Close Enterprise Deals
Account-based marketing campaign examples tend to fall into two camps: either they’re so abstract they’re useless, or they’re so brand-specific you can’t extract anything transferable. The best ABM campaigns share a common structure: a defined account list, a clear value proposition mapped to each account’s commercial reality, and coordinated outreach across channels that builds familiarity before it asks for anything.
What separates ABM that closes deals from ABM that merely generates activity is specificity. Not personalisation for its own sake, but genuine relevance: a message that reflects what the account actually cares about commercially, not just its logo dropped into a banner ad.
Key Takeaways
- ABM works when account selection is rigorous. A weak account list makes every campaign downstream less effective, regardless of creative quality or channel mix.
- The most effective ABM campaigns build familiarity before they ask for a meeting. Sequencing matters more than individual touchpoints.
- Personalisation has a point of diminishing returns. Tier your accounts and match personalisation depth to commercial opportunity, not to what’s technically possible.
- ABM is not a replacement for demand generation. It works best when it runs alongside broader market activity, not instead of it.
- Most ABM underperforms because it’s treated as a campaign type rather than a go-to-market motion. Sales and marketing alignment isn’t optional; it’s structural.
In This Article
- What Makes an ABM Campaign Worth Studying?
- Campaign Example 1: Tiered ABM for a B2B SaaS Company Entering Enterprise
- Campaign Example 2: Event-Led ABM for a Professional Services Firm
- Campaign Example 3: Content-Led ABM with Intent Data
- Campaign Example 4: Partner-Led ABM for Market Expansion
- Campaign Example 5: Re-Engagement ABM for Dormant Enterprise Accounts
- The Account Selection Problem Most ABM Programmes Get Wrong
- How to Think About Measurement Without False Precision
I’ve run marketing teams across multiple agency environments and managed campaigns spanning 30 industries. The pattern I’ve seen with ABM is consistent: the organisations that get results treat it as a go-to-market discipline, not a campaign format. The ones that struggle treat it like a fancier version of email marketing with better targeting. The examples in this article are built around that distinction.
What Makes an ABM Campaign Worth Studying?
Before running through specific campaign examples, it’s worth being clear about what we’re evaluating. A campaign is worth studying if it produced a measurable commercial outcome, if the approach is transferable to other contexts, and if it reveals something about how buyers actually behave rather than how we’d like them to behave.
A lot of ABM case studies are really just personalisation theatre. A company sends a prospect a box of branded coffee with a handwritten note, the prospect posts about it on LinkedIn, and it gets written up as a success. Whether it contributed to revenue is rarely the headline. I’ve judged the Effie Awards, where effectiveness is the entire point of the evaluation, and the discipline that process demands is a useful lens for any marketing investment. Did it drive a business outcome? Can you show the chain of causality? If not, it’s an anecdote, not a case study.
The examples below are structured around what the campaign was trying to achieve, how it was built, and what the commercial logic was. The tactics are secondary to the thinking.
If you’re building out a broader go-to-market approach, the Go-To-Market and Growth Strategy hub covers the wider strategic context that ABM sits within, including how to sequence market entry, structure commercial objectives, and think about channel mix across the full funnel.
Campaign Example 1: Tiered ABM for a B2B SaaS Company Entering Enterprise
A mid-market SaaS business had solid traction with companies under 500 employees but wanted to move upmarket. The product could genuinely serve enterprise accounts, but the sales motion was entirely inbound and the brand had no presence in enterprise buying conversations.
The campaign was structured in three tiers. Tier one covered 20 named accounts with the highest revenue potential. These received fully customised outreach: account-specific research decks, direct mail, executive-level LinkedIn engagement, and bespoke content mapped to the specific operational challenges each account was publicly discussing. Tier two covered 80 accounts with high fit scores but lower immediate potential. These received personalised but templated outreach, with account-specific landing pages and sequenced email and LinkedIn touchpoints. Tier three covered 300 accounts that matched the ICP but weren’t prioritised for direct outreach. These received targeted paid social and display, with content designed to build category awareness.
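As an illustration of the tiering rule (not the company’s actual criteria), the split can be expressed as a simple function of fit score and estimated revenue potential. The thresholds, field names, and account data below are hypothetical:

```python
# Illustrative tier assignment for an ABM account list.
# Thresholds and field names are assumptions, not from the campaign.

def assign_tier(fit_score: float, revenue_potential: float) -> int:
    """Return tier 1, 2, or 3 from a fit score (0-1) and estimated revenue potential."""
    if fit_score >= 0.8 and revenue_potential >= 500_000:
        return 1  # fully customised outreach: research decks, direct mail, bespoke content
    if fit_score >= 0.7:
        return 2  # personalised but templated outreach with account-specific landing pages
    return 3      # awareness-layer paid social and display only

accounts = [
    {"name": "Acme Corp", "fit": 0.90, "potential": 750_000},
    {"name": "Globex",    "fit": 0.75, "potential": 200_000},
    {"name": "Initech",   "fit": 0.60, "potential": 400_000},
]
for a in accounts:
    a["tier"] = assign_tier(a["fit"], a["potential"])
# Acme Corp -> tier 1, Globex -> tier 2, Initech -> tier 3
```

The point of making the rule explicit is that it forces the uncomfortable decision the article keeps returning to: which accounts you are deliberately not pursuing with direct outreach.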
The critical design decision was sequencing. Tier one accounts weren’t approached cold. The campaign ran four weeks of paid social and content distribution to those accounts before any direct outreach began. By the time sales made first contact, the accounts had seen the brand multiple times in a relevant context. That familiarity changed the nature of the first conversation. It wasn’t a cold call, it was a follow-up to something the prospect had already encountered.
This connects to something I’ve come to believe strongly after years in performance marketing: a lot of what gets credited to lower-funnel activity was going to happen anyway. The person who searches your brand name and converts was already in market. The harder and more valuable work is creating the conditions that put you in their consideration set before they start searching. ABM at tier one is exactly that kind of work.
Campaign Example 2: Event-Led ABM for a Professional Services Firm
A professional services firm with a strong regional reputation wanted to build relationships with a specific set of 40 CFOs at target accounts. The challenge was that CFOs don’t respond to standard B2B outreach. Their inboxes are filtered, their time is genuinely scarce, and they’ve seen every variation of the “quick call” email.
The campaign was built around a private dinner series. Not a webinar, not a virtual roundtable, an actual dinner. The firm identified a genuine topic that CFOs in their target sector were wrestling with, brought in two credible external speakers with no commercial agenda, and invited eight CFOs to each event. No sales pitch, no product demo, no branded merchandise. Just a well-run conversation between peers, facilitated by the firm.
The pre-event campaign was where the ABM mechanics came in. Each invitee received a personalised letter, not an email, referencing specific things about their business context. The firm’s team spent time on LinkedIn engaging with each CFO’s content in the weeks before invitations went out. The goal was to be a familiar name, not a stranger, by the time the invitation arrived.
Three dinner events over six months generated 11 active pipeline conversations with tier one accounts. None of those conversations started with a sales pitch. They started because the CFO had experienced the firm’s thinking in a context that felt genuinely useful rather than commercially motivated. The commercial conversation came later, and it came from the prospect, not the seller.
The lesson here isn’t that you need to run dinners. It’s that the format of the engagement should match the audience’s actual preferences and constraints, not what’s convenient to execute at scale. Vidyard’s research on why GTM feels harder touches on exactly this tension: buyers have more information and more noise to filter, which means the bar for earning attention has risen significantly.
Campaign Example 3: Content-Led ABM with Intent Data
A technology company selling infrastructure software used intent data to identify accounts that were actively researching their category. The intent signals were imperfect, as they always are, but they provided a useful filter for prioritising outreach effort.
The campaign logic was straightforward: if an account is already researching the problem your product solves, the conversation you need to have is different from the one you’d have with an account that doesn’t yet recognise the problem. The content strategy was split accordingly. Accounts showing high intent received content that assumed category awareness and focused on differentiation and proof. Accounts showing low or no intent received content designed to frame the problem and create the conditions for a later conversation.
The distribution was primarily LinkedIn Matched Audiences and programmatic display, with content sequenced based on engagement signals. Accounts that engaged with problem-framing content were moved into a more direct outreach sequence. Accounts that didn’t engage stayed in the awareness layer.
What made this campaign work wasn’t the intent data itself. Intent data is a directional signal, not a buying signal, and treating it as the latter leads to a lot of premature outreach that burns goodwill. What made it work was using the data to sequence content intelligently rather than to trigger immediate sales contact. The sales team only received accounts for outreach when those accounts had demonstrated engagement across multiple touchpoints. The quality of conversations improved significantly because the outreach was timed better.
This is a pattern BCG has written about in the context of commercial transformation: the organisations that grow faster tend to be better at sequencing commercial effort, not just at increasing volume of outreach.
Campaign Example 4: Partner-Led ABM for Market Expansion
A company expanding into a new vertical had no existing relationships and no brand recognition in the target market. Cold outreach at scale would have been expensive and low-converting. Instead, the approach was to identify three partners (a technology vendor, an industry association, and a specialist consultancy) that already had trusted relationships with the target accounts.
The ABM campaign ran through those partner channels. Co-branded content, joint webinars, and introductions through existing relationships replaced cold outreach entirely for the top 50 accounts. The partners’ credibility transferred, at least partially, to the new entrant. The accounts that engaged were already warm because the introduction came from a trusted source.
I’ve seen this dynamic work in agency contexts too. When I was building out a new practice area at one of the agencies I ran, the fastest route to credibility wasn’t a campaign about our new capability. It was getting existing clients to talk about results in a context where prospective clients were present. The commercial logic is the same as partner-led ABM: borrowed trust is still trust, and it converts faster than trust built from scratch.
The mechanics of this campaign included co-developed research reports shared with target accounts, joint speaking slots at two industry events, and a referral structure that incentivised partners to make warm introductions. The outreach from the company’s own sales team only began after a partner introduction had been made. The conversion rate from first meeting to second meeting was substantially higher than anything the company had seen from direct outreach in other markets.
For a broader view of how market penetration strategy connects to channel decisions like this, Semrush’s breakdown of market penetration is a useful reference point for thinking about the options available when entering a new segment.
Campaign Example 5: Re-Engagement ABM for Dormant Enterprise Accounts
Not all ABM targets are new prospects. One of the most commercially efficient ABM applications is re-engaging enterprise accounts that had a relationship with the business in the past but have gone quiet. These accounts already know the company, retain some baseline of trust, and are often in a different commercial situation from the one they were in when the relationship ended.
A manufacturing technology company ran a re-engagement campaign targeting 60 accounts that had either evaluated but not purchased, or had purchased at a small scale and then disengaged. The campaign started with research: understanding what had changed at each account since the last interaction, who the relevant stakeholders were now, and what the account’s current priorities appeared to be based on publicly available signals.
The outreach was honest about the history. Rather than pretending the relationship was starting fresh, the messaging acknowledged the previous interaction and framed the re-engagement around what had changed, either in the product, in the market, or in the account’s own situation. That honesty was disarming. It signalled that the company had done its homework and wasn’t just running a spray-and-pray reactivation campaign.
The campaign generated pipeline from 14 of the 60 accounts within three months. Several of those accounts had been written off internally as lost opportunities. The cost per opportunity was a fraction of what new account acquisition cost because the trust foundation, however thin, already existed.
There’s a useful parallel here to how I think about performance marketing more broadly. Early in my career I overweighted lower-funnel activity because the attribution was clear and the results were immediate. What I came to understand over time is that a lot of that activity was capturing intent that already existed, not creating new demand. Re-engagement ABM is the same trap in reverse: it looks efficient because conversion rates are higher, but if you’re only re-engaging dormant accounts and not building new pipeline, you’re drawing down an asset rather than building one. Both motions need to run simultaneously.
The Account Selection Problem Most ABM Programmes Get Wrong
Every campaign example above depends on one thing being right before anything else: the account list. Poor account selection is the most common reason ABM underperforms, and it’s also the most avoidable. The temptation is to define the ideal customer profile broadly to maximise the addressable universe, then let the campaign filter for engagement. That logic is backwards.
A tight account list with strong commercial logic will outperform a large account list with loose criteria every time. The discipline required is uncomfortable because it means explicitly deciding which accounts you’re not pursuing, and that feels like leaving money on the table. It isn’t. It’s focusing effort where the probability of success is highest.
Account selection criteria should include fit factors (company size, sector, technology stack, buying process) and timing factors (growth signals, leadership changes, funding events, contract renewal windows). Both matter. A perfect-fit account with no near-term buying trigger is a long-term nurture target, not an active ABM priority.
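One way to operationalise that distinction is to score fit and timing separately rather than blending them into a single number, so a perfect-fit account with no trigger routes to nurture instead of active outreach. A minimal sketch, where the criteria, weights, and thresholds are all illustrative assumptions:

```python
# Hypothetical fit/timing scoring for ABM account selection.
# Criteria, weights, and thresholds are illustrative, not prescriptive.

FIT_CRITERIA = {"size_in_range": 0.4, "target_sector": 0.3, "stack_compatible": 0.3}
TIMING_SIGNALS = {"growth_signal", "leadership_change", "funding_event", "renewal_window"}

def classify(account: dict) -> str:
    # Weighted fit score from boolean criteria on the account record.
    fit = sum(w for criterion, w in FIT_CRITERIA.items() if account.get(criterion))
    # Timing is a separate yes/no: is any near-term buying trigger present?
    has_trigger = bool(TIMING_SIGNALS & set(account.get("signals", [])))
    if fit >= 0.7 and has_trigger:
        return "active_abm"         # strong fit plus a near-term trigger
    if fit >= 0.7:
        return "long_term_nurture"  # strong fit, no trigger yet
    return "exclude"                # weak fit: not worth the effort either way
```

Keeping the two dimensions separate is the design choice that matters; a single blended score lets a high fit quietly compensate for a missing trigger and pulls nurture accounts into active outreach too early.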
BCG’s work on aligning marketing and commercial strategy makes the point that growth organisations are distinguished less by their tactics than by the quality of their commercial targeting decisions. ABM is a good test of that discipline.
How to Think About Measurement Without False Precision
ABM measurement is genuinely difficult, and anyone who tells you otherwise is either selling you software or hasn’t run a programme long enough to encounter the problems. The attribution challenge is structural: ABM is designed to create conditions for a sale, not to be the direct cause of one. That means the relationship between campaign activity and revenue is real but indirect, and standard attribution models will undercount it.
The metrics that matter most in practice are account engagement rate across the target list, pipeline coverage from target accounts as a proportion of total pipeline, average deal size and sales cycle length for target accounts versus non-target accounts, and win rate for accounts that went through the ABM programme versus those that didn’t. None of these require perfect attribution. They require consistent tracking over time and honest comparison against a baseline.
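None of these comparisons needs an attribution model, only consistent account records. As a rough sketch of how they might be computed from a flat list (the field names are assumptions for illustration, not a prescribed schema):

```python
# Illustrative ABM programme metrics from flat account records.
# Field names (in_abm, engaged, pipeline_value, closed, won) are assumed.

def abm_metrics(accounts: list) -> dict:
    targets = [a for a in accounts if a["in_abm"]]
    others = [a for a in accounts if not a["in_abm"]]

    def win_rate(group):
        # Win rate over closed deals only; None if nothing has closed yet.
        closed = [a for a in group if a.get("closed")]
        return sum(a["won"] for a in closed) / len(closed) if closed else None

    total_pipeline = sum(a["pipeline_value"] for a in accounts) or 1
    return {
        # share of target-list accounts showing any engagement
        "engagement_rate": sum(a["engaged"] for a in targets) / len(targets),
        # target-account pipeline as a proportion of total pipeline
        "pipeline_coverage": sum(a["pipeline_value"] for a in targets) / total_pipeline,
        # honest comparison against the non-target baseline
        "win_rate_abm": win_rate(targets),
        "win_rate_other": win_rate(others),
    }
```

The value is in the baseline comparison: target versus non-target accounts, tracked the same way over time, rather than a model claiming to isolate each touchpoint’s contribution.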
What I’d caution against is over-investing in measurement infrastructure at the expense of campaign quality. I’ve seen ABM programmes where 40% of the budget went to intent data platforms, attribution tools, and CRM customisation, and the actual campaign was thin. The measurement was sophisticated but there was nothing worth measuring. Get the campaign right first. The measurement will tell you what to improve, but only if there’s something substantive to measure.
Forrester’s work on scaling go-to-market approaches is relevant here: the organisations that scale well tend to build measurement frameworks that are honest about what they can and can’t attribute, rather than ones that create an illusion of precision.
If you’re thinking about how ABM fits into your broader commercial strategy, the articles in the Go-To-Market and Growth Strategy section cover the wider decisions around channel sequencing, market entry, and commercial prioritisation that ABM programmes sit within. Getting those foundations right makes the individual campaign decisions considerably easier.
About the Author
Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.
