Marketing Science Conference 2027: What Video Marketers Should Take Away

The Marketing Science Conference 2027 arrives at a moment when video marketers are being asked to do more with less, prove more with imperfect data, and compete for attention in an environment that is noisier than at any point in the last decade. The conference, which brings together some of the sharpest minds in marketing effectiveness, econometrics, and brand science, consistently surfaces findings that challenge what most practitioners think they know about how marketing actually works. If you work in video, the implications are significant and largely underexplored.

What follows is not a conference recap. It is a distillation of the themes, tensions, and practical considerations that any serious video marketer should be thinking about heading into 2027 and beyond.

Key Takeaways

  • Marketing science is increasingly confirming that emotional, brand-building video drives long-term revenue, but most video budgets are still skewed toward short-term activation formats.
  • Attention metrics are becoming a more reliable proxy for video effectiveness than viewability or completion rates, which are easier to game and harder to connect to outcomes.
  • The channel you choose for video distribution shapes the creative brief, not the other way around. Platform selection and creative strategy must be developed together.
  • Virtual and hybrid event formats have permanently changed how video content is produced, distributed, and measured, and most teams have not updated their frameworks accordingly.
  • The biggest risk in video marketing right now is not producing bad creative. It is producing competent creative that is perfectly optimised for the wrong objective.

If you want a broader grounding in how video fits into a full acquisition and brand strategy, the video marketing hub on The Marketing Juice covers the fundamentals, the channel decisions, and the measurement questions that practitioners are actually wrestling with.

Why Marketing Science Keeps Challenging Video Assumptions

There is a version of video marketing that exists almost entirely inside a performance dashboard. Click-through rates, view-through conversions, cost per completed view. I spent years inside agencies where these numbers were presented in client meetings with the kind of confidence that suggested they were settled science. They are not.

Marketing science, particularly the work coming out of the Ehrenberg-Bass Institute and the broader effectiveness research community, consistently points to a gap between what practitioners measure and what actually drives growth. Video is not immune to this. In fact, because video is expensive to produce and emotionally compelling to defend, it is one of the formats most prone to motivated reasoning. Teams fall in love with their creative, optimise for the metrics they can control, and declare success before the business outcome has had time to materialise.

The Marketing Science Conference 2027 is likely to surface this tension again, as it has in previous years. The question is whether video teams are listening or whether they are waiting for the conference to confirm what they already believe.

I judged the Effie Awards for several years. The work that won was rarely the work that looked most impressive in a reel. It was the work that could demonstrate a genuine connection between creative execution and a business outcome. Video entries were often the most beautifully produced and the most weakly argued. The creative brief was strong. The measurement framework was an afterthought.

Attention Is the Metric That Marketing Science Is Taking Seriously

One of the most consistent themes emerging from the marketing science community over the last few years is the inadequacy of viewability as a measure of video effectiveness. An ad can be technically viewable and completely ignored. Completion rates are marginally better but still tell you almost nothing about whether the message landed or whether the viewer was mentally present.

Attention measurement is the area gaining serious traction. Researchers are using eye-tracking, facial coding, and increasingly sophisticated passive measurement to understand not just whether someone saw a video, but whether they were actually paying attention when they saw it. The early findings suggest that attention is a far stronger predictor of brand recall and downstream purchase behaviour than any of the proxy metrics that have dominated video reporting for the last decade.

This matters practically. If you are choosing video marketing platforms based on completion rate benchmarks alone, you may be optimising for a metric that has a weak relationship with the outcomes you actually care about. Platform choice should factor in the attention environment, not just the audience size or the cost per view.

The Semrush overview of video marketing strategy is a useful starting point for understanding how platform selection intersects with content performance, though it stops short of the attention-science angle that the effectiveness community is now prioritising.

The Brand-Activation Balance Is Still Getting Video Wrong

Les Binet and Peter Field’s work on the balance between brand building and sales activation has been circulating in marketing circles for over a decade. The broad finding, that most categories benefit from a roughly 60/40 split in favour of brand investment, is now reasonably well known. What is less well understood is how video budgets are actually allocated in practice.

Most video spend, particularly in digital channels, skews heavily toward activation. Short-form ads, retargeting video, product demo content, video in paid search. These formats are measurable, attributable (at least partially), and easy to defend in a quarterly review. Brand-building video, the kind of emotionally resonant content that works over months and years rather than days and weeks, is harder to justify when the CFO wants to see last-click attribution.
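To make the allocation question concrete, here is a minimal sketch of what a Binet-and-Field-style split looks like against an annual video budget. The function name and the 60% default are illustrative starting points, not a rule; their own guidance is that the right split varies by category, competitive context, and brand maturity.

```python
# Toy allocation sketch: apply a brand/activation split to a total budget.
# The 0.6 default reflects the widely cited 60/40 starting point, which
# should be adjusted per category rather than applied mechanically.

def split_budget(total: float, brand_share: float = 0.6) -> dict:
    """Return the brand and activation portions of a video budget."""
    if not 0 <= brand_share <= 1:
        raise ValueError("brand_share must be between 0 and 1")
    return {
        "brand": total * brand_share,
        "activation": total * (1 - brand_share),
    }

allocation = split_budget(1_000_000)
# roughly 600,000 to brand building, 400,000 to activation
```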

I ran a paid search campaign at lastminute.com for a music festival and watched six figures of revenue land within roughly 24 hours. It was a clean, direct relationship between spend and return. I understand the appeal of that kind of clarity. But that campaign worked partly because the brand had already done the work of being recognisable and trusted. Activation captures demand that brand has already created. Video marketers who only fund activation are drawing down on a reservoir they are not refilling.

The HubSpot analysis of B2B and B2C video marketing trends highlights the growing investment in longer-form brand video, which suggests some teams are starting to rebalance. But the measurement infrastructure to support that rebalancing is still catching up.

Getting this balance right requires being deliberate about what each piece of video content is actually supposed to do. Aligning video content with marketing objectives is not a planning exercise you do once at the start of the year. It is an ongoing editorial discipline that most teams treat as a box-ticking exercise.

How Events and Conferences Are Changing the Video Production Model

The Marketing Science Conference 2027, like most major industry events, will generate a significant volume of video content. Keynote recordings, panel sessions, interview clips, social cutdowns. The question that most marketing teams do not ask clearly enough is: what is this video content actually for, and who is it for beyond the people who attended?

The shift toward hybrid and virtual event formats has created a new challenge. When events were purely physical, video was a secondary output, a nice-to-have record of what happened. When events became virtual or hybrid, video became the primary product. The production standards, distribution strategy, and measurement framework all needed to change accordingly. Most organisations are still running the old mental model with new technology.

If you are producing video content around a conference or industry event, the B2B virtual events framework is worth reviewing before you plan your content calendar. The principles that make virtual events effective (specificity of audience, clarity of purpose, structured engagement) apply equally to the video content you produce around them.

Physical event presence has its own video considerations. The booth experience, the live demonstrations, the capture of authentic moments on the floor. If you are thinking about how to make a physical presence work harder, the guide to trade show booth ideas that attract visitors offers a practical lens on the environment in which that video content gets created and consumed in real time.

The Wistia research on why we binge-watch is relevant here too. The same psychological principles that drive sustained attention in entertainment contexts apply to conference video content. Structure, pacing, and the sense of forward momentum matter more than production value alone.

Virtual Environments and the New Expectations for Video Engagement

One of the more interesting developments in the conference and events space is the growing sophistication of virtual environments. Early virtual events were essentially webinars with better branding. The current generation of virtual event platforms offers something meaningfully different: persistent environments, networked interactions, and structured engagement mechanics that can make a virtual conference feel genuinely participatory rather than passive.

Video sits at the centre of this. Whether you are looking at virtual trade show booth examples for inspiration or thinking about how to structure a keynote that holds attention across a two-hour virtual session, the video production decisions you make shape the entire experience. Resolution, aspect ratio, caption quality, interactive overlays: these are not technical afterthoughts. They are the experience.

Engagement mechanics are evolving alongside the video formats. Virtual event gamification is one of the more interesting areas of development, not because gamification is inherently valuable, but because it forces event designers to think about what participation actually looks like and how to reward it. When gamification is done well, it increases the depth of engagement with video content rather than distracting from it. When it is done badly, it turns a serious professional event into something that feels like a corporate fun day.

I built my first website by teaching myself to code because the MD would not give me the budget to hire someone. That experience of working within constraints and finding a way through shaped how I think about production quality. You do not need the most expensive setup to produce effective video content. You need clarity about what you are trying to achieve and the discipline to make decisions that serve that goal rather than your ego.

What Marketing Science Actually Says About Video Effectiveness

The effectiveness research on video is more nuanced than most practitioners acknowledge. A few things stand up consistently across the literature and the conference discussions.

First, longer video is not inherently less effective. The received wisdom that attention spans are collapsing and that only short-form video survives is not well supported by the evidence. What matters is whether the content is relevant and whether it earns the attention it asks for. A six-minute video that delivers genuine value will outperform a fifteen-second video that delivers nothing, even if the completion rate on the shorter format looks better in a dashboard.

Second, emotion drives memory encoding. This is not a creative opinion. It is a finding from cognitive science that the marketing effectiveness community has been applying for years. Video that generates an emotional response, whether that is warmth, surprise, humour, or even mild discomfort, is more likely to be remembered than video that is purely informational. The Copyblogger perspective on online video marketing touches on this, and it remains one of the more durable principles in the space.

Third, context matters more than most video briefs acknowledge. The same creative asset performs differently depending on where it appears, what surrounds it, and what mental state the viewer is in when they encounter it. This is why platform selection is a strategic decision rather than a media buying decision. The Buffer resource on video marketing covers platform-specific performance considerations, and while the landscape shifts constantly, the underlying principle, that context shapes response, is stable.

Finally, lead capture from video is more nuanced than most teams make it. Wistia’s approach to using Turnstile for lead capture is a good case study in how to think about the trade-off between friction and intent. A viewer who completes 80% of a video before encountering a gate is a meaningfully different prospect than someone who bounces from a landing page. Video viewing behaviour is a signal, and most teams are not reading it carefully enough.
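To illustrate how viewing depth can be read as an intent signal, here is a hypothetical sketch. The 80% threshold echoes the example above, but the field names, bucket boundaries, and data shape are assumptions for illustration, not Wistia's actual API or Turnstile's behaviour.

```python
# Hypothetical sketch: bucketing video leads by how deep into the video
# they watched before converting. Thresholds are illustrative assumptions
# to be tuned against your own conversion data.

def lead_intent(watched_seconds: float, video_length: float) -> str:
    """Classify a lead by viewing depth at the moment of capture."""
    if video_length <= 0:
        raise ValueError("video_length must be positive")
    depth = watched_seconds / video_length
    if depth >= 0.8:   # watched most of the video before the gate
        return "high"
    if depth >= 0.4:
        return "medium"
    return "low"

leads = [
    {"email": "a@example.com", "watched": 290, "length": 360},
    {"email": "b@example.com", "watched": 45, "length": 360},
]
for lead in leads:
    lead["intent"] = lead_intent(lead["watched"], lead["length"])
```

The point of the sketch is the routing decision it enables: a "high" lead might go straight to sales, while a "low" lead goes into nurture, which is a different treatment than a landing-page form fill would justify.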

The Measurement Problem Has Not Gone Away

Every marketing science conference in recent memory has included at least one session on measurement. The problem has not been solved. It has become more complicated.

The deprecation of third-party cookies, the fragmentation of attention across platforms, the growth of connected TV as a video channel, and the increasing sophistication of privacy regulation have all made it harder to track video effectiveness with the precision that performance marketers became accustomed to in the 2010s. The response from most organisations has been to double down on the metrics they can still measure, even if those metrics have a weak relationship with outcomes.

The MarketingProfs analysis of video marketing ROI measurement is older than it looks, but the core challenge it identifies, that video ROI is genuinely difficult to measure and that most teams are using proxies that do not fully capture the value, remains accurate. The tools have improved. The fundamental problem has not.

What marketing science offers is not a perfect measurement solution. It offers a more honest framework for thinking about what you can and cannot know. Media mix modelling, brand tracking, incrementality testing: these are imperfect instruments, but they are more honest than a last-click attribution model that credits the final touchpoint and ignores everything that created the conditions for conversion.
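A minimal example of the incrementality logic mentioned above, using invented figures: compare the conversion rate of an exposed group against a holdout and report the relative lift. A real geo test would need matched markets and a significance check, both of which this sketch deliberately omits.

```python
# Minimal sketch of a holdout incrementality read. Region sizes and
# conversion counts below are invented for illustration only.

def incremental_lift(test_conversions: int, test_population: int,
                     control_conversions: int, control_population: int) -> float:
    """Relative lift of the exposed group's conversion rate over the holdout's."""
    test_rate = test_conversions / test_population
    control_rate = control_conversions / control_population
    if control_rate == 0:
        raise ValueError("control conversion rate is zero; lift is undefined")
    return (test_rate - control_rate) / control_rate

# Exposed markets convert at 1.2%, held-out markets at 1.0%:
lift = incremental_lift(1200, 100_000, 1000, 100_000)
# a 20% relative lift attributable to the exposure, before significance testing
```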

I spent years managing hundreds of millions in ad spend across multiple agencies. The clients who made the best decisions were not the ones with the most sophisticated dashboards. They were the ones who understood what their data could and could not tell them, and who made judgement calls accordingly. Video marketers need that same intellectual honesty.

The video marketing section of The Marketing Juice covers many of these measurement and strategy questions in more depth. If you are building or rebuilding your video measurement framework, it is worth spending time with the full range of content there rather than looking for a single answer to a genuinely complex problem.

What Video Teams Should Do Differently in 2027

The Marketing Science Conference 2027 will not hand you a new playbook. What it will do, if you engage with it seriously, is challenge the assumptions that are making your current playbook less effective than it should be.

A few practical directions worth pursuing.

Start with the objective, not the format. Too many video briefs begin with a format decision (we need a 30-second pre-roll, we need a LinkedIn video, we need a conference recap reel) and work backward to a rationale. The format should follow from the objective, the audience, and the context. If you cannot articulate what business outcome this video is supposed to influence, you are not ready to brief the creative team.

Build attention into your evaluation criteria. If your current measurement stack does not give you any signal about whether viewers were actually paying attention, consider what you can add. Even simple proxies, like replays, shares, or the depth of viewing on a hosted platform, are more informative than raw completion rates.
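As a sketch of what building attention into your evaluation criteria might look like, the snippet below blends the proxies mentioned above into a single score, with raw completion deliberately given the smallest weight. The weights are assumptions to be calibrated against your own outcome data, not published benchmarks.

```python
# Illustrative sketch: a composite attention proxy built from signals most
# hosted video platforms expose. All weights are assumptions; calibrate
# them against brand recall or conversion data before relying on the score.

def attention_score(completion: float, replay_rate: float,
                    share_rate: float, avg_depth: float) -> float:
    """Blend proxy signals (each expressed as a 0-1 rate) into one score."""
    weights = {"completion": 0.1, "replay": 0.3, "share": 0.3, "depth": 0.3}
    score = (weights["completion"] * completion
             + weights["replay"] * replay_rate
             + weights["share"] * share_rate
             + weights["depth"] * avg_depth)
    return round(score, 3)

# A video with a flattering completion rate but little replay or sharing
# scores lower than the completion number alone would suggest:
score = attention_score(completion=0.9, replay_rate=0.2,
                        share_rate=0.05, avg_depth=0.6)
```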

Treat platform selection as a strategic input, not a distribution afterthought. The platform shapes the attention environment, the creative constraints, and the audience mindset. Making that decision late in the process, after the creative has been produced, is one of the most common and most costly mistakes in video marketing.

Invest in brand-building video even when it is hard to measure. The measurement difficulty is real, but it is not a reason to underfund the format. It is a reason to build a measurement framework that is honest about what it can and cannot capture, and to make the case for brand investment on the basis of the best available evidence rather than perfect attribution.

And finally, take the marketing science seriously. Not as a set of rules to follow, but as a discipline that will challenge your assumptions and make you a better practitioner. The conference is a good place to start. What you do with it when you get back to your desk is what actually matters.

About the Author

Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.

Frequently Asked Questions

What is the Marketing Science Conference and why does it matter for video marketers?
The Marketing Science Conference brings together researchers, practitioners, and effectiveness specialists to examine how marketing actually drives business outcomes. For video marketers, it matters because it consistently surfaces evidence that challenges the proxy metrics most teams rely on, particularly around attention, brand building, and the long-term effects of video investment. The findings are not always comfortable, but they are more useful than the benchmarks published by platforms with a commercial interest in your continued spend.
How should video marketers think about attention metrics versus completion rates?
Completion rates tell you whether someone watched to the end of a video. Attention metrics tell you whether they were mentally present while doing so. The marketing science community increasingly treats attention as a stronger predictor of brand recall and downstream purchase behaviour than completion. If your measurement stack only captures completion, you are missing a significant part of the picture. Attention measurement tools are still maturing, but even rough proxies like replays, shares, and viewing depth offer more signal than completion rate alone.
What is the right balance between brand-building video and activation video?
The effectiveness research community, drawing on the work of Binet and Field among others, generally points to a majority allocation toward brand building over time, with the precise split varying by category, competitive context, and brand maturity. In practice, most digital video budgets skew heavily toward activation because it is more measurable. The risk is that activation draws down on brand equity that is not being replenished. There is no universal formula, but any video strategy that cannot articulate how it is building long-term brand memory alongside short-term response is likely to underperform over a multi-year horizon.
How does virtual event video content differ from standard marketing video?
Virtual event video has a different attention contract with the viewer. In a physical event, the environment creates a degree of captive attention. In a virtual setting, the viewer has full control and competing distractions are a click away. This means virtual event video needs to earn attention more actively, through stronger structure, clearer signposting of value, and more deliberate pacing. Production quality matters, but it is secondary to editorial discipline. A well-structured session recorded on a mid-range camera will hold attention better than a poorly structured session filmed in a broadcast studio.
How should video marketers approach measurement when attribution is unreliable?
The honest answer is that video measurement has always involved imperfect proxies, and the current environment has made some of the better proxies harder to use. The practical response is to build a measurement framework that triangulates across multiple signals rather than relying on a single attribution model. Brand tracking, media mix modelling, and incrementality testing each have limitations, but together they give a more honest picture than last-click attribution. The goal is not perfect measurement. It is honest approximation, being clear about what you can and cannot know, and making decisions accordingly.