AI Content Is Making Social Media Hollow. Sam Altman Noticed.

Sam Altman recently admitted what many marketers have quietly known for a while: AI-generated content is making social media feel less authentic, and the volume problem is getting worse. When the CEO of OpenAI says the tool he built is contributing to a degraded content environment, that is worth sitting with for a moment.

The observation is not new. But it carries more weight coming from Altman, because it signals that even the people building these tools understand the trade-off. More content, faster, cheaper, but at a cost to the signal-to-noise ratio that makes social platforms worth using in the first place.

Key Takeaways

  • Sam Altman’s acknowledgement that AI content is degrading social media authenticity is a commercial signal, not just a cultural one. Brands that flood platforms with AI output are eroding their own brand equity.
  • Volume is not a content strategy. The brands winning on social media right now are producing less content with more editorial judgment, not more content with less.
  • AI works best as a production tool, not a thinking tool. The strategic layer, the point of view, the specific experience, still has to come from a human.
  • Measurement is the underlying issue most brands are ignoring. If you cannot measure whether your content is driving business outcomes, you will keep optimising for output rather than impact.
  • Authenticity is not a tone-of-voice setting. It comes from having something specific to say, and AI cannot manufacture that specificity no matter how well you prompt it.

What Altman Actually Said, and Why It Matters Commercially

Altman’s comments were not a formal warning or a policy position. They were more of an honest observation: that AI-generated content is making social media feel hollow, and that this is a real problem for the quality of online discourse. He framed it as something the industry needs to think about, without offering a clean solution.

For marketers, the commercial implication is more specific than the cultural one. When a platform’s content quality degrades, user attention degrades with it. People scroll faster, engage less, trust less. The brands that have been treating AI content generation as a volume play are contributing to an environment that is actively working against their own performance metrics.

I have watched this dynamic play out across enough client accounts to know it is not theoretical. When content quality drops and volume increases, engagement rates tend to fall, reach costs tend to rise, and the overall return on content investment quietly deteriorates. Most brands do not notice because they are measuring output rather than outcome. They count posts published rather than conversations started or revenue influenced.

If you want a broader view of where AI sits in the marketing toolkit right now, the AI Marketing hub at The Marketing Juice covers the landscape from automation to content strategy with the same commercial lens.

The Volume Trap: Why More Content Is Not a Strategy

The promise of AI content tools was seductive and, to be fair, partly delivered. You can produce more content faster. That part is true. What the pitch glossed over is that volume without editorial judgment is not a competitive advantage. It is a way of being louder in a room that is already too loud.

When I was running iProspect and we were scaling from around 20 people to closer to 100, one of the consistent lessons from managing content at scale was that quality control becomes harder as volume increases, not easier. The temptation is always to push more out. The discipline is knowing what to hold back. That judgment does not get easier when you hand the production process to an AI; it gets harder, because the friction that used to slow you down was also the friction that made you think twice.

AI removes the production friction. It does not remove the need for editorial judgment. If anything, it makes that judgment more important, because the cost of publishing something mediocre has dropped to almost zero. When something is cheap and easy to produce, the quality floor drops with the price.

The brands I see performing well on social media right now are not the ones producing the most AI content. They are the ones with a clear point of view, a specific audience, and the discipline to say no to content that does not serve either. That is an editorial decision, not a production decision.

Authenticity Is Not a Tone Setting

There is a category error that runs through most conversations about AI and authenticity. Brands treat authenticity as a stylistic quality, something you can dial in with the right prompt. “Write in a conversational, authentic tone.” As if the problem with AI content is that it sounds too formal.

Authenticity is not a tone setting. It is the product of having something specific to say that comes from a specific experience or perspective. When a brand shares a genuine opinion, a real story from inside the business, or a position that not everyone will agree with, that is authentic. When a brand generates five posts a day that could have been written by any company in its sector, that is noise.

AI cannot manufacture specificity. It can approximate the style of specificity, which is part of why the content environment feels hollow. There is a lot of content that sounds like it has a point of view without actually having one. It uses the linguistic markers of authenticity, the first-person voice, the conversational phrasing, the occasional admission of uncertainty, without any of the underlying substance.

I judged the Effie Awards for several years, and the work that consistently stood out was not the work with the biggest budgets or the most sophisticated production. It was the work that had a clear, specific idea at its centre. An idea that came from someone understanding something true about their audience that a competitor had missed. AI can help you execute an idea. It cannot have the idea for you.

Where AI Content Tools Are Genuinely Useful

None of this is an argument against using AI in content production. That would be the wrong conclusion. The tools are genuinely useful, but in a narrower range of applications than the marketing press tends to suggest.

AI is good at production tasks: drafting, reformatting, summarising, adapting existing content for different channels, generating variations for testing. It is good at removing the blank-page problem when you already know what you want to say. It is good at handling the mechanical parts of content work that used to consume disproportionate amounts of time.

Resources like the Moz Whiteboard Friday on generative AI for SEO and content give a grounded view of where these tools add value in the content production process, and where the human layer remains essential. Similarly, Ahrefs’ coverage of AI tools takes a practical rather than evangelical approach, which is the right frame.

What AI is not good at is strategy, editorial judgment, or original thinking. It is not good at deciding what your brand should stand for, which audiences are worth prioritising, or what you should say that your competitors are not saying. Those are human decisions, and outsourcing them to an AI is not efficiency. It is abdication.

The HubSpot guide to choosing the right LLM for marketing is a useful reference if you are evaluating which tools fit which tasks. The framing of the question matters: start with the task, then find the tool, rather than starting with the tool and working backwards.

The Measurement Problem Hiding Behind the Content Problem

There is a deeper issue underneath the authenticity conversation, and it is one I come back to constantly: most brands cannot tell whether their content is working. Not in any commercially meaningful sense. They can tell you how many impressions a post received. They cannot tell you whether it moved a customer closer to a purchase, strengthened brand preference, or influenced a decision that eventually showed up in revenue.

When measurement is weak, the default optimisation target becomes volume. More posts, more reach, more engagement. These are proxy metrics, and they are easy to game. AI content tools are very good at gaming proxy metrics. They can produce content that gets impressions and likes without doing anything for the underlying business.

I spent years managing large ad budgets across multiple sectors, and the consistent pattern was that the brands with the worst measurement frameworks were the ones most susceptible to vanity metrics. They would celebrate a viral post while their conversion rates were flat. They would increase social content volume while their brand consideration scores were declining. The activity looked healthy. The business was not improving.

Fix the measurement, and the content strategy tends to fix itself. When you are measuring outcomes rather than outputs, you quickly discover that the AI content you were producing at scale was not doing much. And you start asking harder questions about what would actually work instead.

The Moz research on AI content touches on some of these performance questions and is worth reading alongside your own data. The Semrush piece on AI optimisation for content strategy also covers some useful ground on connecting content activity to measurable outcomes.

What the Social Platforms Are Likely to Do About It

Platform algorithms are not neutral observers in this dynamic. When content quality degrades at scale, it affects user engagement, which affects the value of the platform, which affects advertising revenue. The platforms have a commercial incentive to surface content that keeps people engaged, and AI-generated content that feels hollow does not do that.

We have already seen signals from several platforms about how they intend to handle AI content. Some are experimenting with labelling. Others are adjusting distribution signals to favour content that generates meaningful engagement over content that simply exists. The direction of travel is fairly clear: platforms will increasingly penalise low-quality volume and reward content that generates genuine interaction.

This is not a new pattern. It is the same pattern that played out in SEO when low-quality content farms were penalised, in email when deliverability algorithms got smarter about engagement signals, and in paid social when relevance scores started affecting cost-per-click. Every time a channel gets flooded with low-quality content, the platform eventually adjusts. AI content is not exempt from this cycle.

The brands that will be in the strongest position when that adjustment happens are the ones that did not build their social strategy around volume in the first place. The ones that invested in having something worth saying, and in producing it well, regardless of whether AI was involved in the production.

What a Better Approach Actually Looks Like

The practical question for most marketing teams is not whether to use AI, but how to use it without falling into the volume trap. A few principles hold up across the brands I have seen get this right:

Start with the idea, not the tool. Before you open any AI content tool, you should be able to articulate what you are trying to say and why your audience should care. If you cannot answer those questions, the AI will not answer them for you. It will produce something that sounds like an answer without being one.

Use AI for production, not strategy. Let AI handle the drafting, the reformatting, the variation generation. Keep the strategic decisions, the editorial judgment, the point of view, in human hands. The Semrush overview of ChatGPT in marketing is useful for understanding where the production value genuinely sits.

Set a quality floor, not a volume target. If your content calendar is driven by how many posts you can produce per week, you are optimising for the wrong thing. The question should be: what is the minimum quality threshold for anything we publish, and are we consistently clearing it?

Measure outcomes, not outputs. This is harder, but it is the only way to know whether your content is working. If you cannot connect your social content activity to any business metric that matters, you are producing content on faith. That is a budget allocation problem as much as a content problem.
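To make the outputs-versus-outcomes distinction concrete, here is a minimal sketch of what the two kinds of metrics look like side by side. Everything in it is hypothetical: the field names, the figures, and the attribution model are invented for illustration, not drawn from any real analytics platform.

```python
# Illustrative sketch only: contrasting output metrics (how much you
# published, how far it reached) with an outcome metric (what the
# content influenced commercially). All data here is made up.

def summarise(posts):
    """Return output, proxy, and outcome metrics for a batch of posts."""
    total_impressions = sum(p["impressions"] for p in posts)
    attributed_revenue = sum(p["attributed_revenue"] for p in posts)
    return {
        "posts_published": len(posts),                        # output metric
        "avg_impressions": total_impressions / len(posts),    # proxy metric
        "revenue_per_post": attributed_revenue / len(posts),  # outcome metric
    }

posts = [
    {"impressions": 12000, "attributed_revenue": 0.0},    # "viral", no impact
    {"impressions": 900,   "attributed_revenue": 450.0},  # niche, converted
    {"impressions": 1500,  "attributed_revenue": 120.0},
]

print(summarise(posts))
```

The point of the sketch is the first row of data: a post can dominate the proxy metric while contributing nothing to the outcome metric. A team optimising on impressions alone would rank it first; a team measuring attributed revenue would rank it last.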

For teams building out their AI content approach more systematically, the Ahrefs AI SEO webinar with Patrick covers some useful ground on integrating AI tools without compromising content quality signals.

The Bigger Picture: What Altman’s Comment Signals for Marketing

When the person who built the most widely used AI content tool says it is making social media feel inauthentic, the marketing industry should treat that as useful information rather than background noise. It is not a reason to stop using AI. It is a reason to be more deliberate about how you use it.

The brands that will look back on this period positively are not the ones that produced the most AI content. They are the ones that used AI to make their content production more efficient while maintaining the editorial discipline that makes content worth producing in the first place.

That discipline is harder to maintain when the cost of production is near zero. It requires someone in the organisation to hold the line on quality even when the pressure is to publish more. That is a leadership decision as much as a marketing one.

The broader conversation about AI’s role in marketing, from content to automation to measurement, is one I cover regularly at The Marketing Juice AI Marketing hub. If you are trying to build a coherent position on AI rather than just reacting to each new tool, that is a useful place to start.

About the Author

Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.

Frequently Asked Questions

Did Sam Altman say AI content is making social media worse?
Altman has publicly acknowledged that AI-generated content is contributing to social media feeling less authentic, noting that the volume of AI output is degrading the quality of online discourse. He framed it as a problem the industry needs to grapple with, without offering a definitive solution.
Is AI-generated content bad for social media marketing?
Not inherently, but the way most brands are using it is creating problems. Using AI to produce high volumes of low-quality content degrades engagement quality, erodes brand trust over time, and contributes to the platform-wide signal degradation that Altman described. AI used as a production tool within a strong editorial framework is a different matter entirely.
How can brands use AI for content without losing authenticity?
The clearest approach is to separate the strategic layer from the production layer. Use AI to draft, reformat, and produce content once you have a clear idea and a defined point of view. Keep editorial judgment, strategic direction, and brand positioning in human hands. Authenticity comes from having something specific to say, and that specificity has to originate with a person who understands the brand and the audience.
Will social media platforms penalise AI-generated content?
The direction of travel suggests yes, at least for low-quality AI content that generates low engagement. Platforms have a commercial incentive to surface content that keeps users engaged, and hollow AI content does not do that. Several platforms are already experimenting with labelling requirements and adjusting distribution signals. The pattern follows what happened in SEO and email when those channels were flooded with low-quality volume.
What should marketing teams measure to know if their social content is working?
The honest answer is that most teams are measuring the wrong things. Impressions, likes, and follower counts are easy to track but weakly connected to business outcomes. A more useful frame is to ask whether your content is influencing consideration, driving traffic that converts, or generating conversations that move customers forward. That requires connecting content activity to downstream metrics, which is harder but far more commercially meaningful than counting posts published.
