SEO Convention: What the Industry Keeps Getting Wrong
SEO convention refers to the widely accepted practices, assumptions, and professional norms that shape how search optimisation is planned, executed, and measured across the industry. Some of those conventions are grounded in real evidence. Many are not. They persist because enough people repeat them with enough confidence, and because the industry has a habit of rewarding activity over outcomes.
This article is not a list of tactics. It is an examination of the beliefs that sit underneath the tactics, and whether those beliefs hold up when you press on them.
Key Takeaways
- Many SEO conventions persist through repetition and professional inertia, not because they consistently produce commercial results.
- The industry conflates activity metrics with outcome metrics, and this confusion is built into most standard reporting frameworks.
- Conventional SEO thinking undervalues content quality and overvalues content volume, which is the opposite of what Google’s behaviour suggests.
- Skill gaps in SEO are often framed as technical problems when they are more frequently strategic and commercial problems.
- The most effective SEO programmes treat search as a business function, not a channel to be optimised in isolation.
Where SEO Convention Comes From
Every industry has its orthodoxies. SEO has more than most, partly because it developed in public, through forums and blog posts and conference talks, rather than through peer-reviewed research or controlled commercial testing. The conventions that emerged from that environment were shaped by what was observable, what was shareable, and what made for a good slide deck. Not necessarily by what was true.
I spent several years running an agency that grew from around 20 people to over 100. During that period, I watched the SEO discipline evolve from a relatively niche technical function into something that every marketing department claimed to understand and almost none of them actually did. The conventions that circulated through that period ("content is king", "links are everything", "technical SEO is the foundation") became articles of faith rather than working hypotheses. People stopped testing them and started teaching them.
That is not a criticism of the individuals involved. It is a structural observation about how professional knowledge gets formed and transmitted in an industry that moves fast, rewards confidence, and has no real mechanism for falsifying bad ideas at scale.
If you are building or refining your broader search strategy, the Complete SEO Strategy hub covers the full picture, from technical foundations to content architecture to measurement. This article focuses specifically on the conventions that shape how most teams approach that strategy, and where those conventions tend to break down.
The Volume Assumption
One of the most durable SEO conventions is the belief that more content produces more organic traffic. Publish consistently. Cover every keyword variation. Build topical breadth. The logic is intuitive and the tooling reinforces it. Most SEO platforms are designed to surface keyword gaps, which are framed as opportunities. The implicit message is that the gap between what you have and what you could have is a problem to be solved through production.
The problem is that this convention optimises for coverage rather than quality, and the two are frequently in tension. When I was judging the Effie Awards, the work that stood out was almost never the work that had done the most things. It was the work that had made a clear decision about what mattered and executed it with precision. The same principle applies to content. A site with 40 genuinely useful, well-constructed pages will almost always outperform a site with 400 pages that exist primarily to capture keyword variants.
Google’s behaviour over the past several years has moved consistently in this direction. The helpful content updates, the quality rater guidelines, the increasing weight given to demonstrated expertise: all of these signal that the volume convention is becoming less reliable as a strategy. Yet the convention persists, partly because content volume is easy to measure and content quality is not, and partly because agencies and in-house teams alike are often incentivised to produce rather than to evaluate.
The most sustainable thing any SEO programme could do is stop funding content that should not exist. That is a harder conversation to have than “let’s publish more,” but it is the right one.
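What "stop funding content that should not exist" looks like in practice is usually a content audit. A minimal sketch is below, with entirely hypothetical URLs, field names, and thresholds; the real inputs would come from your analytics export, and the thresholds should reflect your own traffic baseline.

```python
# Hypothetical content audit: flag pages that attract little organic traffic
# and produce no conversions as candidates for pruning or consolidation.
# All URLs, numbers, and thresholds here are illustrative, not prescriptive.

LOW_TRAFFIC = 50  # organic sessions per month below which a page is "low traffic"

pages = [
    {"url": "/guide-to-x", "organic_sessions": 1200, "conversions": 18},
    {"url": "/keyword-variant-17", "organic_sessions": 22, "conversions": 0},
    {"url": "/keyword-variant-18", "organic_sessions": 9, "conversions": 0},
]

def prune_candidates(pages, low_traffic=LOW_TRAFFIC):
    """Pages with low traffic AND zero conversions are doing no commercial work."""
    return [
        p["url"]
        for p in pages
        if p["organic_sessions"] < low_traffic and p["conversions"] == 0
    ]

print(prune_candidates(pages))
# Both keyword-variant pages are flagged; the genuinely useful guide is not.
```

The output of a script like this is not a deletion list. It is the agenda for the harder conversation about which pages deserve reinvestment and which should never have been funded.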
The Technical Primacy Convention
Technical SEO is important. Crawlability, site speed, structured data, canonical tags: these things matter. But the convention that technical SEO is the foundation upon which everything else rests has been stretched well beyond what the evidence supports. I have seen brands with technically impeccable sites rank poorly because their content was thin and their authority was nonexistent. I have seen sites with significant technical debt rank extremely well because they had built genuine expertise and earned real links over years.
The technical primacy convention is partly a product of where SEO expertise tends to sit in organisations. Technical SEO practitioners are often the most credible voices in the room because they can point to specific, measurable problems and specific, measurable fixes. That clarity is genuinely valuable. But it can also create a bias toward the solvable over the important. Fixing a crawl budget issue is tractable. Building topical authority is slow and uncertain. So teams fix the crawl budget and call it progress.
Moz has written thoughtfully about how to identify and address SEO skill gaps, and one of the consistent themes in that conversation is that technical skills are easier to hire for than strategic and editorial skills. That imbalance shapes what gets prioritised in practice. Teams optimise for what they can do rather than what will move the needle.
The Ranking Position Convention
The convention that ranking position is the primary indicator of SEO success is so deeply embedded in the industry that most reporting frameworks are built around it. Position one is the goal. Position three is acceptable. Anything beyond page one is failure. This framing is intuitive and it is also, in many cases, commercially misleading.
I have managed significant ad spend across a wide range of industries, and one of the consistent lessons from that experience is that traffic without conversion intent is not an asset. It is a cost. Ranking in position one for a keyword that does not align with what your business actually sells, or that attracts visitors who are nowhere near a purchase decision, produces impressive dashboards and weak commercial outcomes. The convention mistakes the signal for the substance.
The more useful question is not “what position do we rank in” but “what commercial value does this ranking generate.” That requires connecting search data to revenue data, which is technically harder and organisationally more complicated. It also tends to produce conclusions that are less flattering to the SEO programme, which is perhaps why the convention persists.
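Connecting search data to revenue data does not require sophisticated tooling to start. The sketch below joins a rank-tracker-style export to revenue attributed by landing page; every keyword, page, and figure is made up for illustration, and in practice the revenue side would come from your analytics or CRM.

```python
# Hypothetical join of ranking data with revenue data by landing page, to
# answer "what commercial value does this ranking generate" rather than
# "what position do we rank in". All names and numbers are illustrative.

rankings = [
    {"keyword": "best crm software", "position": 1, "landing_page": "/blog/best-crm"},
    {"keyword": "crm for dental practices", "position": 6, "landing_page": "/solutions/dental"},
]

revenue_by_page = {
    "/blog/best-crm": 400.0,       # monthly revenue attributed to organic
    "/solutions/dental": 12500.0,
}

def commercial_view(rankings, revenue_by_page):
    """Re-rank keywords by the revenue their landing pages actually generate."""
    rows = [
        {**r, "revenue": revenue_by_page.get(r["landing_page"], 0.0)}
        for r in rankings
    ]
    return sorted(rows, key=lambda r: r["revenue"], reverse=True)

for row in commercial_view(rankings, revenue_by_page):
    print(row["keyword"], row["position"], row["revenue"])
# The position-six keyword outranks the position-one keyword commercially.
```

Even a crude join like this tends to reorder priorities in ways a rankings dashboard never will, which is exactly why it meets organisational resistance.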
Forrester has documented the skills gap in marketing more broadly, and the gap between analytical capability and strategic application shows up consistently. The tools exist to connect SEO performance to business outcomes. The organisational will to do so, and to act on what that analysis reveals, is rarer than it should be.
The Link Quantity Convention
Links matter. This is not in dispute. What is in dispute is the convention that link quantity, expressed as domain rating, domain authority, or number of referring domains, is a reliable proxy for link quality or ranking potential. The link building industry has been built largely on this convention, and it has generated a great deal of activity that produces very little value.
I have seen brands spend significant budgets on link acquisition programmes that generated hundreds of referring domains and moved rankings barely at all. I have also seen a single well-placed editorial link from a genuinely authoritative source produce measurable ranking improvements within weeks. The convention that more links equals better performance is a simplification that serves the people selling link building services more than it serves the businesses buying them.
The more honest framing is that links are a signal of editorial trust, and editorial trust is earned through content that is genuinely worth citing. That is a slower process and a less scalable one, which is why the convention of link volume persists. Scalable beats right, in this industry as in most others.
The Keyword Research Convention
Keyword research is the starting point for almost every SEO programme, and the convention is that search volume is the primary filter for keyword prioritisation. High volume keywords are worth targeting. Low volume keywords are not. This convention is so embedded in the tooling that most keyword research platforms sort by volume by default.
The problem is that volume is a measure of demand, not a measure of commercial relevance. A keyword with 50,000 monthly searches that attracts visitors who are not your customers is worth less than a keyword with 500 monthly searches that attracts visitors who are ready to buy. The convention conflates audience size with audience quality, and the consequences of that conflation compound over time as content strategies are built around the wrong signals.
When I was working with a client in a highly specialised B2B category, the conventional keyword research approach kept surfacing broad, high-volume terms that had nothing to do with their actual buyer. The terms that their buyers actually searched, the specific, technical, low-volume queries that mapped precisely to purchase intent, were being filtered out by the volume threshold. We rebuilt the strategy around those low-volume terms and saw conversion rates from organic traffic improve substantially. The conventional approach would have produced a better-looking dashboard and worse commercial results.
The Reporting Convention
SEO reporting conventions have calcified around a set of metrics that are easy to pull from standard tools: organic sessions, keyword rankings, domain authority, backlink count, crawl errors. These metrics are not useless. But they are consistently presented as measures of programme success when they are more accurately described as measures of programme activity.
The distinction matters because activity and outcomes are not the same thing, and in my experience they frequently diverge. I have reviewed SEO reports that showed consistent month-on-month improvement across every standard metric while the business’s revenue from organic search was flat or declining. The reporting convention created a false picture of progress because the metrics being reported were not connected to the outcomes that mattered.
Analytics tools are a perspective on reality, not reality itself. This is true of all marketing measurement, but it is particularly acute in SEO, where the data is filtered through platforms that have their own methodologies, sampling approaches, and limitations. Treating Google Search Console data or a third-party rank tracker as ground truth is a category error that most practitioners make without noticing.
Hotjar’s survey tools are a useful example of a different kind of signal. Asking visitors directly how they found you, what they were looking for, and whether they found it produces qualitative data that sits alongside the quantitative picture from search tools. The combination is more honest than either alone. The convention of relying exclusively on platform data is partly a function of convenience and partly a function of the fact that qualitative signals are harder to put in a slide deck.
The Specialist Silo Convention
SEO is frequently organised as a specialist function that operates in relative isolation from the rest of the marketing programme. There is a team or an agency that handles search, and they report on search metrics, and they are evaluated on search metrics, and the connection between what they do and what the business needs is mediated through a set of proxy measures that everyone has agreed to treat as meaningful.
This silo convention produces SEO programmes that are technically competent and commercially disconnected. The search team optimises for rankings. The content team optimises for traffic. The paid team optimises for conversions. Nobody is optimising for the customer experience as a whole, because the organisational structure does not reward that kind of thinking.
Moz has explored what it means to redesign an SEO career in ways that go beyond technical specialisation, and the direction of travel in that conversation is toward broader commercial and strategic literacy. That is the right direction. The SEO practitioners who have been most effective in my experience are not the ones who know the most about technical implementation. They are the ones who understand the business well enough to make good decisions about where search effort should go.
The silo convention is also a commercial problem for agencies. Selling SEO as a standalone service, with its own deliverables and its own reporting cadence, is a clean business model. But it is not always the right model for the client. I have turned down scoped work before because the right answer for the client was an integrated approach that would not fit neatly into a search retainer. That is not a heroic decision. It is just commercially honest, and it is rarer than it should be in this industry.
What a Different Convention Would Look Like
If the conventions above are the problem, the solution is not a new set of conventions to follow uncritically. It is a different relationship with the question of what SEO is for.
SEO is a mechanism for connecting people who are looking for something with organisations that can provide it. When it works well, it creates value for both sides. When it is driven by conventions that prioritise activity over outcomes, it creates work that looks productive and produces very little of substance.
The teams that do this well share a few characteristics. They are clear about what commercial outcome they are trying to produce. They treat search data as one input among several rather than as the primary source of truth. They are willing to do less, and do it better, rather than more, and do it conventionally. And they connect their search work to the broader marketing and business strategy rather than treating it as a self-contained discipline.
Copyblogger’s reading recommendations for writers make a point that applies directly here: the best practitioners in any craft are the ones who have developed genuine taste and judgment, not just technical proficiency. SEO is no different. The conventions of the industry are a starting point, not a destination.
BCG’s work on technology and economic performance makes a related point about the difference between organisations that use technology to do existing things faster and organisations that use it to do fundamentally different things. Most SEO programmes are in the first category. The tools have improved dramatically. The underlying logic has not changed much at all.
If you want to think more systematically about how search fits into a broader commercial strategy, the Complete SEO Strategy hub is the right place to start. The individual tactics only make sense in the context of a coherent strategic framework, and that framework is where most SEO programmes are weakest.
About the Author
Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.
