SEO Convention: Rules Worth Breaking and Rules Worth Keeping
SEO convention refers to the widely accepted practices, defaults, and orthodoxies that shape how most marketers approach search optimisation. Some of these conventions are grounded in genuine evidence. Others are inherited assumptions that have never been seriously tested, repeated often enough that they feel like facts.
Knowing which is which is one of the most commercially valuable skills in SEO. Following convention blindly produces average results. Ignoring it entirely produces expensive mistakes. The real work is in the distinction.
Key Takeaways
- Many SEO conventions are inherited assumptions, not evidence-based rules. Treating them as fixed laws limits your strategic thinking.
- Following standard practice in a competitive market often means competing for the same positions with the same tactics. Differentiation requires deliberate deviation.
- Some conventions exist for good reasons and breaking them has a cost. The skill is knowing which category any given rule falls into before you act.
- Benchmarking against industry convention without benchmarking against your actual competitors tells you very little about your true competitive position.
- The most dangerous SEO mistakes often come not from ignoring the rules but from following them without understanding why they exist.
In This Article
- Where SEO Convention Comes From
- The Conventions That Are Worth Keeping
- The Conventions That Deserve More Scrutiny
- When Following Convention Becomes a Competitive Liability
- The Problem With SEO Checklists
- How to Evaluate Any SEO Convention Before You Follow It
- Convention and the Measurement Problem
- What Commercially Grounded SEO Looks Like in Practice
Where SEO Convention Comes From
Most SEO convention originates from one of three places: Google’s own guidance, the accumulated output of the SEO community, or cargo-culted practices that spread because they appeared to work for someone else at some point.
Google’s documentation and public statements are useful but deliberately incomplete. The company does not publish its full ranking algorithm, and it has a commercial incentive to shape how the SEO industry behaves. Taking every piece of Google communication at face value is not critical thinking. It is deference.
The SEO community produces enormous volumes of content. Some of it is excellent. A lot of it is correlation dressed up as causation. A site with strong backlinks ranks well, so the conclusion becomes “build more backlinks.” A site with long-form content dominates a niche, so the conclusion becomes “write longer content.” These observations are not wrong, but they are incomplete, and they become dangerous when applied without context.
I spent several years running an agency where we grew from around 20 people to over 100. One of the things I noticed consistently was how quickly junior strategists would apply a tactic they had seen succeed in one industry to a completely different context, because the convention said it worked. It sometimes did. Often it did not. The convention had no mechanism for telling them which situation they were in.
If you want a grounded view of how to build a complete SEO strategy rather than a collection of inherited tactics, the Complete SEO Strategy hub covers the full picture, from technical foundations to content architecture to competitive positioning.
The Conventions That Are Worth Keeping
Not all convention deserves scrutiny. Some of it reflects hard-won understanding of how search engines work and what users actually want. Dismissing it to appear contrarian is its own kind of intellectual laziness.
Technical hygiene is largely non-negotiable. Pages need to load quickly, be crawlable, and render correctly. Canonical tags need to be set properly. Redirects need to resolve cleanly. These are not opinions. They are prerequisites. The convention around technical SEO exists because the underlying mechanics are real, and ignoring them creates compounding problems that are expensive to unwind.
Search intent alignment is similarly grounded. If someone searches for a comparison of two products and you serve them a product page, you will not rank well regardless of how technically polished that page is. The convention that content should match what users are actually looking for reflects something true about how search engines evaluate relevance. It is not arbitrary.
The guidance around E-E-A-T (experience, expertise, authoritativeness, and trustworthiness) is also worth taking seriously, particularly for content in categories where accuracy matters. The team at Search Engine Journal has written thoughtfully about how Google evaluates signals of credibility, and the underlying logic is sound even if the specific mechanics are opaque.
The conventions worth keeping are the ones with a clear mechanism behind them. You can explain why they work, not just observe that they tend to.
The Conventions That Deserve More Scrutiny
The conventions that cause the most damage are the ones that feel authoritative because they are repeated constantly, not because they have been properly tested.
Word count targets are a persistent example. The idea that content needs to be 2,000 words, or 3,000, or whatever the current benchmark is, to rank well is not a rule. It is a proxy. Long-form content often ranks well because it tends to be more comprehensive, not because length itself is a ranking factor. When teams optimise for word count rather than comprehensiveness, they produce padded content that satisfies a metric without serving the user. I have reviewed content audits from clients where pages were hitting 3,500 words and ranking nowhere, because the length had been achieved by repeating the same points in slightly different language.
The convention around publishing frequency is another one worth examining. Consistent publishing is valuable. Publishing at a pace your team cannot sustain in order to hit an arbitrary cadence is not. Thin content produced to fill a content calendar does more harm than good, and the SEO community has known this for years while continuing to recommend aggressive publishing schedules.
Keyword density is largely a relic. The idea that a specific percentage of keyword repetition improves rankings has not reflected how modern search engines work for a long time. Yet it persists in briefs, audits, and recommendations because it is measurable and therefore feels rigorous. Measurability is not the same as relevance.
The team at Moz has written about how SEO is shifting toward signals that are harder to game and harder to fake with conventional checklists. That shift makes the gap between teams that understand why rules exist and teams that just follow them wider every year.
When Following Convention Becomes a Competitive Liability
Here is a scenario I have seen play out more than once. A business looks at what its competitors are doing in search, identifies the conventions they are following, and builds a strategy to do the same things slightly better. More content, more backlinks, faster pages, better on-page optimisation. The strategy is executed competently. Rankings improve marginally. The business concludes that SEO is working.
What it has not done is ask whether following the same conventions as everyone else in the category is the right approach at all. If every competitor is targeting the same head terms with the same content formats and the same link-building approaches, competing harder on those dimensions is expensive and often insufficient. The gains are incremental because the strategy is incremental.
I think about this the same way I think about market share versus absolute revenue. I have seen businesses celebrate a 10% revenue increase without noticing that the market grew by 20% in the same period. The number looks good. The position is worse. Following SEO convention in a category where all competitors follow the same convention produces the same illusion. You are improving in isolation while your relative position stays flat or deteriorates.
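The arithmetic behind that illusion is worth making concrete. The figures below are purely illustrative (hypothetical revenue and market sizes, not data from the text), but they show how 10% revenue growth against 20% market growth translates into a smaller share:

```python
# Illustrative numbers only: a business grows revenue by 10% while the
# overall market grows by 20% in the same period. The starting values
# are hypothetical.
market_before, revenue_before = 100.0, 10.0   # arbitrary units

market_after = market_before * 1.20   # market grows 20%
revenue_after = revenue_before * 1.10  # revenue grows 10%

share_before = revenue_before / market_before
share_after = revenue_after / market_after

print(f"Share before: {share_before:.1%}")  # 10.0%
print(f"Share after:  {share_after:.1%}")   # 9.2%
```

The revenue number went up, yet the business now holds a smaller slice of a bigger market, which is exactly the "improving in isolation" pattern described above.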
The businesses that break out of this pattern tend to do one of two things. They find a content angle that competitors have ignored, often because it does not fit neatly into the conventional keyword research process. Or they build genuine authority in a narrower space rather than competing broadly on conventional terms. Neither of these approaches comes from following the standard playbook more diligently.
The Problem With SEO Checklists
Checklists are useful. I have used them throughout my career: in agency operations, in client onboarding, in campaign launches. They reduce the cost of routine decisions and prevent avoidable errors. But they have a failure mode that is worth understanding: they work well when the situation is standard and break down when it is not.
SEO checklists encode convention. Run through the list, tick the boxes, and you have done what convention requires. The problem is that ticking boxes and making good decisions are not the same thing. A checklist can tell you whether your title tag is under 60 characters. It cannot tell you whether you are targeting the right query, whether the content angle you have chosen is differentiated, or whether the page you are optimising should exist at all.
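The title-tag example makes the limitation concrete. A mechanical check like the sketch below (a hypothetical helper, with the conventional 60-character threshold from the text, not a ranking rule) is trivial to automate; the strategic questions that follow it are not:

```python
# A minimal sketch of the kind of mechanical check an SEO checklist
# encodes. The 60-character limit is the convention mentioned in the
# text, not a confirmed ranking factor.
def title_tag_ok(title: str, max_chars: int = 60) -> bool:
    """Return True if the title tag fits the conventional length limit."""
    return len(title) <= max_chars

print(title_tag_ok("A short, descriptive page title"))  # True
```

A function like this can tick the box. It cannot tell you whether the page targets the right query, whether the angle is differentiated, or whether the page should exist at all.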
I have managed teams where the SOPs were excellent and the output was mediocre, because people were executing the process without engaging with the underlying question of whether the process was right for the situation. The checklist became a substitute for thinking rather than a support for it. That is a management problem as much as a strategic one, but it is also a structural problem with how SEO convention gets operationalised.
The teams that use checklists well treat them as a floor, not a ceiling. The checklist ensures nothing obvious is missed. The thinking above the checklist is where the competitive advantage comes from.
How to Evaluate Any SEO Convention Before You Follow It
There is a straightforward set of questions worth asking before treating any SEO convention as fixed guidance.
First: what is the mechanism? If you cannot explain why a practice is supposed to work, not just that it tends to, you are following it on faith. That is sometimes acceptable when the stakes are low, but not when you are allocating significant budget or team time.
Second: what is the evidence base? Is this convention derived from controlled testing, from correlation in large datasets, or from anecdote? The SEO industry has a weak relationship with controlled experimentation. Most “best practices” are based on observational data, which is useful but not conclusive.
Third: does this apply to my specific situation? A convention that holds across most niches may not hold in yours. Highly specialised B2B categories, for example, often behave differently from consumer categories. Local search behaves differently from national search. International SEO introduces variables that most convention assumes away. Tools like the Moz domain overview can give you a useful starting point for competitive benchmarking, but benchmarking against industry norms rather than your actual competitors is a common and expensive mistake.
Fourth: what is the cost of being wrong? Some SEO conventions, if you break them, have a recoverable cost. Others, particularly around technical architecture or site structure, can create problems that take months to unwind. The appropriate level of scrutiny should scale with the cost of a mistake.
Convention and the Measurement Problem
One reason SEO convention persists even when it does not serve teams well is that the measurement environment makes it hard to isolate what is actually working. SEO operates over long timeframes. Multiple changes happen simultaneously. External factors (algorithm updates, competitor activity, seasonal demand shifts) all affect rankings in ways that are difficult to attribute cleanly.
In that environment, convention provides a kind of comfort. If you followed the standard approach and results were disappointing, the convention provides cover. If you deviated from convention and results were disappointing, the deviation is blamed. This asymmetry rewards conformity even when conformity is not the right answer.
I spent time judging the Effie Awards, which evaluate marketing effectiveness. One of the things that struck me was how rarely the winning work was conventional. The campaigns that drove genuine business outcomes tended to involve a deliberate choice to do something different from category norms. That is not a coincidence. It reflects the fact that convention, by definition, is what everyone is already doing. Differentiation requires deviation.
The measurement problem in SEO does not mean you should ignore data. It means you should be honest about what the data actually tells you. A ranking improvement is not the same as a revenue improvement. Traffic growth is not the same as qualified traffic growth. Convention often optimises for the metrics that are easiest to measure rather than the ones that matter most commercially.
What Commercially Grounded SEO Looks Like in Practice
The alternative to following convention blindly is not to ignore it. It is to hold it more lightly, to treat it as a starting point rather than an answer, and to build in the kind of deliberate questioning that most SEO processes do not have room for.
In practice, this means starting with the business problem rather than the SEO problem. What does this business need from search? More qualified leads in a specific category? Improved visibility in a geography where a competitor is dominant? Better coverage of high-intent queries that currently convert through paid channels? The answer to that question shapes which conventions are relevant and which are not.
It means doing competitive analysis that goes beyond keyword gap reports. Understanding why a competitor ranks well, not just that they do, is what allows you to find the angles they have missed or the positions they have conceded. Experience-led approaches to digital increasingly recognise that ranking is a means to an end, not the end itself. The page that ranks has to convert, and conversion depends on things that conventional SEO optimisation does not always address.
It means being honest about the difference between SEO activity and SEO results. Publishing content is activity. Ranking for queries that drive qualified traffic is a result. The two are related but not the same, and convention often conflates them because activity is easier to measure and easier to report.
And it means building a team culture where questioning the standard approach is encouraged rather than treated as a sign of inexperience. The most commercially effective SEO teams I have worked with were not the ones with the most rigorous checklists. They were the ones where people asked “why are we doing this?” often enough to catch the cases where the convention did not fit.
If you want to see how these principles apply across the full scope of an SEO programme, rather than at the level of individual tactics, the Complete SEO Strategy hub brings together the strategic and operational elements in a way that is designed to be commercially useful rather than just technically comprehensive.
About the Author
Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.
