8 SEO Practices That Google Has Moved On From
Several SEO tactics that were standard practice five to ten years ago now actively work against you. Google’s algorithm has matured considerably, and what once moved rankings (keyword stuffing, exact-match anchor text manipulation, low-quality link building) now triggers penalties or simply gets ignored. The practices below are not edge cases. They were mainstream, widely recommended, and many agencies still charge for them.
If your SEO programme is built on any of these eight foundations, you are not just wasting budget. You are likely suppressing the performance you could otherwise be generating.
Key Takeaways
- Keyword density as a ranking signal is functionally dead. Google evaluates topical authority and semantic relevance, not word frequency.
- Exact-match anchor text in link building is now a manipulation signal, not a quality one. Natural variation outperforms engineered ratios.
- Guest posting at scale for links, not editorial value, has been explicitly discounted by Google and can attract manual penalties.
- Thin content pages built around long-tail variants do not compound. They dilute crawl budget and create internal cannibalisation.
- Technical SEO hygiene still matters, but chasing PageSpeed scores in isolation, without fixing the underlying content and authority problems, produces marginal returns at best.
In This Article
- 1. Keyword Density Targets
- 2. Exact-Match Anchor Text Engineering
- 3. Guest Posting at Scale for Links
- 4. Thin Content Pages Built Around Long-Tail Variants
- 5. Treating PageSpeed Score as a Primary Ranking Lever
- 6. Ignoring Information Architecture in Favour of Content Volume
- 7. Social Signals as a Direct Ranking Factor
- 8. Meta Keyword Tags
- What to Do Instead
When I was scaling the SEO practice at iProspect, we grew from a team handling a handful of accounts to one managing programmes across 30 industries, eventually becoming a top-five office by revenue in a network of around 130 globally. A significant part of that growth came from doing the opposite of what a lot of competitors were still selling: we stopped optimising for the algorithm as it was and started building for where it was clearly heading. That instinct, more than any specific tactic, is what separated durable performance from short-term wins that evaporated with the next core update.
1. Keyword Density Targets
The idea that repeating a keyword phrase a specific number of times (typically cited as somewhere between 1% and 3% of total word count) would signal relevance to Google made sense in an era of simpler algorithms. That era ended a long time ago.
Google now processes language semantically. It understands synonyms, related concepts, and topical clusters. A page that mentions “running shoes” seventeen times does not outrank a page that covers the subject with genuine depth and variety. If anything, forced repetition reads as low quality, both to the algorithm and to the person who lands on the page.
Write for the topic, not the phrase. Cover the subject with the breadth and specificity your audience expects. The keyword will appear naturally, and that is exactly the signal Google is looking for.
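For illustration, the obsolete metric itself is trivial to compute, which is part of why it was so easy to game. A minimal sketch (the function name is illustrative; this is not anything Google uses):

```python
# Toy implementation of the obsolete "keyword density" metric:
# words belonging to matching phrase occurrences, divided by total
# word count. Shown only to illustrate how crude the signal was;
# modern ranking systems evaluate semantic relevance, not this number.
def keyword_density(text: str, phrase: str) -> float:
    words = text.lower().split()
    target = phrase.lower().split()
    n, total = len(target), len(words)
    if total == 0 or n == 0:
        return 0.0
    # Count overlapping occurrences of the phrase as a word sequence
    hits = sum(
        1 for i in range(total - n + 1)
        if words[i:i + n] == target
    )
    return hits * n / total
```

A page could hit any density "target" while saying nothing useful, which is precisely why the metric stopped predicting rankings.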
2. Exact-Match Anchor Text Engineering
There was a period when building links with anchor text that precisely matched your target keyword was considered best practice. SEOs would obsessively manage ratios: 20% exact match, 30% partial match, 50% branded or generic. Entire link building strategies were built around hitting those numbers.
Google’s Penguin update changed that calculation permanently. Unnatural anchor text patterns are now a manipulation signal. A link profile that looks engineered, because it is, attracts scrutiny. Natural link profiles have varied, contextually appropriate anchor text because that is how real editorial links work. Someone writing about your product uses the words that make sense in their sentence, not the phrase you want to rank for.
If your link building brief still includes anchor text ratios as a primary deliverable, that brief needs rewriting.
3. Guest Posting at Scale for Links
Guest posting is not dead. Publishing genuinely useful content on relevant, authoritative sites still builds brand, drives referral traffic, and earns links that carry real weight. That is not what this is about.
What no longer works is the industrialised version: bulk outreach to low-authority blogs, templated content written purely to house a backlink, and link placement as the sole objective. Google has been explicit about this. Links in content that exists primarily for SEO purposes are treated as paid links, regardless of whether money changed hands.
I have seen agencies sell guest posting packages at scale as a core deliverable, with metrics measured in volume of placements rather than quality of publications. That approach produces a link profile that looks busy but performs poorly, and it creates liability when the next update arrives. The Moz SEO quick-start guide frames link building correctly: earn links through content worth linking to, not through placement engineering.
4. Thin Content Pages Built Around Long-Tail Variants
The logic seemed sound at the time. If you could create a separate page for every long-tail variant of a keyword (“best running shoes for flat feet”, “best running shoes for flat feet women”, “best running shoes for flat feet wide width”), you could capture each search individually. Multiply that across a large site and the traffic potential looked significant on paper.
In practice, this approach creates several problems simultaneously. It dilutes crawl budget across pages with minimal individual value. It creates cannibalisation issues where multiple pages compete for the same or closely related queries. And it produces a poor user experience, which Google increasingly accounts for through page experience and engagement signals.
A single, comprehensive page that addresses a topic with genuine depth will consistently outperform five thin variants targeting minor keyword permutations. Consolidation, not proliferation, is the correct direction for most content programmes built on this model.
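Mechanically, consolidation usually means 301-redirecting the thin variant URLs to the comprehensive page, or, where a variant must stay live, pointing it at the main page with a canonical tag. A hypothetical example (the URL is illustrative):

```html
<!-- On a thin variant page that must remain live: tell search engines
     the comprehensive guide is the page to index and rank.
     (Hypothetical URL, for illustration only.) -->
<link rel="canonical" href="https://example.com/guides/running-shoes-for-flat-feet/">
```

Where the variant adds nothing for users, a permanent 301 redirect is the cleaner option, since it consolidates link equity as well as indexing.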
If you want to understand how this fits into a broader content and site architecture strategy, the full picture is in the Complete SEO Strategy hub.
5. Treating PageSpeed Score as a Primary Ranking Lever
Core Web Vitals are a real ranking factor. Page speed matters. But there is a category of technical SEO work that treats PageSpeed Insights scores as an end in themselves, chasing numbers in a vacuum while the actual content and authority problems on a site go unaddressed.
I have reviewed audits that ran to forty pages of technical recommendations (image compression ratios, render-blocking scripts, server response times) with no meaningful discussion of whether the content on those pages deserved to rank in the first place. A fast page with thin content is still a thin page. Google’s algorithm weighs technical performance as one signal among many, and it is rarely the determining factor when content quality and authority are the real gaps.
Fix genuine technical problems that affect crawlability and user experience. Do not mistake a green Lighthouse score for an SEO strategy.
6. Ignoring Information Architecture in Favour of Content Volume
Publishing more content is not the same as building a stronger site. This distinction matters more now than it did five years ago, partly because Google’s quality assessments have become more sophisticated, and partly because the volume of content being published has increased to the point where undifferentiated output simply does not surface.
Information architecture (how your content is structured, categorised, and interlinked) directly affects how Google understands your site’s topical authority. A well-structured site signals depth and coherence. A site with hundreds of loosely connected posts, no clear hierarchy, and inconsistent internal linking signals the opposite. Search Engine Land’s overview of information architecture for SEO covers the structural principles that still hold.
When we were building SEO as a high-margin service line, the work that consistently delivered long-term results was not the content production. It was the architecture decisions: which topics to own, how to structure the site around them, and how to connect content in ways that reinforced topical depth rather than just adding pages.
7. Social Signals as a Direct Ranking Factor
The claim that social shares, likes, and engagement directly influence Google rankings has circulated for years. It surfaces regularly in content marketing presentations as a reason to prioritise social distribution. Google has consistently said social signals are not a direct ranking factor, and any mechanism by which they could be would invite obvious manipulation.
The indirect relationship is real: content that earns genuine social traction tends to earn links, which do influence rankings. But optimising for social metrics as a proxy for SEO performance conflates correlation with causation. The content that performs well socially and the content that ranks well share a common cause (genuine quality and relevance) rather than one driving the other.
Build content worth sharing. Do not build content primarily designed to be shared, then expect ranking improvements as a result.
8. Meta Keyword Tags
This one should be settled, but it keeps appearing in SEO checklists and agency deliverables. Google stopped using the meta keywords tag as a ranking signal in 2009. Bing followed. No major search engine currently uses it. Populating meta keywords is not harmful, but it is wasted effort, and it tells you something about the currency of the advice you are receiving if it still appears as a recommended action.
The meta fields that still matter are the title tag and meta description. The title tag carries genuine weight as a relevance signal. The meta description does not directly influence rankings but affects click-through rate, which affects the traffic your rankings actually deliver. Those deserve attention. The keywords tag does not.
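As a concrete illustration of where that effort should go (the page and copy below are hypothetical):

```html
<head>
  <!-- Carries genuine weight as a relevance signal -->
  <title>Running Shoes for Flat Feet: A Buyer's Guide</title>

  <!-- Not a ranking signal, but shapes the search snippet and click-through rate -->
  <meta name="description" content="How to choose running shoes for flat feet, with fit and support criteria explained.">

  <!-- Ignored by every major search engine; safe to delete -->
  <meta name="keywords" content="running shoes, flat feet, best shoes">
</head>
```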
There is a broader point here. SEO checklists age badly. Tactics that were correct in 2012 or 2015 or 2019 are not necessarily correct now, and some are actively counterproductive. The discipline requires ongoing recalibration, not adherence to a fixed list of best practices. Moz’s Whiteboard Friday on explaining SEO value is useful context for how to think about SEO as a dynamic discipline rather than a static checklist.
What to Do Instead
The thread connecting all eight of these practices is the same: they optimise for a signal rather than for the underlying quality that signal was designed to measure. Google’s algorithm has consistently moved toward rewarding the real thing (genuine expertise, authentic authority, content that serves the reader) and away from rewarding the simulation of those qualities.
That shift is not a threat to good SEO. It is a vindication of it. The work that builds durable organic performance (developing genuine topical authority, earning links through content worth referencing, structuring sites for clarity and depth) has always been the right approach. The tactics above were shortcuts that worked until they did not.
One practical observation from managing large-scale programmes: the clients who saw the most consistent long-term growth were the ones willing to invest in the slower, less immediately measurable work. Building topical authority takes time. Earning editorial links takes time. Restructuring a site’s information architecture takes time. The clients who wanted faster signals often ended up on the wrong side of algorithm updates, rebuilding from a lower base.
Auditing your current programme against these eight practices is a reasonable starting point. If any of them are still active deliverables, the question is not just whether to stop. It is what to replace them with, and whether the agency or team running your SEO has a clear answer to that question.
For a structured view of how all the components of an effective SEO programme fit together, the Complete SEO Strategy hub covers everything from technical foundations to content architecture to measurement, without the tactics that no longer hold up.
About the Author
Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.
