SEO Basics: What Moves the Needle

SEO basics cover the foundational practices that help search engines understand, index, and rank your web pages: technical health, on-page signals, content relevance, and links pointing to your site. Get these fundamentals right and you create the conditions for organic visibility. Ignore them and no amount of tactical sophistication will compensate.

Most SEO problems I encounter are not exotic. They are unresolved basics that have been papered over with complexity. A site with crawl issues, thin content, and no coherent link profile will not be rescued by schema markup or topical authority mapping. The foundations have to be solid first.

Key Takeaways

  • SEO fundamentals (technical health, content relevance, and links) account for the majority of ranking performance. Advanced tactics built on weak foundations rarely hold.
  • Crawlability and indexation are prerequisites, not nice-to-haves. If Google cannot reliably access your pages, nothing else matters.
  • Keyword research is not about finding high-volume terms. It is about understanding what your audience is actually trying to accomplish and whether your content answers that better than the competition.
  • Link acquisition is still a significant ranking factor, but the quality-to-quantity ratio has shifted dramatically. Ten editorially earned links from relevant sites outperform a hundred directory submissions.
  • Most SEO measurement is directionally useful but not precise. Treat ranking and traffic data as signals, not verdicts.

Why Most SEO Programmes Struggle Before They Start

I have reviewed a lot of SEO audits over the years, both as an agency CEO and as someone who has inherited other agencies’ work on client accounts. The pattern is consistent: organisations invest in content production, link outreach, and technical sprints without first establishing what their site can and cannot do. They build on unstable ground and then wonder why results plateau.

The basics are not glamorous. Nobody wins an award for fixing a canonical tag issue or resolving a crawl budget problem. But they are the difference between a site that compounds organic growth over time and one that churns effort without accumulating equity. I have seen well-funded programmes with talented teams underperform because the technical infrastructure was never properly addressed. And I have seen modest budgets punch above their weight because the fundamentals were clean and the content was genuinely useful.

If you want a complete framework for how SEO fits into your broader acquisition strategy, the Complete SEO Strategy hub covers the full picture from positioning to measurement. This article focuses on the foundational layer: the mechanics you need to understand before anything else.

How Search Engines Actually Work

Search engines operate in three stages: crawling, indexing, and ranking. Crawling is the process by which bots discover and retrieve pages across the web, following links from one URL to the next. Indexing is the process of storing and organising that content so it can be retrieved. Ranking is the process of ordering indexed pages in response to a specific query.

Most SEO content skips over the first two stages and goes straight to ranking. That is a mistake. If your pages are not being crawled efficiently, they may not be indexed. If they are not indexed, they cannot rank. A site with crawl inefficiencies, blocked resources, or misconfigured directives is starting from a deficit that keyword optimisation alone will not fix.

Google’s crawl budget, loosely defined, is the number of pages Googlebot will crawl on your site within a given period. For large sites, this matters a great deal. For smaller sites, it is less of a concern, but the principle holds: make it easy for search engines to find and understand your most important pages. Redirect chains, orphaned pages, and bloated XML sitemaps all create friction that works against you.
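The friction described above is straightforward to detect in a crawl export. As an illustrative sketch (the data structures here are hypothetical, not any particular crawler's format), given a map of each page's internal links, orphaned pages are simply pages nothing links to, and redirect chains can be measured by following a source-to-destination map:

```python
# Hypothetical sketch: audit a crawl's internal link map for friction points.
# `site_links` maps each crawled URL to the URLs it links to internally.

def find_orphans(site_links, all_pages):
    """Return pages no other page links to (the homepage is exempt)."""
    linked_to = {target for targets in site_links.values() for target in targets}
    return sorted(p for p in all_pages if p not in linked_to and p != "/")

def redirect_chain_length(redirects, url, limit=10):
    """Count hops in a redirect chain; `redirects` maps source -> destination."""
    hops = 0
    while url in redirects and hops < limit:
        url = redirects[url]
        hops += 1
    return hops
```

Any page surfacing in `find_orphans`, or any chain longer than one hop, is a candidate for an internal link or a direct redirect.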

The evolution of search engine infrastructure over the past two decades has been significant, but the core principle has not changed. Search engines want to serve the most relevant, authoritative, and trustworthy result for any given query. Your job is to make it unambiguous that your content meets that standard.

Technical SEO: The Infrastructure Layer

Technical SEO is the practice of ensuring your site is accessible, crawlable, and interpretable by search engines. It is not about tricks or exploits. It is about removing the friction between your content and the systems that need to evaluate it.

The core technical elements every site needs to get right are these:

Site architecture and internal linking

Your site’s structure communicates hierarchy and importance to search engines. Pages that are linked to frequently from within your own site are interpreted as more important than those buried three or four clicks from the homepage. A flat architecture, where important pages are accessible within two or three clicks, generally outperforms a deeply nested one for both crawlability and user experience.

Internal linking is also how you distribute what SEOs call link equity, the authority signal passed from one page to another. A strong piece of content that attracts external links can pass some of that signal to related pages through thoughtful internal links. This is one of the most underused levers in SEO, particularly on content-heavy sites where pages accumulate without any deliberate linking strategy.
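The mechanics of link equity can be made concrete with a simplified PageRank-style iteration. This is a teaching sketch only: real ranking systems are far more complex, and the damping factor and iteration count here are conventional textbook values, not anything Google publishes about its current systems.

```python
# Illustrative only: a simplified PageRank-style iteration over an internal
# link graph, showing how equity concentrates on pages that are linked to often.

def link_equity(graph, iterations=50, damping=0.85):
    pages = list(graph)
    score = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in graph.items():
            if not outlinks:
                continue
            share = damping * score[page] / len(outlinks)  # split equity across outlinks
            for target in outlinks:
                if target in new:
                    new[target] += share
        score = new
    return score
```

Run it on a hub-and-spoke structure and the hub page accumulates the most equity, which is exactly the behaviour a deliberate internal linking strategy exploits.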

Page speed and Core Web Vitals

Google has incorporated page experience signals into its ranking systems, including the Core Web Vitals metrics: Largest Contentful Paint, Interaction to Next Paint, and Cumulative Layout Shift. These measure loading performance, interactivity, and visual stability respectively.

The honest position on Core Web Vitals is that they are a ranking factor, but not a dominant one. A slow page with genuinely excellent, authoritative content will generally outrank a fast page with mediocre content. That said, page speed has a direct effect on user behaviour, and user behaviour data consistently shows that slow load times increase abandonment rates. The business case for speed is not just about rankings.
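Google publishes explicit "good" thresholds for each metric: 2.5 seconds for LCP, 200 milliseconds for INP, and 0.1 for CLS. A minimal classifier makes the pass/fail logic explicit (field values would normally come from CrUX or your own RUM data):

```python
# The "good" thresholds Google publishes for Core Web Vitals, as a classifier.

CWV_GOOD = {"lcp_seconds": 2.5, "inp_ms": 200, "cls": 0.1}

def cwv_assessment(lcp_seconds, inp_ms, cls):
    results = {
        "lcp": lcp_seconds <= CWV_GOOD["lcp_seconds"],
        "inp": inp_ms <= CWV_GOOD["inp_ms"],
        "cls": cls <= CWV_GOOD["cls"],
    }
    results["passes"] = all(results.values())  # a page must pass all three
    return results
```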

HTTPS and security

HTTPS has been a confirmed Google ranking signal since 2014. If your site is still running on HTTP, fix it. This is not a marginal optimisation. It is a baseline requirement, and browsers now actively warn users about non-secure sites. The impact on trust, quite apart from rankings, is real.

Mobile usability

Google uses mobile-first indexing, which means it primarily uses the mobile version of your content for indexing and ranking. If your mobile experience is degraded relative to desktop, that is the version being evaluated. Responsive design is the standard approach. Check that content is not hidden or truncated on mobile, that tap targets are appropriately sized, and that fonts are legible without zooming.

Structured data

Structured data, typically implemented using Schema.org markup, helps search engines understand the content and context of your pages. It does not directly improve rankings, but it enables rich results in search, including FAQ panels, review stars, product pricing, and event details. These enhanced listings can meaningfully improve click-through rates. The FAQ schema embedded at the bottom of this article is a practical example.
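For illustration, FAQ markup follows the Schema.org FAQPage shape: a `mainEntity` array of Question items, each carrying an `acceptedAnswer`. A small builder (the question text here is placeholder content) shows the structure:

```python
import json

# Minimal FAQPage JSON-LD builder following the Schema.org FAQPage shape.
# `pairs` is a list of (question, answer) tuples.

def faq_jsonld(pairs):
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in pairs
        ],
    }, indent=2)
```

The output goes in a `<script type="application/ld+json">` tag on the page carrying the FAQ content.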

Keyword Research: What It Is and What It Is Not

Keyword research is the process of identifying the terms and phrases your target audience uses when searching for information, products, or services related to your business. Done well, it tells you what demand exists, how competitive that demand is, and where your content can realistically compete.

Done badly, it produces a list of high-volume terms with no connection to what your business actually does or what your audience actually needs. I have seen keyword strategies built around terms that looked impressive in a spreadsheet but drove traffic that converted at near-zero rates. Volume without intent alignment is noise.

The practical process looks like this:

Start with your audience, not the tools

Before opening a keyword tool, write down the questions your customers actually ask. What problems are they trying to solve? What language do they use? What comparisons are they making? This is the raw material. The tools help you validate and expand it, but they cannot replace the underlying understanding of your audience.

Early in my agency career, I spent a week sitting with a client’s customer service team before building out a content strategy. The language customers used when calling in was completely different from the polished terminology the marketing team preferred. The customer service team’s vocabulary was what people were actually searching for. That gap between internal language and customer language is where a lot of keyword strategies fall apart.

Understand search intent

Every search query has an intent behind it: informational (I want to learn something), navigational (I want to find a specific site), commercial (I am comparing options before buying), or transactional (I am ready to buy). Your content needs to match the intent of the keywords you are targeting, not just include the words.

A page optimised for a transactional query that delivers an informational article will not rank well for that query, even if the content is excellent. Google has spent years learning to distinguish intent, and it is very good at it. Targeting a keyword without understanding the intent behind it is one of the most common and costly mistakes in SEO.

Assess difficulty honestly

Keyword difficulty scores in tools like Ahrefs, Semrush, and Moz are useful approximations, not precise measurements. They typically reflect the link authority of pages currently ranking for a term. A high difficulty score means the current ranking pages have significant link equity. It does not mean the query is impossible to compete for, particularly if the existing content is weak or does not fully address the intent.

When I was growing an agency from around 20 people to over 100, we won a lot of new business by targeting prospects who were underserved by the dominant players in their category. The same logic applies to keyword strategy. The question is not just “how hard is this keyword?” but “how well are the current results actually serving the searcher?”

On-Page Optimisation: The Signals That Still Count

On-page optimisation is the practice of structuring and writing your content so that search engines can clearly understand what it is about and why it is relevant to specific queries. The fundamentals here have not changed dramatically, even as Google’s algorithms have become more sophisticated.

Title tags

The title tag is still one of the most important on-page signals. It appears in search results as the clickable headline and tells both users and search engines what the page is about. Include your primary keyword, ideally near the front. Keep it under 60 characters to avoid truncation in search results. Write it for humans first, but do not ignore the keyword.

Google rewrites title tags in search results when it determines the original tag does not accurately represent the page content. If you find your titles being rewritten consistently, it is usually a sign that your tags are either too keyword-heavy, too vague, or misaligned with the actual content of the page.

Meta descriptions

Meta descriptions are not a direct ranking factor, but they influence click-through rates, which have an indirect effect on performance. A well-written meta description that accurately describes the page and gives a searcher a reason to click is worth the effort. Keep it between 130 and 155 characters. Avoid starting with commands like “Learn how to” or “Discover.” Write a statement that communicates the value of the page.
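The length guidelines for titles and meta descriptions are easy to check in bulk. A caveat: Google truncates on pixel width, not character count, so these checks are a proxy, not a guarantee.

```python
# Length checks matching the guidelines above: under 60 characters for
# titles, 130-155 for meta descriptions. Character counts approximate the
# pixel-width limit Google actually applies.

def check_title(title, limit=60):
    return {"length": len(title), "ok": len(title) <= limit}

def check_meta_description(text, low=130, high=155):
    return {"length": len(text), "ok": low <= len(text) <= high}
```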

Heading structure

Use one H1 per page, containing your primary keyword. Use H2s for main sections and H3s for subsections within those. This is not just about SEO. It is about readability and information hierarchy. A page with a clear, logical heading structure is easier for both search engines and humans to process.
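Pages with zero or multiple H1s are simple to flag programmatically. A sketch using Python's standard-library HTML parser:

```python
from html.parser import HTMLParser

# Audit sketch: count heading tags so pages with zero or multiple H1s
# can be flagged during a content review.

class HeadingCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.counts = {"h1": 0, "h2": 0, "h3": 0}

    def handle_starttag(self, tag, attrs):
        if tag in self.counts:
            self.counts[tag] += 1

def heading_counts(html):
    parser = HeadingCounter()
    parser.feed(html)
    return parser.counts
```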

Content depth and coverage

There is a persistent myth that longer content always ranks better. Length is not the variable. Coverage is. A page that comprehensively addresses a query, answers the follow-up questions a searcher is likely to have, and does so with genuine clarity will outperform a longer page that pads its word count without adding value.

I judged the Effie Awards for a period, which gave me a useful perspective on what “effective” actually means in practice. The campaigns that won were not the most elaborate. They were the most precisely matched to a specific audience need. The same principle applies to content. Precision beats volume.

URL structure

URLs should be short, descriptive, and include the primary keyword where natural. Avoid dynamic parameters where possible. Use hyphens to separate words, not underscores. A URL like /seo-basics/ is preferable to /page?id=4721&cat=12. Clean URLs are easier to share, easier to read, and send a clearer signal about page content.
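A slug generator following those conventions is a few lines. This sketch lowercases the title and collapses anything that is not a letter or digit into a single hyphen:

```python
import re

# Simple slug generator: lowercase, hyphens between words,
# no underscores or query parameters.

def slugify(title):
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # runs of non-alphanumerics become one hyphen
    return slug.strip("-")
```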

Image optimisation

Images should have descriptive file names and alt text. Alt text serves two purposes: it helps visually impaired users understand the image content, and it provides a contextual signal to search engines. Do not stuff keywords into alt text. Describe the image accurately. If the image is decorative and adds no informational value, an empty alt attribute is appropriate.
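Missing alt attributes are another check worth automating. This sketch distinguishes a missing attribute from a deliberately empty one (`alt=""`), which, as above, is legitimate for decorative images:

```python
from html.parser import HTMLParser

# Audit sketch: list image sources with no alt attribute at all.
# A present-but-empty alt (alt="") is treated as intentional.

class AltAuditor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_names = [name for name, _ in attrs]
            if "alt" not in attr_names:
                src = dict(attrs).get("src", "(no src)")
                self.missing.append(src)

def images_missing_alt(html):
    auditor = AltAuditor()
    auditor.feed(html)
    return auditor.missing
```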

Content Strategy: The Difference Between Ranking and Mattering

Content is the substance of SEO. Without it, there is nothing to rank. But content strategy is frequently misunderstood as a production challenge rather than a relevance challenge. The question is not “how much content should we publish?” It is “what content will genuinely serve our audience better than what currently exists?”

The most effective content programmes I have seen share a common characteristic: they are built around a clear understanding of the audience’s information needs at different stages of the decision process. They are not built around keyword volume spreadsheets or editorial calendars designed to fill a publishing schedule.

Early in my time at an agency, we inherited a content programme from a previous agency that had produced over 200 blog posts in 18 months. Almost none of them ranked for anything meaningful. The posts were not bad writing. They were just not connected to anything the target audience was searching for, and they did not demonstrate any particular expertise. They were content for content’s sake. We consolidated the best 30 posts, updated them with genuine depth and proper keyword alignment, and retired the rest. Organic traffic from that content set more than doubled within six months.

A few principles that hold up in practice:

Cover topics, not just keywords. A single well-constructed piece that comprehensively addresses a topic will typically outperform a cluster of thin posts each targeting a slightly different keyword variation. Build content around the full scope of what a searcher needs to know, not around individual search terms.

Update existing content before creating new content. A page that was ranking on page two six months ago and has since dropped is often easier to recover than it is to build a new page to the same position. Audit your existing content regularly. Look for pages with declining impressions or positions, assess whether the content is still accurate and comprehensive, and update before you create.

Demonstrate genuine expertise. Google’s quality guidance places significant emphasis on what it calls Experience, Expertise, Authoritativeness, and Trustworthiness. This is not just about credentials. It is about whether the content reflects real knowledge and first-hand experience. Content that could have been written by anyone, about anything, for any audience, is not going to compete in categories where subject matter expertise is expected.

Link Building: Quality Over Volume

Links from other websites to yours remain one of the strongest ranking signals in Google’s algorithm. A link from a credible, relevant site is interpreted as a vote of confidence in your content. The more credible and relevant the linking site, the stronger the signal.

The link building landscape has changed significantly since the early days of SEO, when volume was the dominant variable. Manipulative link schemes, paid link networks, and low-quality directory submissions have been progressively devalued or penalised. What matters now is editorial quality: links that exist because another site genuinely found your content worth referencing.

The most sustainable link acquisition approaches are also the most straightforward. Create content that is genuinely useful and well-researched, and people will link to it. Build relationships with journalists, bloggers, and publishers in your industry, and they will reference your work when it is relevant. Produce original data, analysis, or perspectives that are not available elsewhere, and you give people a reason to cite you.

There are also more proactive approaches that are entirely legitimate: digital PR, where you create newsworthy content or data and pitch it to relevant publications; guest contributions to authoritative industry sites; and broken link building, where you identify broken links on relevant sites and offer your content as a replacement. None of these are shortcuts. They require effort and genuine quality. But they produce links that hold their value over time.

One thing I have noticed consistently: organisations that treat link building as a separate activity from content strategy tend to struggle. The ones that produce content worth linking to and then actively promote it to relevant audiences tend to accumulate links more naturally. The two activities should be integrated, not siloed.

Building the skills to evaluate and execute link acquisition is not trivial. Identifying and filling SEO skill gaps within your team is often a prerequisite for making real progress in this area, and the soft skills required for effective SEO are frequently underestimated. Relationship-building, communication, and editorial judgment matter as much as technical knowledge when it comes to earning links.

Local SEO: A Distinct Set of Basics

If your business serves a specific geographic area, local SEO requires its own attention. The fundamentals overlap with general SEO, but there are additional signals that matter specifically for local search visibility.

Google Business Profile is the starting point. A complete, accurate, and regularly updated profile is foundational for appearing in local search results and the map pack. Ensure your business name, address, and phone number are consistent across your website and all online directories. Inconsistencies in this data create ambiguity that can suppress local rankings.

Reviews are a significant local ranking factor and a trust signal for potential customers. A consistent stream of genuine, positive reviews from verified customers is more valuable than a large volume of reviews accumulated in a short period. The latter can trigger quality filters. Encourage reviews as part of your normal customer communication, not as a one-time push.

Local content, meaning content that references your geographic area and addresses the specific needs of a local audience, supports local relevance signals. This does not mean stuffing location names into existing content. It means creating content that is genuinely useful to people in your area and that reflects local context.

Measuring SEO: What the Numbers Actually Tell You

SEO measurement is more complicated than most practitioners admit. The data we have access to is useful, but it is not a transparent window into performance. It is a set of approximations, and treating it as precise measurement leads to bad decisions.

Google Search Console is the closest thing to a primary source. It shows you the queries your pages are appearing for, the positions they are appearing in, and the click-through rates they are achieving. The data has limitations: position averages can mask significant variation, and the query data is sampled at scale. But it is more reliable than third-party rank trackers for understanding what is actually happening in search.

Organic traffic in your analytics platform is a useful directional metric, but it conflates a lot of things. Branded searches (people searching for your company name directly) are included in organic traffic. So are navigational queries from existing customers. Neither of these represents the kind of demand generation that SEO is typically trying to achieve. Segment your organic traffic by branded versus non-branded to get a clearer picture of acquisition performance.
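The branded versus non-branded split can be done directly on a Search Console query export. A sketch, where `brand_terms` would be your own brand name and its common variants and misspellings:

```python
# Segmentation sketch: split Search Console queries into branded and
# non-branded buckets using a list of brand-term variants.

def split_branded(queries, brand_terms):
    branded, non_branded = [], []
    for query in queries:
        normalised = query.lower()
        if any(term in normalised for term in brand_terms):
            branded.append(query)
        else:
            non_branded.append(query)
    return branded, non_branded
```

Track the non-branded bucket over time; that is the portion of organic traffic that reflects new demand acquisition rather than existing awareness.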

I have a general scepticism about attribution models that assign precise revenue credit to individual marketing channels. The modelled approach to marketing contribution that Forrester and others have written about acknowledges that marketing’s impact is rarely linear or channel-specific. SEO contributes to awareness, consideration, and conversion across time horizons that last-click attribution will never capture. Report on what you can measure honestly, and be clear about what the numbers cannot tell you.

The metrics I find most useful in practice are: non-branded organic sessions (demand acquisition), organic conversion rate (quality of traffic), keyword position distribution (how many keywords rank in positions 1-3 versus 4-10 versus 11-20), and indexed page count relative to total page count (technical health indicator). None of these alone tells the full story, but together they give a reasonable picture of programme health.
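The position-distribution metric above is just a bucketing exercise over tracked keyword positions:

```python
from collections import Counter

# Bucket keyword positions into the ranges used for the
# position-distribution metric described above.

def position_distribution(positions):
    buckets = Counter()
    for pos in positions:
        if pos <= 3:
            buckets["1-3"] += 1
        elif pos <= 10:
            buckets["4-10"] += 1
        elif pos <= 20:
            buckets["11-20"] += 1
        else:
            buckets["21+"] += 1
    return dict(buckets)
```

A healthy programme shows keywords migrating from the outer buckets toward positions 1-3 over successive months.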

Common SEO Mistakes That Are Entirely Avoidable

After reviewing hundreds of SEO programmes across 30 industries, the same mistakes appear with remarkable consistency. They are not exotic. They are the result of misplaced priorities, insufficient rigour, or a failure to do the basics before pursuing the advanced.

Targeting keywords without understanding intent. A page optimised for “project management software” that delivers a blog post when the searcher wants a product comparison will not rank for that term. Align content format and depth with what the query intent demands.

Creating content without a distribution plan. Good content that nobody links to and nobody promotes will not rank in competitive categories. Content and promotion need to be planned together, not sequentially.

Ignoring technical issues because they are unglamorous. Crawl errors, duplicate content, and slow page speed are not interesting problems to solve. They are also not optional. A site with unresolved technical issues is working against itself regardless of how good the content is.

Treating rankings as the end goal. Rankings are a means to an end. The end is traffic that converts into business outcomes. A site ranking number one for a term that drives no conversions has achieved nothing commercially useful. Always connect SEO activity to business metrics.

Chasing algorithm updates reactively. Google makes thousands of changes to its ranking systems each year. Most of them are minor. The organisations that perform consistently well over time are those that focus on fundamentals: useful content, clean technical infrastructure, and earned links. Chasing each update is a distraction from the work that actually compounds.

Misreading correlation as causation in SEO data. This is one that I feel strongly about. A ranking improvement that coincides with a technical change does not prove the change caused the improvement. Seasonality, competitor movements, and algorithm updates all affect rankings independently. Before drawing conclusions from SEO data, ask whether the methodology for attributing the change is actually sound. I apply the same scrutiny to SEO claims that I apply to any marketing research. Was the methodology strong? Are the differences meaningful? Or is this just a convenient narrative?

Building an SEO Programme That Compounds Over Time

The organisations that get the most from SEO are not those with the largest budgets or the most sophisticated toolstacks. They are the ones that treat SEO as a long-term investment with compounding returns, rather than a short-term channel to be optimised for immediate results.

Compounding in SEO works like this: a page that ranks well attracts links, which increases its authority, which improves its ranking, which attracts more links. Content that is genuinely useful gets shared, referenced, and updated over time, accumulating equity that a page created last month cannot match. A site with a strong technical foundation, a coherent content architecture, and an earned link profile becomes progressively harder for competitors to displace.

This compounding effect takes time to materialise. The organisations that abandon SEO programmes after six months because they have not seen significant results are, in most cases, walking away just as the foundations are beginning to produce returns. The typical timeline for meaningful organic growth from a properly executed SEO programme is 9 to 18 months from a standing start. That is not a comfortable sell in a quarterly-reporting environment, but it is an honest one.

Building internal capability matters as much as external execution. An organisation that understands SEO at a strategic level, not just as a technical function, will make better decisions about content, site architecture, and channel investment. The skills required for effective SEO extend well beyond technical knowledge. Editorial judgment, commercial thinking, and the ability to communicate clearly across teams are all essential.

For a broader look at how SEO connects to your overall acquisition strategy and where it sits within a complete marketing programme, the Complete SEO Strategy hub is worth working through in full. The basics covered in this article are the entry point. The strategy layer is where the real decisions get made.

About the Author

Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.

Frequently Asked Questions

What are the most important SEO basics for a new website?
For a new website, the priorities are: ensure the site is crawlable and indexable (check robots.txt and XML sitemap), implement HTTPS, create a clean URL structure, write accurate title tags and meta descriptions for each page, and produce content that genuinely addresses the search intent of your target keywords. Technical health and content relevance are the foundations everything else builds on.
How long does it take to see results from SEO?
For a new site or a programme starting from scratch, meaningful organic growth typically takes 9 to 18 months. Established sites with existing authority can see results from targeted optimisation work within 3 to 6 months. The timeline depends on the competitiveness of your target keywords, the quality of your existing content, and the strength of your link profile relative to competitors. Anyone promising significant results within 30 to 60 days is either targeting very low-competition terms or overstating what is achievable.
Is keyword research still necessary if you write high-quality content?
Yes. High-quality content that is not aligned with the terms your audience actually searches for will not rank for those terms, regardless of its quality. Keyword research is how you understand the language your audience uses and the intent behind their searches. It does not replace good writing or genuine expertise. It ensures that good writing and genuine expertise are applied to topics where there is actual search demand.
How many backlinks do you need to rank on the first page of Google?
There is no universal number. The links required to rank on page one depend entirely on what the competing pages have. In low-competition niches, a handful of quality links may be sufficient. In highly competitive categories, the top-ranking pages may have thousands of referring domains built over years. The more useful question is: how does your link profile compare to the pages currently ranking for your target keywords? Tools like Ahrefs and Semrush allow you to assess this directly.
What is the difference between on-page SEO and technical SEO?
On-page SEO refers to the optimisation of individual page content and HTML elements: title tags, meta descriptions, heading structure, keyword usage, and content depth. Technical SEO refers to the infrastructure that enables search engines to crawl, index, and interpret your site: site speed, mobile usability, crawl directives, structured data, and site architecture. Both are necessary. Technical SEO creates the conditions for on-page work to have effect. On-page SEO determines what search engines find when they get there.
