Xenu SEO: The Free Broken Link Crawler That Still Works
Xenu’s Link Sleuth is a free Windows-based crawler that scans websites for broken links, redirect chains, and missing pages. It was built in the late 1990s, last updated in 2010, and it still does this specific job better than tools that cost hundreds of dollars a month. If you run technical SEO audits and you haven’t used it, you’re probably paying for something you don’t need.
Key Takeaways
- Xenu’s Link Sleuth is a free crawler built for broken link detection, and its age doesn’t diminish its usefulness for that specific task.
- The tool surfaces broken internal links, redirect chains, and orphaned pages faster than most paid alternatives for small to mid-size sites.
- Xenu works best as a diagnostic tool within a broader SEO workflow, not as a standalone audit platform.
- Its limitations are real: no JavaScript rendering, no cloud access, Windows-only, and no integration with modern reporting stacks.
- The right question isn’t whether Xenu is outdated. It’s whether you’re fixing what it finds, or just generating reports.
In This Article
- What Is Xenu’s Link Sleuth and Why Does It Still Get Used?
- What Does Xenu Actually Find?
- How to Set Up and Run a Xenu Crawl
- Where Xenu Fits in a Modern SEO Audit
- The Limitations You Need to Know Before You Rely on It
- Broken Links and SEO: Why This Still Matters
- How to Prioritise What Xenu Finds
- Xenu vs. Paid Alternatives: An Honest Comparison
- A Note on Tool Dependency in SEO
I’ve run technical audits on sites ranging from a 200-page local business to enterprise e-commerce platforms with millions of URLs. The tools change depending on scale and budget. But Xenu has shown up in my workflow more times than I’d expect for something built before most of my current colleagues finished school. This article covers what it does, where it fits, and how to use it without wasting a morning on a report nobody acts on.
What Is Xenu’s Link Sleuth and Why Does It Still Get Used?
Xenu’s Link Sleuth was created by Tilman Hausherr and released as freeware in 1997. It was designed to crawl a website and flag broken links, a problem that was becoming commercially significant as websites grew beyond a handful of pages. The software is lightweight, installs in seconds, and produces an exportable report of every URL it finds, including status codes, anchor text, and source pages.
The reason it still gets used in 2024 is straightforward: it’s free, it’s fast for small to mid-size sites, and broken link detection hasn’t fundamentally changed as a problem. A 404 is still a 404. A redirect chain is still a redirect chain. The underlying HTTP protocol that Xenu interrogates hasn’t been replaced. So the tool still works, even if the ecosystem around it has moved on considerably.
There’s also something worth acknowledging about the economics of SEO tooling. Not every agency, freelancer, or in-house marketer has budget for Screaming Frog, Ahrefs, Semrush, and Sitebulb simultaneously. When a free tool covers a specific use case adequately, using it isn’t a compromise. It’s just sensible resource allocation. I’ve spent enough time managing agency P&Ls to know that tool subscriptions accumulate quietly and justify themselves poorly when audited.
For a broader view of how technical crawling fits into a complete SEO approach, the SEO strategy hub covers the full picture, from technical foundations to content and authority building.
What Does Xenu Actually Find?
Run a Xenu crawl on a live site and it will return a list of every URL it encounters, sorted by status code. The primary outputs that matter for SEO work are:
- 404 errors: Pages that return a “not found” response. These are the most immediate fix. Internal links pointing to 404s waste crawl equity and create a poor user experience.
- 301 and 302 redirects: Xenu flags these and shows you the destination URL. Redirect chains (301 pointing to another 301) are a common crawl efficiency problem that Xenu surfaces clearly.
- 500 server errors: Less common but more serious. These indicate server-side failures that can prevent pages from being indexed at all.
- Broken external links: Outbound links from your site that point to pages that no longer exist. These matter for user experience and, to a lesser extent, for how Google perceives page quality.
- Orphaned pages: Pages with no inbound internal links. Xenu won’t label them as orphaned explicitly, but cross-referencing its crawl output against your sitemap reveals pages that exist but aren’t linked from anywhere.
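That cross-referencing step is simple set arithmetic: any URL in the sitemap that the crawl never discovered has no inbound internal links. A minimal sketch, using invented example URLs rather than a real crawl:

```python
# Sketch: find pages listed in the sitemap that the crawler never reached.
# A page the crawl never discovered has no inbound internal links,
# which is the working definition of an orphaned page here.

def find_orphans(sitemap_urls, crawled_urls):
    """Return sitemap URLs that never appeared in the crawl output."""
    # Normalise trailing slashes so /about and /about/ match.
    norm = lambda u: u.rstrip("/")
    crawled = {norm(u) for u in crawled_urls}
    return sorted(u for u in sitemap_urls if norm(u) not in crawled)

# Hypothetical data, not from a real site:
sitemap = ["https://example.com/", "https://example.com/about",
           "https://example.com/old-landing-page"]
crawl = ["https://example.com/", "https://example.com/about/"]
print(find_orphans(sitemap, crawl))  # the old landing page is orphaned
```

In practice the two inputs come from parsing the site’s XML sitemap and the URL column of the Xenu export; the comparison logic stays the same.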
What Xenu doesn’t find is equally important to understand. It doesn’t render JavaScript, so any content or links loaded dynamically won’t be crawled. It doesn’t analyse page speed, Core Web Vitals, structured data, or content quality. It doesn’t connect to Google Search Console or any other data source. It is a link crawler, nothing more. Using it as a complete audit tool would be like using a thermometer to diagnose why a car won’t start. Useful for one specific reading, not the whole picture.
How to Set Up and Run a Xenu Crawl
The setup is minimal. Download the installer from the official Xenu site, install it on a Windows machine, and open the application. From the File menu, select “Check URL” and enter the root domain of the site you want to crawl. That’s the core workflow.
Before you start, there are a few configuration decisions worth making:
Set a crawl depth limit. For large sites, an unconstrained crawl can run for hours and produce a report so large it’s difficult to work with. Start with a depth of 3 to 5 levels to get a representative picture without crawling every paginated URL on the site.
Exclude known noisy paths. Most sites have URL patterns that generate hundreds of near-duplicate URLs: filtered product pages, calendar archives, search results pages. Under Options, you can exclude URL patterns using simple string matching. If your site has /wp-content/ or /cdn-cgi/ paths generating large volumes of non-content URLs, exclude them before you start.
Set a crawl delay. By default, Xenu crawls aggressively. On shared hosting or smaller servers, this can cause performance issues during the crawl. Adding a small delay between requests (under Options, “Pause between requests”) reduces the server load. The relationship between web performance and brand perception is well documented, and deliberately hammering your own server to run an audit is counterproductive.
Check external links separately. You can run a crawl that checks external URLs as well as internal ones. This is slower and more resource-intensive, but useful if you’re specifically looking for outbound link rot. Run it as a separate pass rather than combining it with your internal crawl.
When the crawl finishes, export the results as a tab-separated file. Open it in Excel or Google Sheets and filter by status code. The 404 and 5xx rows are your immediate priority. The 301 rows are your redirect audit. Everything else is context.
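If you’d rather script that filtering step than do it in a spreadsheet, the export parses as plain tab-separated text. A sketch, assuming a simplified two-column slice of the export (real exports carry more columns, such as status text, link source, and anchor text, and the exact headers depend on the Xenu version):

```python
import csv, io

# Hypothetical two-column slice of a Xenu tab-separated export.
export = ("Address\tStatusCode\n"
          "https://example.com/\t200\n"
          "https://example.com/gone\t404\n"
          "https://example.com/moved\t301\n"
          "https://example.com/error\t500\n")

rows = list(csv.DictReader(io.StringIO(export), delimiter="\t"))

# 4xx/5xx rows are the immediate fix list; 3xx rows are the redirect audit.
broken    = [r["Address"] for r in rows if r["StatusCode"][0] in "45"]
redirects = [r["Address"] for r in rows if r["StatusCode"].startswith("3")]
print(broken)
print(redirects)
```

The same split works at any scale: everything 4xx/5xx goes to the fix list first, 3xx rows feed the redirect audit, and 200s are context.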
Where Xenu Fits in a Modern SEO Audit
When I was growing the agency from around 20 people to over 100, one of the disciplines I tried to instil in the SEO team was the difference between generating findings and generating fixes. We had clients who wanted audit reports. What they needed was a prioritised list of things that would actually move the needle, with someone accountable for implementing them. Those are different deliverables.
Xenu produces findings. What you do with them determines whether the exercise has any value. A 200-row spreadsheet of broken links handed to a developer with no context, no prioritisation, and no explanation of business impact will sit in a folder. A short document that says “these 12 broken links are on your highest-traffic pages, here’s the correct destination for each, and here’s why fixing them matters” gets actioned. The tool is the same. The output is completely different.
In terms of where Xenu sits within a broader audit workflow, think of it as the first pass on link integrity. It runs quickly, costs nothing, and gives you a clean list of HTTP-level problems. From there, a more capable tool like Screaming Frog handles JavaScript rendering, on-page analysis, and structured data review. Google Search Console provides the index-level view: which pages Google has actually crawled, which are excluded and why, and where manual actions or coverage issues exist. Xenu doesn’t replace either of these. It complements them by being fast and free for the specific task of link checking.
The Moz Whiteboard Friday on filling SEO skill gaps touches on a related point: knowing which tool answers which question is itself a skill. Reaching for the most expensive tool by default isn’t expertise. It’s just habit.
The Limitations You Need to Know Before You Rely on It
Xenu’s limitations aren’t hidden. They’re just worth being explicit about before you build a workflow around it.
No JavaScript rendering. This is the biggest constraint in a modern web context. A significant portion of websites now load content, navigation, and links via JavaScript. Xenu crawls the raw HTML response from the server. If your site’s internal links are rendered by a JavaScript framework, Xenu won’t see them. For heavily JavaScript-dependent sites, Xenu’s crawl will be materially incomplete. You’ll need a tool that uses a headless browser, such as Screaming Frog with JavaScript rendering enabled, to get an accurate picture.
Windows only. Xenu runs as a native Windows application. If your team works on Macs, it requires either a Windows virtual machine or a separate machine. This is a practical friction point that makes it less accessible for some teams.
No authentication support. If parts of your site are behind a login, Xenu can’t crawl them. For SaaS products, membership sites, or any site with gated content, the crawl will only cover publicly accessible pages.
No integration with other tools. Xenu produces a flat export file. There’s no API, no connection to Google Analytics, no link to Search Console data. Every analysis step happens outside the tool, manually. For teams with established reporting workflows, this adds friction.
No active development. The last official update was in 2010. The tool works on modern Windows versions, but there’s no guarantee of continued compatibility, no support channel, and no feature development. If something breaks in a future Windows update, it breaks permanently.
None of these limitations make Xenu useless. They make it a specialist tool with a defined scope. Use it within that scope and it performs well. Use it outside that scope and you’ll get an incomplete picture and draw the wrong conclusions from it.
Broken Links and SEO: Why This Still Matters
There’s a tendency in SEO to chase the newest signal while underinvesting in the basics. I’ve judged the Effie Awards and reviewed enough marketing effectiveness work to know that the campaigns and programmes that perform consistently well are rarely the ones built on novel tactics. They’re built on fundamentals executed well, with enough discipline to maintain them over time.
Broken internal links are a fundamental. When Googlebot follows a link to a 404, it doesn’t pass PageRank to that destination. If that destination was a high-value page, you’ve lost an internal equity signal you were probably relying on. More practically, a user who clicks a broken link has a worse experience and is less likely to convert. The SEO and UX cases for fixing broken links are aligned, which is relatively rare and worth acting on.
Redirect chains deserve equal attention. A page that has been moved twice will have a chain: original URL redirects to interim URL, which redirects to final URL. Each hop in that chain adds latency and dilutes the equity passed through the redirect. Consolidating chains to direct 301s is a low-effort, measurable improvement. Xenu surfaces these chains clearly, showing you the full redirect path for each URL it encounters.
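Collapsing a chain means finding each URL’s final destination and pointing the original 301 straight at it. A minimal sketch, assuming you’ve already built a redirect map (original URL to target URL) from the 3xx rows of a crawl export; the URLs here are hypothetical:

```python
# Sketch: follow each redirect chain to its final destination so the
# original URL can 301 directly there, eliminating intermediate hops.

def resolve(url, redirect_map, max_hops=10):
    """Follow redirects until a non-redirecting URL; return (final, hops)."""
    hops = 0
    # Cap the hop count so a redirect loop can't run forever.
    while url in redirect_map and hops < max_hops:
        url = redirect_map[url]
        hops += 1
    return url, hops

redirects = {
    "https://example.com/old":     "https://example.com/interim",
    "https://example.com/interim": "https://example.com/final",
}
final, hops = resolve("https://example.com/old", redirects)
print(final, hops)  # two hops: a chain worth collapsing to a single 301
```

Any URL that resolves in more than one hop is a chain; the fix is to repoint its redirect (and any internal links to it) at the final URL.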
The historical context of how search engines have handled crawling is a useful reminder that crawl efficiency has always mattered. The mechanics have evolved, but the principle that search engines have finite crawl budgets and allocate them based on site quality signals has been consistent. A site with clean link architecture gets crawled more efficiently than one with hundreds of broken links and redirect chains.
How to Prioritise What Xenu Finds
Not every broken link is equal. A 404 on a page that gets no traffic and has no inbound links is a low priority. A 404 on a page linked from your homepage, with inbound links from three authoritative external domains, is a high priority. Xenu doesn’t know the difference. You need to layer in traffic and link data to prioritise correctly.
The workflow I use is straightforward. Export the Xenu results. Filter for 404 status codes. Take the list of source pages (the pages containing the broken links) and cross-reference them against Google Analytics or Search Console to identify which source pages have meaningful traffic. Fix broken links on high-traffic pages first. Then work through the remainder by traffic volume.
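The cross-referencing itself is a simple join and sort. A sketch with invented figures; in practice the traffic numbers would come from an Analytics or Search Console export keyed by the same source URLs:

```python
# Sketch: rank broken-link findings by the traffic of the source page
# that contains the link, so high-traffic pages get fixed first.
# All URLs and session counts here are hypothetical.

broken_links = [  # (source page, broken target)
    ("https://example.com/blog/post-17", "https://example.com/dead"),
    ("https://example.com/", "https://example.com/gone"),
]
monthly_sessions = {
    "https://example.com/": 12000,
    "https://example.com/blog/post-17": 40,
}

# Pages missing from the traffic export default to zero and sort last.
ranked = sorted(broken_links,
                key=lambda pair: monthly_sessions.get(pair[0], 0),
                reverse=True)
for source, target in ranked:
    print(monthly_sessions.get(source, 0), source, "->", target)
```

The output is the same finding Xenu gave you, reordered into the sequence a developer should actually work through.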
For redirect chains, the prioritisation logic is similar. Cross-reference the redirecting URLs against your link profile data (from Ahrefs or Semrush) to identify which redirecting URLs have inbound external links. Those are the ones where consolidating the chain has the most direct equity impact.
One practical note: when you’re fixing broken internal links, update the link at the source rather than just adding another redirect. If a page has moved permanently and you’re updating internal links to point to the new URL, point directly to the destination. Don’t create a new 301 to the existing 301. Clean link architecture means links point to their final destination wherever possible.
If you’re building or refining a broader SEO workflow and want to see how technical auditing connects to content strategy, link building, and measurement, the complete SEO strategy guide covers each of these areas in depth.
Xenu vs. Paid Alternatives: An Honest Comparison
The honest comparison isn’t “Xenu vs. Screaming Frog.” It’s “what does each tool do, and which one answers the question I’m trying to answer right now.”
Screaming Frog is the professional standard for desktop crawling. It renders JavaScript, analyses on-page elements, integrates with Google Analytics and Search Console, generates visualisations, and handles sites with millions of URLs. The free version is limited to 500 URLs. The paid version is around £200 per year. For anyone doing SEO professionally, it’s worth the cost.
Sitebulb is a strong alternative with better visualisation and a more accessible interface for presenting findings to clients. The Moz Whiteboard Friday on presenting SEO projects makes a point about the gap between what SEOs find technically interesting and what clients actually need to see. Sitebulb closes that gap better than most tools.
For cloud-based crawling, Semrush’s Site Audit and Ahrefs’ Site Audit both run without requiring a local install, store historical crawl data, and integrate with the rest of their respective platforms. If you’re already paying for either platform, the site audit functionality largely replaces Xenu’s use case.
Where Xenu still wins: it’s free, it installs in under a minute, it requires no account, no subscription, and no configuration beyond entering a URL. For a quick diagnostic crawl on a new client site before a pitch, or for a small business site where budget doesn’t support paid tooling, it does the job. The output is less polished and the analysis requires more manual work, but the underlying data is sound for the specific task of link checking.
A Note on Tool Dependency in SEO
I’ve watched the SEO industry develop a complicated relationship with its tools. There’s a version of SEO practice that has become almost entirely tool-dependent: run the audit, export the recommendations, present the red-amber-green dashboard, repeat. The thinking happens inside the software. The practitioner becomes an operator of tools rather than a solver of problems.
This matters because tools have blind spots, and if you’re not thinking independently of your tools, you inherit those blind spots without knowing it. Xenu doesn’t render JavaScript, so a practitioner who relies solely on Xenu will consistently underreport link issues on modern sites. Screaming Frog’s default configuration doesn’t crawl certain URL parameters, so a practitioner who runs it without reviewing the configuration will miss a category of duplicate content issues. Every tool has these gaps.
The most valuable SEO skill isn’t knowing how to use any particular tool. It’s knowing what questions to ask, which tool answers each question most reliably, and what the tool can’t tell you. Xenu is a useful data point in that framework. It’s not a substitute for it.
When I was managing large-scale paid media alongside SEO, one of the disciplines that carried across both was this: understand what your measurement is actually measuring before you trust it. Xenu measures HTTP responses to crawl requests. That’s a specific, limited, useful signal. Treat it as exactly that.
About the Author
Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.
