Case studies cut through theory. You can read a hundred blog posts on keyword clustering and still not know whether the approach will move the needle for a real site with real commercial pressure. This post walks through an anonymized engagement with a mid-market B2B SaaS company — what the site looked like at kickoff, the exact clustering workflow we applied to 4,200 keywords, the publishing cadence we committed to, and the traffic and pipeline numbers nine months later.
Names and precise figures are masked to respect the client's NDA, but the methodology, structure, and ratios are intact. If you operate a content-led acquisition site, you should be able to map this playbook onto your own program with minor adjustments.
The Starting Point: A Fragmented Content Library
The client — call them "OpsCue" — sold workflow automation software into operations teams at mid-market companies. At the time we were engaged, the marketing team had been producing content for roughly three years. Their blog had 182 published posts, the site had DR 54, and organic traffic sat at approximately 18,000 sessions per month. Monthly organic signups hovered around 140.
The surface-level metrics looked healthy, but three underlying issues were starving the site of growth.
1. Keyword Cannibalization at Scale
When we ran an audit, 31% of their published URLs were competing with at least one other internal page for the same head term. Several high-volume queries had five or six candidate pages, none of which ranked above position 14. Google was quietly rotating URLs in and out of the index because it couldn't decide which page was canonical.
2. No Topical Containers
Posts were organized chronologically, not topically. A reader landing on an article about "approval workflows" had no obvious path to related content. There were no hub pages, no pillar articles, and no internal links to weave sibling posts together. From Google's perspective, there was no evidence the site covered any topic in depth.
3. Misaligned Intent
Roughly 40% of published posts targeted keywords whose SERPs were dominated by listicles or comparison pages, but the content itself was written as a traditional how-to. The format mismatch was enough to keep most of those URLs pinned below the fold in SERPs regardless of authority signals.
Key insight: A fragmented content library doesn't fail because of any single issue — it fails because cannibalization, weak topical signals, and intent mismatch compound. Fixing one without the others usually produces disappointing results. Clustering addresses all three simultaneously.
Phase 1: Keyword Discovery and Cluster Generation
We started with a raw keyword list of 4,200 terms pulled from three sources: the site's existing Search Console impressions (down to position 80), an Ahrefs keyword export seeded with their 25 highest-performing URLs, and a competitor gap analysis against four direct competitors. After deduplication and a minimum-volume filter of 30 monthly searches, we had 3,614 unique keywords to work with.
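The merge-dedupe-filter step can be sketched in a few lines. This is an illustrative sketch, not the team's actual script; the `prepare_keywords` helper and its input shape are assumptions, and it keeps the highest reported volume when the three sources disagree on a term.

```python
def prepare_keywords(rows, min_volume=30):
    """Deduplicate raw keyword rows and drop low-volume terms.

    rows: iterable of (keyword, monthly_volume) pairs from the merged
    exports (Search Console, Ahrefs, competitor gap analysis).
    Normalizes casing/whitespace and keeps the highest reported volume
    when sources disagree. Returns {keyword: volume}.
    """
    best = {}
    for kw, vol in rows:
        kw = kw.strip().lower()
        if vol >= min_volume:
            best[kw] = max(best.get(kw, 0), vol)
    return best
```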
Clustering was run using SERP overlap as the similarity signal rather than semantic similarity alone. The reasoning was practical: two keywords that share five or more top-10 results almost always belong on the same page in Google's eyes, regardless of how different the strings look. Semantic similarity produces logically clean clusters that don't always match how Google serves results.
We used a threshold of four shared URLs in the top 10 as the minimum for keywords to be grouped. After the pass, the 3,614 keywords collapsed into 284 distinct clusters. The distribution was long-tailed: the largest cluster held 87 keywords while 61 clusters contained a single keyword each.
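The grouping logic described above can be sketched as a graph problem: treat each qualifying overlap (four or more shared top-10 URLs) as an edge between two keywords, then take connected components as clusters. The sketch below assumes you already have each keyword's top-10 result URLs (e.g. from a SERP API); the function names and data shapes are illustrative.

```python
from itertools import combinations

def cluster_by_serp_overlap(serps, min_shared=4):
    """Group keywords whose top-10 SERPs share >= min_shared URLs.

    serps: dict mapping keyword -> set of its top-10 result URLs.
    Returns a list of clusters (sets of keywords), built via
    union-find over the overlap graph.
    """
    parent = {kw: kw for kw in serps}

    def find(kw):
        while parent[kw] != kw:
            parent[kw] = parent[parent[kw]]  # path compression
            kw = parent[kw]
        return kw

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[rb] = ra

    # Compare every keyword pair; union those with enough shared URLs.
    for a, b in combinations(serps, 2):
        if len(serps[a] & serps[b]) >= min_shared:
            union(a, b)

    clusters = {}
    for kw in serps:
        clusters.setdefault(find(kw), set()).add(kw)
    return list(clusters.values())

# Toy example: two keywords share four URLs, a third shares none.
serps = {
    "approval workflow": {"u1", "u2", "u3", "u4", "u5"},
    "approval process software": {"u1", "u2", "u3", "u4", "u9"},
    "kanban basics": {"x1", "x2", "x3", "x4", "x5"},
}
```

Note the pairwise comparison is O(n²) in keyword count; at a few thousand keywords that's fine, but larger pools usually invert the index (URL → keywords) first.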
Cluster Prioritization
We scored each cluster on four dimensions and funneled them into three tiers.
The four scoring dimensions were total search volume across the cluster; a weighted intent score, with commercial intent counted at three times the weight of informational; a difficulty score (average KD of the top 3 SERP positions); and a business-fit score assessed by the client's product marketing team on a 1-5 scale.
Tier 1 clusters — 34 of them — scored high on all four dimensions and became the foundation of the content calendar. Tier 2 clusters — 89 of them — had strong volume or intent but weaker difficulty or fit, and were queued for months 4-9. Tier 3 clusters were deferred or merged into broader parents.
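A scoring function over those four dimensions might look like the sketch below. The weights, normalization caps, and tier cutoffs are illustrative assumptions, not the engagement's actual formula; the only details taken from the text are the four inputs, the 3x commercial-intent weighting, and the 1-5 fit scale.

```python
def score_cluster(volume, intent_weighted, avg_kd, business_fit):
    """Combine the four dimensions into a single 0-1 score.

    volume: total monthly searches across the cluster.
    intent_weighted: 1.0 (all informational) to 3.0 (all commercial).
    avg_kd: average keyword difficulty of the top 3 SERP positions (0-100).
    business_fit: product marketing's 1-5 rating.
    Weights are hypothetical; difficulty is inverted so lower KD scores higher.
    """
    vol_score = min(volume / 10_000, 1.0)           # cap volume influence
    intent_score = min(intent_weighted / 3.0, 1.0)  # 3.0 = fully commercial
    kd_score = 1.0 - min(avg_kd / 100, 1.0)
    fit_score = (business_fit - 1) / 4              # map 1-5 onto 0-1
    return (0.3 * vol_score + 0.3 * intent_score
            + 0.2 * kd_score + 0.2 * fit_score)

def tier(score):
    """Illustrative cutoffs for the three tiers."""
    if score >= 0.7:
        return 1
    if score >= 0.4:
        return 2
    return 3
```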
Phase 2: Site Restructure Before New Content
Before publishing a single new article, we spent six weeks restructuring the existing library. This sequencing matters: if you pour new posts onto a cannibalized foundation, you multiply the problem instead of solving it.
Consolidation of Cannibalized URLs
We identified 47 clusters where multiple existing URLs were competing. For each, we picked the strongest URL as the survivor (based on backlinks, engagement, and current ranking), merged the unique content from the other URLs into it, and 301-redirected the losers to the survivor. This collapsed 112 URLs into 47.
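The survivor-selection and redirect-mapping step can be sketched as below. The weighting order (backlinks, then engagement, then current position) is an assumption for illustration; the case study says the real decision weighed all three, and in practice it also involved editorial judgment. The URLs and metric shapes are hypothetical.

```python
def pick_survivor(pages):
    """Choose the strongest URL in a cannibalized cluster.

    pages: list of dicts with 'url', 'backlinks', 'engagement', and
    'position' (current rank; lower is better). Ordering of the
    tiebreak criteria is illustrative.
    """
    return max(
        pages,
        key=lambda p: (p["backlinks"], p["engagement"], -p["position"]),
    )

def redirect_map(pages):
    """Return {loser_url: survivor_url} for generating 301 rules."""
    survivor = pick_survivor(pages)["url"]
    return {p["url"]: survivor for p in pages if p["url"] != survivor}

# Hypothetical cannibalized cluster with three competing URLs.
cluster_pages = [
    {"url": "/blog/approval-workflows-guide", "backlinks": 12, "engagement": 0.6, "position": 18},
    {"url": "/blog/what-is-an-approval-workflow", "backlinks": 4, "engagement": 0.5, "position": 25},
    {"url": "/blog/approval-workflow-tips", "backlinks": 2, "engagement": 0.4, "position": 40},
]
```

The resulting map feeds directly into whatever redirect mechanism the site uses (server config, CMS rules, or edge redirects); merging the unique content into the survivor still happens by hand before the redirects go live.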
Internal Linking Rebuild
We built a spreadsheet mapping every surviving URL to its target cluster. Every article within a cluster was required to link to the designated pillar page using the cluster's primary keyword as anchor text. Every pillar page linked down to every cluster member using descriptive anchors. Cross-cluster links were added only where topical overlap genuinely existed — no random "related posts" widgets.
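The linking rules above (every member links up to the pillar, the pillar links down to every member) are easy to audit programmatically. A minimal sketch, assuming you can crawl or export the site's actual internal links as (source, target) pairs; anchor-text checks are omitted here:

```python
def missing_links(cluster, links):
    """List required pillar/member internal links that are absent.

    cluster: {'pillar': pillar_url, 'members': [member_urls]}
    links: set of (source_url, target_url) pairs found on the site.
    Returns the sorted list of required links not yet in place.
    """
    required = set()
    for member in cluster["members"]:
        required.add((member, cluster["pillar"]))   # member links up
        required.add((cluster["pillar"], member))   # pillar links down
    return sorted(required - links)
```

Run monthly, a check like this keeps the linking structure from decaying as new posts ship.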
Intent Realignment
For the 40% of posts with intent mismatch, we either rewrote them to match the dominant SERP format or, if the business case was weak, redirected them to closer relatives. Eleven articles were rewritten from how-to into listicle format during this phase.
Phase 3: The Content Calendar
From month two through month nine, the team published to a cluster-first calendar. Each Tier 1 cluster received one pillar page and between three and seven supporting articles, all planned as a unit before drafting began.
The cadence was three articles per week: one pillar piece every two weeks, with the remaining slots devoted to supporting content. Over the eight-month production window, they published 94 new articles covering 31 of the 34 Tier 1 clusters. The remaining three clusters were deferred because the client hadn't yet built the product features needed to back the claims.
Article Briefs Built From the Cluster, Not the Keyword
Every brief started at the cluster level. Writers were given the full list of keywords assigned to the piece, the SERP landscape, the intended pillar or supporting role, and internal link targets in both directions. This meant no single article ever had to cover the entire cluster — it just had to cover its slice cleanly and pass readers along the right pathways.
Monthly Cluster Reviews
At the end of each month, we reviewed cluster-level performance rather than URL-level performance. The question wasn't "how is this post ranking?" but "is the cluster as a whole converting impressions into clicks at a reasonable rate?" Two clusters were restructured mid-engagement based on these reviews — one because an unexpected competitor published a much better pillar piece, and one because the client's ICP shifted and new keywords needed absorbing.
Results at the Nine-Month Mark
The numbers below are rounded and scaled by a constant factor agreed with the client, but the ratios are accurate to within a few percent.
Organic Traffic
Monthly organic sessions grew from approximately 18,000 at kickoff to 54,500 by month nine — a 3.03x increase. Growth was not linear. The first two months showed a small dip (down about 8%) as consolidation redirects settled and Google reprocessed the site. Month three showed the first net-positive movement, and growth compounded from month five onward as newly published clusters matured.
Rankings
The number of keywords ranking in the top 10 grew from 412 to 1,847 — a 4.48x increase. More importantly, top-3 rankings grew from 71 to 394, which correlates far more tightly with traffic than top-10 rankings do. The survivor URLs from the consolidation phase gained an average of 7.2 positions each.
Conversions
Monthly organic signups grew from 140 to 486 — a 3.47x increase, slightly outpacing traffic growth. The overperformance came from intent realignment: commercial-intent clusters that had been under-served at kickoff now captured buyers earlier in the funnel. Pipeline attributed to organic (by the client's multi-touch model) grew roughly 3.9x.
The pattern to notice: Conversions grew faster than traffic, and top-3 rankings grew faster than top-10 rankings. Clustering doesn't just bring more people in — it brings better-matched people in and ranks the pages that actually convert. Generic "more content" strategies tend to invert this ratio.
What Didn't Work
Honest case studies include the misses. Three things didn't land the way we expected.
First, we over-invested in one cluster around a buzzword that lost relevance mid-engagement. Seven articles targeting it produced negligible traffic. The lesson was to weight business-fit scoring more heavily for clusters tied to trend-dependent terminology.
Second, the pillar page format we initially shipped was too long — 5,800-word monoliths that performed worse than 2,400-word versions we tested in month six. Comprehensiveness at the cluster level doesn't mean length at the URL level.
Third, we underestimated how much ranking velocity would come from consolidation versus new publishing. Roughly 60% of the traffic gain in the first four months came from existing URLs improving after consolidation and re-linking, not from new posts. If we ran the engagement again, we'd spend even longer on structural cleanup before opening the publishing taps.
Can You Replicate This?
The methodology is replicable on any site that meets a few conditions. You need at least 40 or so existing URLs to make consolidation worthwhile — smaller sites can skip phase two and move straight into cluster-first publishing. You need a keyword pool large enough to produce meaningful clusters — typically 500+ terms. And you need publishing capacity of at least one piece per week, because clustering's compounding benefits show up over quarters, not weeks.
What you don't need: a massive budget, an in-house SEO team, or DR 70+ authority. OpsCue's domain authority barely moved during the engagement; the growth came from structural decisions, not link-building.
Where Most Teams Stall
Three things stall most attempts to run this playbook internally. Teams skip the consolidation phase because it feels like going backwards. They cluster keywords with semantic tools that don't reflect SERP reality. And they publish to a calendar that's optimized for volume rather than cluster completeness, leaving most clusters with two or three articles when they needed six.
A good clustering tool removes the second of those three obstacles almost entirely. The other two are process decisions, but they're easier to hold to when the underlying cluster data is trustworthy.