The Problem With Buying Someone Else's Lead Gen Stack


Taylor Haren runs a cold email agency. Not a small one. At peak, his agency Fixer AI was processing 9 million leads a month. He was also, at one point, Clay's largest user - hitting their platform 17.3 million times per week.

Then he hit the ceiling. Clay caps you at 50,000 rows per table and 12.5 million rows across your entire workspace. When you're processing millions of leads, you're clicking "run all" thousands of times and waiting days for deleted tables to actually clear. That's not a workflow problem. That's a platform problem.

So Taylor and his right-hand guy James did something most agency owners never attempt: they built their own system from scratch. Neither of them can write a single line of code. James had never touched any AI coding tools before. Three weeks after picking up Claude Code, he had built the core architecture.

Now they process 272,000 leads per second. One million leads in 5 seconds. Their old setup on Clay took 27 hours for that same million - and that's if it didn't error out, which it often did.

The Stack: What They Built and What It Costs

The entire system runs on four tools.

Total monthly infrastructure cost: around $2,000. Before this, Taylor spent $3,000 in a single month just on Cursor (a competing AI coding tool) and was burning $450 per day when he and James were both using it simultaneously.

For an agency processing millions of leads, $2,000 per month in infrastructure is essentially rounding error.

The Specific Tools They Built

The infrastructure is just the foundation. On top of it, Taylor and James built several custom tools that would normally require a full engineering team. Here's what they actually built:

Google Maps Scraper

The trick with Google Maps lead scraping is to query by zip code, not by city or state. At the city level, you hit result caps and miss leads. There are over 32,000 zip codes in the US, so running zip-by-zip gives you a clean, complete result set for every single query.
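The zip-by-zip strategy can be sketched in a few lines. This is a minimal illustration, not Taylor's actual code: `fetch_businesses` is a hypothetical stand-in for whatever Maps scraping client you use, stubbed here with fixed data. The important detail is deduplication, since adjacent zip codes often return overlapping businesses:

```python
# Illustrative sketch: query a places source zip-by-zip and dedupe results.
# `fetch_businesses` is a hypothetical stand-in for a real scraping client,
# stubbed with fixed data so the dedup logic can be shown end to end.

def fetch_businesses(query: str, zip_code: str) -> list[dict]:
    # Stub: a real implementation would call a Maps scraping API here.
    sample = {
        "10001": [{"name": "Acme Dental", "place_id": "a1"}],
        "10002": [{"name": "Acme Dental", "place_id": "a1"},
                  {"name": "Bright Smiles", "place_id": "b2"}],
    }
    return sample.get(zip_code, [])

def scrape_by_zip(query: str, zip_codes: list[str]) -> list[dict]:
    """Run one query per zip code and dedupe on place_id, since
    neighboring zips frequently return the same businesses."""
    seen: set[str] = set()
    results: list[dict] = []
    for z in zip_codes:
        for biz in fetch_businesses(query, z):
            if biz["place_id"] not in seen:
                seen.add(biz["place_id"])
                results.append(biz)
    return results

leads = scrape_by_zip("dentist", ["10001", "10002"])
print(len(leads))  # 2 unique businesses across overlapping zips
```

At real scale you would iterate over all 32,000+ US zip codes, which is why per-query cost matters so much.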

James built this in Cursor in three to four hours. It pulls local businesses, then runs an AI enrichment layer on top to find contacts at each company. That enrichment layer scrapes any public database it can find. A second AI layer then segments the results further - things like whether the business is a multi-location practice, how long they've been operating, confidence scores on each data point.

The cost: one-fifth of a penny per company. That's $0.002 to find three qualified leads at a business. At scale, that's almost nothing.

AI Lead Finder

After the Google Maps scrape worked so well, they plugged the same AI enrichment layer into their main lead pipeline. Their primary data vendor is now Arc (which has largely replaced Apollo for them). When Arc doesn't have a contact match for a company, the AI goes searching.

It searches the entire public internet - LinkedIn profiles, company sites, press mentions - and always tries to return three contacts per company with a confidence score and reasoning for why each person is a good fit. If it finds someone but can't locate a valid email, it looks for alternate emails: personal accounts, emails tied to other roles that person holds, consulting arrangements, partnerships.

The result: where a standard data source like Apollo or LinkedIn might return valid, deliverable emails for 30% of a list, their system gets above 95%. That's not a small lift. That's the difference between a campaign that moves and one that dies in a spreadsheet.

Ad Library Scrapers

They built scrapers for both the Google Ads library and the LinkedIn Ads library. The logic is simple: if a company is actively running ads, they have a budget allocated for client acquisition. They're in growth mode. They're a warmer lead than someone who isn't spending anything on marketing.

The scraper lets them filter by recency (ads running in the last 30 or 60 days) and by volume (how many active ads the company has). The result is a pre-qualified list of companies that are already in buying mode - before Taylor's team has sent a single email.
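The recency-plus-volume filter is simple to express. A minimal sketch, assuming each scraped company carries a list of ads with a `last_seen` date (field names are assumptions; a real version would use `date.today()` rather than a fixed date):

```python
from datetime import date, timedelta

# Sketch of the pre-qualification filter: keep companies whose ads ran
# recently AND who run enough of them. Field names are assumptions.
# `today` is a parameter only to keep the example deterministic.

def is_warm(company: dict, days: int = 30, min_ads: int = 3,
            today: date = date(2025, 6, 1)) -> bool:
    cutoff = today - timedelta(days=days)
    recent = [a for a in company["ads"] if a["last_seen"] >= cutoff]
    return len(recent) >= min_ads

companies = [
    {"name": "GrowthCo", "ads": [{"last_seen": date(2025, 5, 28)}] * 5},
    {"name": "SleepyInc", "ads": [{"last_seen": date(2024, 1, 1)}] * 10},
]
warm = [c["name"] for c in companies if is_warm(c)]
print(warm)  # ['GrowthCo']
```

Note that raw ad count isn't enough on its own: SleepyInc has more ads, but none recent, so it's filtered out.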

If you want to experiment with building something similar at a smaller scale, tools like ScraperCity can help you pull lead data from public sources without building custom infrastructure.

Executive Summary and Campaign Analysis System

This one is less about finding leads and more about running a smarter agency. The system pulls data from all active campaigns daily and generates a report covering 15 randomly selected clients, email-to-lead ratios, meetings generated, and campaign performance trends.

But the deeper goal is copy analysis. The system takes every email template, runs it against a schema, and categorizes every element: subject line type, hook style, CTA format, body messaging category, social proof category, and ICP segment. Over time, it builds a dataset that can answer questions like: what copy elements produce the highest reply rates when you're targeting the director of marketing at a paper manufacturer?
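The feedback loop reduces to tagging every template against a fixed schema and aggregating reply rates per element. A minimal sketch with hypothetical category values (in the real system, the tagging step would be done by an LLM against the full schema):

```python
from collections import defaultdict

# Sketch of the copy-analysis loop: templates tagged against a schema
# (hook style, CTA format, ICP segment...), then reply rates aggregated
# per element value. Category names here are invented for illustration.

templates = [
    {"hook": "question", "cta": "soft_ask", "icp": "dir_marketing",
     "sends": 1000, "replies": 42},
    {"hook": "statistic", "cta": "soft_ask", "icp": "dir_marketing",
     "sends": 1000, "replies": 18},
]

def reply_rate_by(templates: list[dict], element: str) -> dict[str, float]:
    """Pool sends and replies across templates sharing the same value
    for `element`, then compute a reply rate per value."""
    sends: dict[str, int] = defaultdict(int)
    replies: dict[str, int] = defaultdict(int)
    for t in templates:
        key = t[element]
        sends[key] += t["sends"]
        replies[key] += t["replies"]
    return {k: replies[k] / sends[k] for k in sends}

print(reply_rate_by(templates, "hook"))
# {'question': 0.042, 'statistic': 0.018}
```

Slice the same data by `cta` or `icp` and you can answer exactly the kind of question posed above: which copy elements win for a given audience.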

That kind of analysis is what separates agencies that scale from agencies that plateau. Most cold email shops are guessing. Taylor's team is building a feedback loop that gets smarter with every send.

Need Leads for Your Business?

Search B2B contacts by title, industry, location, and company size. Export to CSV instantly.

Try ScraperCity Free →

How to Steal This Playbook Without Building Everything Yourself

You don't need to process millions of leads a month to benefit from this approach. The core moves translate directly to a smaller operation.

The Bigger Picture

What Taylor built isn't just a faster lead gen system. It's a competitive moat. When a campaign isn't working, he can pivot, reload a new list, and start sending again in minutes. His competitors are waiting 27 hours for the same operation - and that's if their platform doesn't error out mid-run.

Speed compounds in cold email. The faster you can test a new list, a new angle, or a new ICP, the faster you find what actually works. Agencies that are bottlenecked by their tools are always running a campaign that's weeks behind their best insight.

The other thing worth noting is the cost structure. Clay's pricing was heading in a direction that would have made Taylor's volume uneconomical. By owning the infrastructure, he fixed his cost at roughly $2,000 per month regardless of how many leads he processes. That's the kind of leverage that lets an agency scale revenue without scaling costs.

You probably aren't processing 9 million leads a month. But the principles here - build what you can't buy, stack your data sources, analyze your campaigns with the same rigor you apply to client work - apply at any volume. The tools just got cheap enough that almost anyone can start.
