Crawl Budget

I can't remember when the term "long tail" first appeared.

Not the bestselling few, but the aggregate of niches accounting for the majority. The claim, as it circulated, was that a large share of Amazon's revenue came not from bestsellers but from products almost nobody knows. When this idea merged with SEO, search changed. Stop fighting over head terms. Gather the countless small ones instead.

SEO back then was shady. Hidden text, keyword stuffing, link farms. There was even cloaking: serving one page to Google and a different one to humans. It was grounds for a penalty, but agencies kept doing it. As long as rankings converted directly into money, the incentive to push into grey areas never went away.

Times changed. Google stopped treating SEO as the enemy. Instead it began encouraging optimization so crawlers could read content correctly. Structured data, meta tags, sitemaps. Techniques once branded as spam were now endorsed as legitimate. Shaping content for search engines became a best practice, not a violation.
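Structured data shows how explicit the endorsement became. Here is a minimal sketch of the now-sanctioned pattern, embedding schema.org JSON-LD so crawlers can parse a page's meaning without executing anything; the product fields below are placeholders, not a real listing:

```typescript
// Sketch: emitting schema.org structured data as JSON-LD.
// Every field value here is a hypothetical placeholder.
const productJsonLd = {
  "@context": "https://schema.org",
  "@type": "Product",
  name: "Example Widget",
  description: "One of the products almost nobody knows.",
  offers: {
    "@type": "Offer",
    price: "19.99",
    priceCurrency: "USD",
  },
};

// Dropped into the document head: invisible to humans, written for crawlers.
const tag = `<script type="application/ld+json">${JSON.stringify(productJsonLd)}</script>`;
console.log(tag);
```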

Then SPAs complicated things. There was an era when Googlebot couldn't properly interpret JavaScript-rendered pages. The DOM assembled on the client side was invisible to crawlers. Pages you built simply didn't appear in results. A technique called Dynamic Rendering emerged: returning fully rendered HTML exclusively to crawlers. But that meant building, purely for the crawler's benefit, a version of the site humans would never see. In substance, it was no different from the cloaking that used to get you penalized.
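The mechanics are easy to sketch. Assuming a bare Node server, with renderStaticHtml and spaShell as hypothetical stand-ins for a real prerenderer (say, headless Chrome output) and the SPA entry page, dynamic rendering reduces to a branch on the user agent:

```typescript
// Sketch of dynamic rendering: one URL, two bodies, chosen by user agent.
// renderStaticHtml and spaShell are hypothetical placeholders.
import { createServer, IncomingMessage, ServerResponse } from "node:http";

const CRAWLER_UA = /Googlebot|bingbot/i; // simplified; real crawler lists are longer

function renderStaticHtml(path: string): string {
  // Stand-in for the fully assembled HTML a headless browser would produce.
  return `<html><body><h1>Prerendered content for ${path}</h1></body></html>`;
}

function spaShell(): string {
  // Stand-in for the SPA entry page: an empty root node plus a JS bundle.
  return `<html><body><div id="app"></div><script src="/bundle.js"></script></body></html>`;
}

createServer((req: IncomingMessage, res: ServerResponse) => {
  const ua = req.headers["user-agent"] ?? "";
  res.setHeader("Content-Type", "text/html");
  // Crawlers get finished HTML; humans get the shell and assemble it themselves.
  res.end(CRAWLER_UA.test(ua) ? renderStaticHtml(req.url ?? "/") : spaShell());
}).listen(3000);
```

The branch is the whole trick: the crawler and the human request the same URL and receive different documents.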

When you think about it, SEO is the act of optimizing for one company's algorithm. When Google changes the rules, everyone follows. Rankings drop, revenue vanishes. It's your site, but the real owner is Google.

We write code for crawlers. We just call it "optimizing customer acquisition."