The Leverage

The Case for Spending Seven Trillion on AI

Yes, it sounds insane. Here’s why OpenAI might be right.

Evan Armstrong
Sep 30, 2025

“If we end up misspending a couple of hundred billion dollars, I think that will be very unfortunate obviously, but I actually think the risk is higher on the other side.” [Emphasis added]

This quote from a recent Mark Zuckerberg interview made my fingers tingle and my 401K shiver with fear. It is so wildly aggressive, so bubble-esque, that it would feel more at home in a parody of Silicon Valley than in a press appearance. Allow me to reiterate: he wants to spend a couple HUNDRED BILLION dollars. This is a ludicrous sum to throw down over just four years. Meta's trailing-twelve-month revenue was $178 billion, far short of what they are hoping to spend.

Even crazier, Meta isn’t the only firm planning this level of cash incineration. OpenAI’s internal plans call for building 250 gigawatts’ worth of data centers by 2033. By my own napkin math, and the analysis of other firms, it costs roughly $30 billion per gigawatt of data center capacity: $10 billion for construction and $20 billion for chips. But I may be too generous. One source at OpenAI recently told Bloomberg that it now costs $50 billion per gigawatt.

This means you are looking at a total spend well north of seven trillion dollars just to build the facilities to run AI. And if you assume that OpenAI accomplishes its goal of building $7 trillion worth of data centers, and that it wants to maintain a 50% gross margin on top of that data center spend, that implies $14 trillion in revenue. In the last twelve months, Apple, Tesla, Nvidia, Amazon, Google, Meta, and Microsoft did $2.17 trillion in combined revenue. So essentially, Sam Altman is proposing that in the next eight years OpenAI’s revenue will surpass the entirety of big tech.
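Spelled out, the napkin math looks like this. This is a sketch using the article's own estimates: the $30-billion-per-gigawatt figure and the 50% gross margin are assumptions, and the prose rounds the totals down to $7 trillion and $14 trillion.

```python
# Napkin math for the planned OpenAI buildout.
# All inputs are the article's estimates, not confirmed figures.

GW_PLANNED = 250        # gigawatts of data centers by 2033
COST_PER_GW = 30e9      # ~$30B per GW ($10B construction + $20B chips)
GROSS_MARGIN = 0.50     # assumed 50% gross margin on the spend

total_spend = GW_PLANNED * COST_PER_GW
# Revenue needed so that (revenue - spend) / revenue = 50%:
implied_revenue = total_spend / (1 - GROSS_MARGIN)

print(f"Total spend:     ${total_spend / 1e12:.1f} trillion")    # $7.5 trillion
print(f"Implied revenue: ${implied_revenue / 1e12:.1f} trillion")  # $15.0 trillion
```

At $50 billion per gigawatt, the Bloomberg figure, the same 250 GW plan balloons to $12.5 trillion in spend before any margin is layered on.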

Ah! That sure seems like a lot!

Don’t forget that Elon Musk’s xAI, Google, and Amazon all have large data center projects with at least half that level of ambition. Each is attempting a buildout that could easily reach hundreds of billions, if not trillions, in spend. If all these plans come to fruition, it is entirely possible that we see north of 20 trillion dollars of data centers built over the next eight years. AHHH. Sure seems like a lot!

The temptation here, as an internet writer, is to be divisive, call everyone involved a bubble-headed fool, and collect my paycheck as readers reward me for making fun of Mark Zuckerberg, the Lizard King.

Instead, I’d like to explore the opposite, more challenging view. What if our alien overlords are right that we need to spend trillions on data centers? What would need to happen to justify this level of infrastructure investment? It is almost certain there will be a colossal amount of waste throughout this buildout, but I think the upside case is worth taking seriously.

Where we are

At the risk of being overbearing with the math, let’s start with the numbers: The revenue justification for all of this madness is about $20 billion in annualized recurring revenue (ARR) from startups selling AI applications. According to recent analysis from The Information, only 13 startups had scaled past $100 million in ARR, including OpenAI and Anthropic. Now, you can quibble with some of their definitions (and there are highly scaled AI startups I know of that they missed in their tally). But that still feels like a small amount of revenue to justify drawing up plans to spend trillions.

However, growth hasn’t only been limited to new startups. I know of many more traditional software startups that have doubled their ARR and improved growth metrics by incorporating AI over the last few years. For example, Notion now has an over 50% attach rate for their AI products. As such, let’s be ludicrously generous and 3x the amount of total startup ARR to $60 billion. That still feels too small to justify trillions in spend.

But keep in mind that this has all happened in less than three years. The internet took longer to catch on. Smartphones took much longer. There’s been no product like this with such universal demand from consumers and enterprises alike. AI is the fastest revenue-generating technology in history and it doesn’t show signs of stopping. The year-over-year growth rate is so astronomical that you could handwave and make the case to spend mucho dollars just on the basis of software revenues. But let’s go even further!

Revenue potential

Since ChatGPT is kinda like if Google Search and Google Workspace had a baby, it’s useful to use those two as comparisons for a revenue exercise.

Some pricing comparisons:

  • Workspace costs anywhere from about $6 to $25 a month depending on features.

  • The average conversion rate for freemium pricing models is about 5% as of 2025.

  • Google Ads monetizes U.S. users at about $19 a month.

  • ChatGPT can cost $20 to $200 a month, depending on the tier.

Still with me? Ok let’s do even more napkin math:

  • Let’s say ChatGPT reaches three billion users in eight years, making it one of the largest applications in history.

  • If you assume all of those free users monetize at the same rate as Google’s U.S. users (they won’t, but let’s be optimistic), we’ll have $649.8 billion in ad revenue.

  • Let’s set a blended average revenue per paid user at $1,000 a year, which, at the 5% conversion rate above, means roughly $150 billion in paid subscription revenue.
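Putting those bullets together, the arithmetic works out like this. Every input is the optimistic assumption above, not a forecast.

```python
# The bullet-point napkin math, spelled out.
# All inputs are the article's deliberately generous assumptions.

users = 3_000_000_000       # assumed ChatGPT users in eight years
conversion = 0.05           # ~5% freemium conversion rate
ad_arpu_monthly = 19        # Google Ads' ~$19/month U.S. monetization
paid_arpu_yearly = 1_000    # blended $1,000/year per paid user

paid_users = users * conversion                 # 150 million
free_users = users - paid_users                 # 2.85 billion
ad_revenue = free_users * ad_arpu_monthly * 12  # $649.8 billion
sub_revenue = paid_users * paid_arpu_yearly     # $150 billion

total = ad_revenue + sub_revenue                # ~$800 billion
```

Swap in more sober inputs, as the next paragraph does, and the total falls fast: the ad line scales linearly with ARPU, so cutting $19 down toward Meta-like per-person revenue roughly halves it.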

So with very, very, very handwavy and generous math, you can make the case for OpenAI in its current form to generate $800 billion in revenue by 2033. That is, uh, a little short of what you would need to justify this $7 trillion data center buildout. Plus, the assumptions I have made are wildly optimistic! A more realistic case is ChatGPT ad revenue closer to Meta’s current average revenue per person of $13.65, and subscription revenue anchored much closer to the $20-per-month tier, giving you maybe $400-500 billion in revenue.

So that means that OpenAI is betting the entire company on not only ChatGPT becoming one of the most successful applications in history but on building at least 1-2 business lines of equal size. This is a demonstrably nutty level of ambition, but damn it, I can’t help but admire it.

The only thing that would be nuttier is believing that OpenAI could pull it off. Perhaps it is my inner Mr. Peanut, but I am one of those fools. I think they can do it. Here is how it will work:
