The Weekend Leverage
Nvidia’s revenue polycule, hire annoying people, and an essay on crustaceans
Lace up your New Balances and throw on your jean shorts, it’s Father’s Day! Hopefully this email finds you surrounded by family. I love this day because I quite literally put on the jorts my wife hates, and then, sufficiently squeezed into poorly cut denim, rocking a 7-inch inseam, feel an immense amount of gratitude for my little one and the life she gives me.
But enough of that sappy stuff—on to the important things: tech. This week was a banner week for The Leverage with MRR up 15% and essays that subscribers loved. It was also a week of fascinating moves, with the most expensive acqui-hire of all time and a brand-new model from OpenAI.
We’ll get to that. But first, today’s issue is brought to you by Mercury.
Most fintech content is about how to go to Coachella using credit card points. Mercury’s content is about how to build a generational company, with sessions from senior leaders at companies like OpenAI. That says everything about them as a product. They’re built for founders and focus on the things a CEO needs: from advice on term sheets to cap tables, they have the resources a technologist needs. If you want banking software that's meant for startups, check out Mercury here.
Mercury is a financial technology company, not a bank. Banking services provided through Choice Financial Group, Column N.A., and Evolve Bank & Trust; Members FDIC.
MY RESEARCH
Hire misfits, not missionaries. For the past two decades, startups were supposed to hire missionaries: people who really believed in the mission of the company. Take it from a literal former missionary: these people are no longer up to the task of building a startup. The AI paradigm shift rewards a temperament and mindset diametrically opposed to the typical missionary. Instead, you need misfits.
AI is popular because having a job sucks ass. This title is a little dumb, but you kinda agree, don’t ya? Having a job is tedious for everyone involved. CEOs are soft-firing people by mandating a return to the office, and 70% of middle managers would happily go back to not having direct reports if they could keep the same salary. Staggeringly, for 40% of Americans, the best kind of job is one where they are paid to do nothing at all. In this environment, it’s obvious why AI is popular: we are all simply trying to escape the tedium of our day jobs. In this article I talk about why and how I use AI to get much more done than I used to be able to. This essay has been much more popular than I thought it would be! Check it out.
THE BIG STORIES
The McKinsey problem of OpenAI’s new model. OpenAI released a new model called o3-pro. First, their marketing and naming conventions continue to be terrible. Second, the benchmarks are kinda…unremarkable? You would hope for larger, more obvious gains in capabilities. However, my testing aligns with what other commentators have found: the value is in the context you give the model and the tools it employs. Give it simple prompts and the results are underwhelming. Give it a big dump of documents and ask it to perform more abstract, strategic thinking, and it crushes. The results are convincing, exhaustive, and smart.
However, this brings up a large problem—is it actually correct?
Previous generations of models were measured with academic tests: problem sets with clear right and wrong answers. This new generation of models will need to be judged on its ability to affect real-world outcomes, and that is kinda, sorta impossible? The world has so many variables outside the machine's control and context that you can’t really run a controlled study on real-world results.
I call it the McKinsey problem: if you gave five strategy consulting firms the same data and each one’s work was equally convincing, which firm would you trust? You’d pick McKinsey, because the brand signals it is the most likely to be right. So too with AI. If you give me a model from OpenAI, I’ll trust it more than some random open-source alternative, even if the answers are equally compelling. After all, it’s McKinsey!
$15 billion for a Wang and a dream: In the most expensive acqui-hire of all time, multiple news sources reported that Meta was purchasing a 49% stake in Scale AI for $14.8 billion. The deal appears to have elements of the Google and Character.ai transaction where both companies continue operating, but the CEO of the startup makes the jump to big tech. Scale is losing its founder, Alexandr Wang, who will be leading a new superintelligence group at Meta. The New York Times reported that offers for the team include “seven- to nine-figure compensation packages to dozens of researchers from leading A.I. companies.” (Are they hiring newsletter writers?)
Meta is expected to spend $64 to $72 billion on capital expenditures this year, with over 90% of that going toward AI data-center infrastructure. Another $15 billion to get the guy you think can deliver on superintelligence? It doesn’t feel crazy when framed like that. Perhaps the craziest part: Wang is only 28 years old.
The IPO market may be open again? Maybe? Chime debuted this week up 59% on its first day of trading, for a valuation of around $18.4 billion. There has been a long drought of new public companies, but between Chime and Coreweave (more on that in a sec), there seems to be an appetite for new tech stocks with a story to sell. To be fair, Chime was last valued at $25 billion in 2021, so some companies may have to take a haircut, but more companies should be publicly traded.
VISUAL SIGNAL
AI revenue is looking like an SF hacker house (polyculesque). One of the strongest critiques of the AI sector right now is how fuzzy the boundaries between firms are. Take, for example, data-center provider Coreweave. The company rents out data centers stuffed full of Nvidia GPUs to firms building AI. This week OpenAI announced it would buy computing capacity from Google. Google then swiftly announced it would take some of that capacity from Coreweave. The weird thing is that OpenAI is already a Coreweave customer and investor? Plus, all of them are using Nvidia chips! I put together the chart below showing the total commitments and capacity being sold.
To make it even weirder, this is only GPU capacity. Some of it will go toward training the labs’ own models, and some will go toward inference that the research labs give to startups for free to incentivize them to build on their models. The result: the same piece of compute gets counted as revenue multiple times, inflating the equity value of everyone involved.
In an Invest Like the Best podcast episode, former Benchmark partner Bill Gurley described the dynamic like this:
“There’s some chance…that a lot of the revenue growth we’re seeing is actually just the resale of compute. Many of the players in the market are reselling a wrapper on top of a foundation model, which itself sits on top of a hosting service. And a lot of those wrapper companies, people believe, are running at negative gross margin.
So you might, in buying something from a wrapper company, be getting compute cheaper than you would have got it from the model company, which in turn is getting it cheaper from the host. That same dollar of compute revenue is effectively being counted three or four times up the chain.”
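To make the stacking concrete, here is a toy sketch in Python of how a single dollar of compute can show up on several income statements at once. The layer names and dollar figures are entirely made up for illustration; they are not real margins or real companies’ numbers.

```python
# Toy sketch of the compute-resale stacking Gurley describes.
# All figures are hypothetical and purely illustrative.

chain = [
    ("Hosting provider (rents GPUs)",   1.00),  # books the underlying compute as revenue
    ("Model company (sells API calls)", 1.25),  # resells that compute at a markup
    ("Wrapper startup (sells a SaaS)",  1.15),  # resells the API, possibly below its own cost
]

underlying_compute = 1.00
aggregate_reported = sum(revenue for _, revenue in chain)

for layer, revenue in chain:
    print(f"{layer:<34} books ${revenue:.2f}")

print(f"Underlying compute actually consumed: ${underlying_compute:.2f}")
print(f"Revenue reported across the chain:    ${aggregate_reported:.2f}")
```

Run it and the chain reports $3.40 of revenue against $1.00 of actual compute, which is exactly the three-to-four-times counting Gurley is worried about. Note, too, that the wrapper layer sells for less than it pays the model company, the negative gross margin he describes.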
I haven’t found any data that shows how strong this effect is, but I’m hunting for it. (If you have some insight on the topic, drop me a line or just respond to this email.) If this revenue triple counting is significant, we may be going through the mother of all bubbles.
TASTEMAKER
Which software companies win or die with AI?: Whether a company becomes the “system of action” is an emerging narrative to dig into. A system of action is an evolution beyond the database that holds the crucial information; it is the software where people actually do stuff with that data, using LLMs. This essay from Dave Yuan, founder of the growth-equity firm Tidemark (disclosure: I did some freelance writing with the fund in the past), lays out the argument for why this is the case. I think Dave is one of the smartest thinkers we have in software, so it’s well worth checking out.
Consider the Lobster: Anyone following AI is ruminating on the nature of intelligence, consciousness, and thought. This essay does the same, except it is about the 2003 Maine Lobster Festival. It is impeccable writing from the late David Foster Wallace, with his typically delightful, crackling prose. Warning: it has made me incapable of eating lobster and most shellfish in the five years since I first read it. Highly recommended Sunday read.
Carrie & Lowell (10th anniversary): How do we handle the relationship between an artist’s emotions and their eventual embarrassment about them? Sufjan Stevens is one of the greatest artists of the last 50 years; his tender voice and thoughtful arrangements have followed me since the first day I got an iPod. On Carrie & Lowell, one of his most personal works, he grapples with the death of his mother in specific, painful lyrics. It is incredible. The notoriously hard-to-please Pitchfork gave the 10th anniversary reissue a very, very rare 10 out of 10. (Even if you don’t listen to the album, read the review; it is heartbreakingly beautiful and I cried.) What makes C&L fascinating is that Stevens is now embarrassed by it. He is embarrassed that his grief is beloved by millions. I think about this a lot with my own work. Will I one day be ashamed of what I wrote? Will sharing my thoughts and feelings with all of you come back to haunt me? I don’t know. But I do know that listening to this album helped me find peace in my choices.
Until next week, my friends. Paid subscribers to The Leverage can look forward to two paywalled essays this week containing my best and most in-depth research. You can upgrade below to get them in your inbox.