Dario Amodei has agreed to pay Google about $40 billion a year for five years, starting in 2027. The Information broke the number on May 5, citing four people briefed on the deal. Anthropic, the company Amodei runs, posted a $30 billion annualized revenue run rate in April. The average annual compute bill alone is larger than the company's entire current revenue line.
That is not a typo. It is the central fact of the AI cycle in May 2026.
Anthropic has committed roughly $200 billion to Google Cloud and Google's Tensor Processing Units across a five-year window, securing 5 gigawatts of capacity built on next-generation TPUs co-designed with Broadcom. Alphabet booked the agreement into its long-term backlog, which doubled in a single quarter to more than $460 billion. Anthropic now represents over 40 percent of that figure, according to CNBC, which cited Alphabet's own commentary on the Q1 2026 earnings call. The Information reported pricing roughly 40 to 50 percent below comparable Nvidia GPU configurations.
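The concentration claim is easy to sanity-check from the reported figures alone. A minimal sketch, using only the $200 billion contract and the $460 billion backlog cited above:

```python
# Back-of-envelope check: one customer's contract as a share of
# Alphabet's reported long-term backlog. Figures are from the
# reporting above, in $ billions.
anthropic_contract = 200   # five-year Google Cloud / TPU commitment
alphabet_backlog = 460     # long-term backlog after Q1 2026

share = anthropic_contract / alphabet_backlog
print(f"{share:.1%}")      # -> 43.5%, consistent with "over 40 percent"
```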
The Anthropic deal is the largest single private compute contract ever signed. It is also the cleanest example of how AI infrastructure economics now work: a model lab raises tens of billions from a hyperscaler, then commits multiples of that capital right back to the same hyperscaler in long-term spend. Alphabet has invested up to $10 billion in Anthropic, with the option to add another $30 billion against performance targets, per filings. Anthropic now owes Google five times the upper bound of that investment, paid in compute.
Alphabet's market cap closed Friday at $4.8 trillion. Nvidia closed at $5.2 trillion. The spread is the narrowest it has been since Nvidia took the top spot. Prediction markets, per Kalshi, now give Alphabet 29.5 percent odds of finishing 2026 as the world's largest company by market value, up from 23.5 percent on May 7.
The Numbers
Start with Anthropic's revenue trajectory. The company has published or confirmed the following run-rate milestones across the last 24 months, according to VentureBeat and Sacra: $87 million in January 2024, $1 billion in December 2024, $9 billion at the end of 2025, $14 billion in February 2026, $19 billion in March, and $30 billion in April. Amodei described the curve as "80x annualized" in his Q1 remarks, per VentureBeat's reporting.
The current run rate puts annual revenue at $30 billion. The Google contract demands an average of $40 billion a year in compute payments beginning in 2027. To meet that bill without burning through the rest of its balance sheet, Anthropic needs revenue to roughly triple again. That is on top of the $30 billion Series G it closed in February at a $380 billion post-money valuation, and on top of preemptive offers reported by TechCrunch on April 29 for a fresh $50 billion round at a $900 billion valuation.
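The growth claims can be sanity-checked too. A minimal sketch that annualizes the growth multiple between any two of the reported milestones; the exact figure depends on which interval and compounding convention you pick, which is worth keeping in mind when comparing against quoted figures like "80x annualized":

```python
# Annualized growth multiple between reported run-rate milestones.
# Milestones ($ billions) are from the reporting cited above.
milestones = {
    "2024-01": 0.087,  # $87M
    "2024-12": 1.0,
    "2025-12": 9.0,
    "2026-02": 14.0,
    "2026-03": 19.0,
    "2026-04": 30.0,
}

def months_between(a: str, b: str) -> int:
    ya, ma = map(int, a.split("-"))
    yb, mb = map(int, b.split("-"))
    return (yb - ya) * 12 + (mb - ma)

def annualized_multiple(start: str, end: str) -> float:
    """Growth ratio over the interval, compounded to a 12-month basis."""
    ratio = milestones[end] / milestones[start]
    return ratio ** (12 / months_between(start, end))

# $1B (Dec 2024) to $30B (Apr 2026): 30x over 16 months
print(round(annualized_multiple("2024-12", "2026-04"), 1))  # -> 12.8
```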
Now the other side of the trade. Alphabet's Q1 2026 revenue came in at $109.9 billion, up 21.8 percent year on year and 2.67 percent ahead of consensus, per the company's earnings release. Google Cloud's operating margin more than tripled in twelve months, from 9.4 percent in Q1 2025 to 32.9 percent. The 160 percent rally in Alphabet shares over the past twelve months is built on three things: TPUs at scale, Gemini's improving benchmarks, and now a backlog that suddenly resembles Microsoft's and Amazon's combined cloud forwards.
The math on Alphabet's $460 billion backlog is worth pausing on. The Information reported that the combined cloud contracts of Anthropic and OpenAI now account for nearly half of the roughly $2 trillion in long-term revenue backlog held by the four major cloud providers: AWS, Azure, Google Cloud, and Oracle. Two private companies, neither profitable, anchor the visible forward book of US cloud infrastructure.
For context, Nvidia's most recent quarter delivered $68.1 billion in revenue, up 73 percent year on year, with $62.3 billion of that from data center, per the company's fiscal Q4 release. Nvidia next reports on May 20 with $78 billion guided for the April quarter. Jensen Huang's company still sets the cost ceiling for AI compute. Google has just used Anthropic to call the bluff.
Pressure Points
1. The 5x asymmetry on capital
Alphabet's check into Anthropic, even at the upper $40 billion ceiling, is one fifth of what Anthropic has now agreed to pay Alphabet back. The previous record for this kind of imbalance was Microsoft and OpenAI, where Microsoft put in $13 billion against compute spend that has since blown past that figure but is harder to pin down because the contracts are bilateral and partially renegotiated. Here the public numbers are clean: $40 billion in, $200 billion back out.
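The asymmetry reduces to one line of arithmetic. A sketch using the public numbers in this piece:

```python
# Capital in vs. compute out, $ billions, per the figures above.
investment_ceiling = 10 + 30   # initial stake plus performance option
compute_commitment = 200       # five-year Google Cloud / TPU spend

print(compute_commitment / investment_ceiling)  # -> 5.0x back out
print(compute_commitment / 5)                   # -> 40.0 per year, average
```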
That asymmetry only works if Anthropic's revenue keeps compounding. The Series H rumored at $900 billion implies investors will keep funding the gap between revenue and compute spend through 2027 and 2028. If the gap closes naturally because Claude monetization keeps doubling, the structure becomes a virtuous loop. If revenue stalls, Anthropic becomes the largest accounts payable problem in cloud history.
2. Power, not silicon, is the binding constraint
5 gigawatts is the headline. For comparison, the city of Boston peaks at roughly 2.5 gigawatts of electricity demand. The Three Mile Island reactor that Microsoft contracted from Constellation Energy in 2024 produces 835 megawatts at full output. Anthropic's contracted capacity is six Three Mile Islands, dedicated to one company's models.
Alphabet has not disclosed where these 5 gigawatts will physically live. Google's own data center power footprint was 25.9 terawatt-hours in 2023, per its sustainability report, which works out to roughly 3 gigawatts of continuous load. The Anthropic contract alone, when fully ramped, would double Google's largest energy consumer footprint. The siting question matters because Maine just banned data centers over 50 megawatts, Virginia's Loudoun County paused new interconnects, and the Southern Company queue in Georgia is closed through 2028. The TPU advantage on price means nothing if there is nowhere to plug them in.
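The conversion behind the roughly 3 gigawatts of continuous load, and the Three Mile Island comparison, are both one-liners. A sketch, assuming the 25.9 terawatt-hours are spread evenly across the year:

```python
# Annual energy (TWh) to average continuous load (GW), plus the
# reactor comparison from above. Assumes a flat load profile.
HOURS_PER_YEAR = 8760

google_twh_2023 = 25.9
avg_load_gw = google_twh_2023 * 1000 / HOURS_PER_YEAR  # TWh -> GWh, / hours
print(round(avg_load_gw, 2))   # -> 2.96, i.e. roughly 3 GW continuous

contracted_gw = 5.0
tmi_mw = 835                   # Three Mile Island at full output
print(round(contracted_gw * 1000 / tmi_mw, 1))  # -> 6.0 reactors' worth
```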
3. The TPU bet against Nvidia is now public
Anthropic still uses Nvidia GPUs, and AWS Trainium as well. But this contract is dedicated TPU capacity, and the reported cost differential is striking. If Google can deliver Claude training and inference at 40 to 50 percent below Nvidia-equivalent cost, every other model lab eventually has to ask whether closed-loop hyperscaler chips are the better long-run trade.
That is the structural threat to Nvidia. Not that TPUs displace H100s and B200s in the next twelve months, but that the price floor under Nvidia margins moves. Nvidia's gross margin sat at 73 percent last quarter. Google Cloud's TPU instances are priced to win business. Somewhere between those two numbers is the new equilibrium, and equity markets are already pricing it. Alphabet's forward P/E of 28 versus Nvidia's 24 tells you who Wall Street thinks owns the next decade of compute margin.
What Happens Next
Most likely scenario, six to nine months. Alphabet briefly overtakes Nvidia by market cap, probably in Q3 2026, on the back of Nvidia's first soft data center quarter combined with Google Cloud accelerating. Anthropic raises the Series H at somewhere between $700 billion and $900 billion. Revenue passes a $50 billion run rate by year end. The IPO Bloomberg has reported for October slips to early 2027 because the Series H closes faster and at a higher valuation than public markets are ready to price. The Google contract starts ramping in 2027 as scheduled, with year-one compute spend around $25 billion as the ramp builds.
Bull case. Anthropic hits $80 billion revenue run rate by Q1 2027 on agent workloads scaling. Claude becomes the default model for enterprise coding, displacing both Copilot and OpenAI in that vertical. The $40 billion average annual TPU bill stops looking aggressive and starts looking conservative. Alphabet's backlog crosses $700 billion. Google Cloud margin pushes above 40 percent. Nvidia keeps growing but at half the speed, and Alphabet becomes a $6 trillion company.
Bear case. Open-source frontier models from DeepSeek and Moonshot keep collapsing the price Anthropic can charge per token. Enterprise customers consolidate on whichever lab offers the cheapest Sonnet equivalent. Anthropic's revenue growth slows to 3x year on year instead of 5x. The Series H gets done at $600 billion, not $900 billion. Two years in, Anthropic asks Google to renegotiate the back end of the TPU commitment. Alphabet's backlog mark goes from a clean number to a footnote about contracted minimum drawdown. The shares give back 20 percent and the gap with Nvidia widens again.
Wild card. Anthropic accepts an acquisition offer. The names that could afford it are Alphabet, Apple, Microsoft, and Saudi Arabia's PIF. Apple has the cash and the strategic motivation. Tim Cook spent the May earnings call hinting at "a range of options" on AI M&A. A $1 trillion Anthropic bid from Cupertino would solve Apple's AI problem and would also reset the TPU contract overnight. The probability is small, but every hyperscaler has already war-gamed it.
What To Watch
Alphabet's Q2 2026 earnings call, late July. The two numbers to circle are the cloud backlog and the Google Cloud operating margin. If backlog crosses $500 billion and margin holds above 30 percent, the Anthropic deal is already showing up cleanly in the numbers. If margin compresses because TPU ramp costs front-load, the bull thesis softens.
Nvidia's May 20 earnings. Specifically, the data center revenue beat versus the $66 billion implied for the segment, and Jensen Huang's commentary on hyperscaler pricing pressure. Any acknowledgment that customers are routing inference through TPUs at scale is the data point that moves the stock.
Anthropic's Series H closing terms. Reported size, reported valuation, and identity of lead investor. If Saudi PIF or Mubadala lead at $900 billion, it is a different signal than Lightspeed or Sequoia anchoring at $700 billion. Sovereign money signals geopolitical positioning. Venture money signals normal market clearing.
Power interconnect filings in any state where Google has bought land in the last twelve months. Texas, Iowa, Oklahoma, South Carolina, and Tennessee are the most likely 5-gigawatt sites. ERCOT queue postings and PJM data are public. Track them.
OpenAI's response. Sam Altman has Microsoft, Oracle, CoreWeave, SoftBank, and now reportedly AWS lined up for compute. If OpenAI signs a similar megacontract with Oracle or Amazon in the next sixty days, it is direct competitive matching. If it does not, OpenAI starts to look compute-constrained relative to Anthropic for the first time in the cycle.
My Opinion
This deal is the inflection point where the AI capex cycle stops being a story about Nvidia and becomes a story about who controls the stack. Google has spent ten years building TPUs as an internal cost reduction tool. With Anthropic, it has turned them into an externally salable cloud product priced 40 to 50 percent below the Nvidia alternative. That is a real moat, and it is the kind of moat that the market eventually pays a $5 trillion multiple for. The 160 percent rally in Alphabet shares is not euphoria. It is the market repricing the company on its full AI stack, and the repricing is not done.
The risk is not Anthropic's revenue. The risk is power and physical buildout. 5 gigawatts is a sovereign-scale infrastructure project, and Google has agreed to deliver it on a timeline that makes the Saudi Vision 2030 capex schedule look casual. If Alphabet hits any meaningful delay in 2027 or 2028, the deal converts from accelerator to drag. The TPU advantage on price requires the TPUs to be installed, powered, cooled, and earning revenue. Each of those steps faces real frictions the market is currently choosing to ignore.
The bigger structural point is this. Anthropic and OpenAI together now anchor close to a trillion dollars of forward cloud revenue across four hyperscalers. The cloud business model used to be diversified across millions of customers paying for general workloads. It now looks more like the early 2010s telecom model, where two or three carriers underwrote the bulk of equipment-vendor backlogs. That worked until the carriers slowed capex. AI's version of that risk is one model lab missing a generation, or one open-source release collapsing pricing, or one regulator pulling on a single contract. The contracts are too big to cancel, and that is exactly what makes them dangerous.