What OpenAI’s Stargate Issues Could Teach Anthropic

Over the past year, the running joke in the data center industry—even inside OpenAI—has been that the $500 billion Stargate initiative was less of a master plan than a “vibe.” My reporting this weekend showed how little Stargate resembles the $500 billion effort touted at a White House event just over a year ago. Stargate now has no staff, nor does it have any obvious role in data center development. It has ended up as a flexible umbrella for OpenAI’s compute partnerships. Longtime readers may remember that in March 2024, we reported that “Stargate” then referred to OpenAI’s plan to build a $100 billion supercomputer with Microsoft. This all reflects a deeper tension between AI labs’ ambitions and the costly realities of building and funding infrastructure. OpenAI’s evolution might offer lessons for its rivals, such as Anthropic, as they chase their own multigigawatt build-outs.

Vision First, Financing Later

When Stargate was announced, its three partners, OpenAI, Oracle and SoftBank, threw out a big number: $500 billion of investment for 10 gigawatts of computing capacity. But the companies didn’t yet know how the build-out was going to be funded. As the plan evolved, OpenAI leaned more heavily on Oracle, whose investment-grade balance sheet allowed its projects to borrow money at better terms than the AI lab could have managed on its own. While the public proclamation about Stargate generated excitement and momentum, the project eventually had to figure out how to finance that vision.

Anthropic, which has privately discussed its ambitions to secure roughly 10 GW of capacity over the next several years, is likely running similar calculations. It is sharing its goals with partners and potential investors to test their willingness to fund those efforts. We expect Anthropic is trying to line up an Oracle-like financial partner to help it secure favorable borrowing terms.
(Unlike OpenAI, Anthropic didn’t declare its vision at a White House event, so it can work out these details behind the scenes.)

Spreading Out Risk

Stargate’s biggest contribution to the AI build-out might be the unusual deal OpenAI has struck with Oracle to share some of the economic risk of their 4.5 GW data center development. That essentially means OpenAI will pay more if projects are delayed or exceed the original budget—and if they turn out to be less expensive, OpenAI benefits from that too. (Given what we know about the cost of building AI, as well as how common delays are becoming, it’s likely OpenAI’s compute is only getting more expensive over time.)

Such a deal structure is not typical: cloud providers’ customers are rarely on the hook for volatility in data center construction prices, and nothing approaching this scale has been tried before. It’s not clear how OpenAI could afford to share the burden of these costs or how much they could potentially rise. The structure is likely a relief for Oracle investors, who have been closely watching how its fast-growing cloud business is affecting its margins—as we have reported here and here.

Anthropic will likely be watching that relationship closely. It faces a similar choice about how it will spread out risk as it inks more compute deals in coming years.

Going It Alone

OpenAI’s experience suggests Anthropic should proceed cautiously when it comes to outright ownership of projects. Given all of the starts and stops in OpenAI’s plan to build its own data centers, Anthropic may want to kick that decision down the road a few years. We’ve reported that Anthropic is looking to increase its long-term direct leases of data center capacity (rather than just renting chips from cloud providers) in coming years to gain more control over its compute footprint.
Direct leasing offers a sort of middle ground: it can give a business more influence over location and capacity without requiring in-house data center development capabilities. Owning infrastructure may eventually make financial sense for AI labs, but in the race to secure compute quickly, it might not be the best first option.

Musk’s Turbine Trouble

Elon Musk’s xAI is facing stiff opposition to its strategy of powering its data centers in Mississippi and Tennessee with dozens of small turbines burning natural gas. A public environmental hearing in Southaven, Miss., last week to consider a state air permit drew hundreds of opponents who urged the Mississippi Department of Environmental Quality to deny the request. The dispute could serve as an early test of whether big tech can sidestep traditional utilities by rapidly assembling off-grid power plants from clusters of so-called temporary turbines. —Ann Davis Vaughan

In other news

Cloverleaf Infrastructure—a two-year-old data center power firm founded by a former Microsoft energy executive—is fielding takeover offers. Axios first reported on the talks.

We reported last week that SemiAnalysis, the chip and infrastructure research firm founded by Dylan Patel, is considering raising money to invest in startups.

DG Matrix, a startup building a new kind of transformer for AI data centers, raised $60 million in a round led by Engine Ventures.

New From Our Reporters

Exclusive: SpaceX’s Starlink Makes Land Grab as Amazon Threat Looms
By Theo Wayt

Exclusive: OpenAI Boosts Revenue Forecasts, Predicts $111 Billion More Cash Burn Through 2030
By Sri Muppidi and Stephanie Palazzolo

Exclusive: OpenAI Plans to Price Smart Speaker at $200 to $300, as AI Device Team Takes Shape
By Stephanie Palazzolo and Qianer Liu