On Wednesday, xAI and Anthropic announced a surprise partnership that has the Claude-maker buying out “all of the compute capacity at [xAI’s] Colossus 1 data center,” roughly 300MW that allowed Anthropic to immediately raise its usage limits. It’s a huge deal for xAI, likely worth billions of dollars. More importantly, it immediately monetized one of the company’s most impressive accomplishments, turning xAI from a consumer of compute into a provider of it.

It’s tempting to see the arrangement as a shot at OpenAI amid the ongoing lawsuit. But Musk’s explanation on X was that xAI had already moved training to a newer data center, Colossus 2, and xAI simply didn’t need both.

In the short term, there’s an obvious logic at work. xAI’s existing products are mostly focused on Grok, which has seen plummeting usage since the image generation debacles earlier this year. If xAI’s data center buildout is that much more than what Grok needs to operate, partnering with Anthropic adds a lot of green to the balance sheet. This is especially useful as the company, now combined with SpaceX, speeds toward an IPO. More broadly, having Anthropic lined up as a customer makes it easier to believe that SpaceX’s orbital data center play might actually work.

But beyond the short-term benefit, the Anthropic partnership sends an unusual message about where Elon Musk’s priorities really lie. It suggests the company’s real business may be more about building data centers than training AI models.

It’s rare to see a major tech company treat compute resources this way: companies like Google and Meta, which are also training models, keep building more data centers for their own use. It’s an easy point to miss, because so many of these companies are working as enterprise AI vendors, online services, and cloud providers all at once. But when forced to choose between selling available compute to customers and preserving it to build their own tools, they reliably choose door No. 2.

Just last month, Sundar Pichai admitted on a call that Google Cloud revenue was lower than it could have been because the company was “capacity constrained” — and when given the choice of renting out their GPUs or using them to develop AI products, Google chose the AI products.

Facebook has faced a more extreme version of the same constraint, spinning up an entirely new cloud apparatus just to ensure it would have enough GPU power to chase Mark Zuckerberg’s AI ambitions. As he put it when announcing Meta Compute in January, “How we engineer, invest, and partner to build this infrastructure will become a strategic advantage.”

The key word there is “strategic.” Both Zuckerberg and Pichai are looking toward a future where AI is powering the most popular and lucrative systems in the world. Computing power isn’t just a way to satisfy today’s inference demand but a way to build tomorrow’s products — and running short on compute means missing out on that chance.

By focusing on data centers (earthbound and otherwise), xAI is positioning itself more like a neocloud business: buying GPUs from Nvidia and renting them out to model developers like Anthropic. It’s a far more difficult business, squeezed by both chip suppliers and the shifting cycles of demand. The valuations for most active neoclouds reflect that reality: xAI was valued at $230 billion in its January funding round; CoreWeave, which oversees a comparable quantity of computing power, is worth less than a third of that.

Musk’s version of a neocloud is more ambitious, as you might expect. Some of the data centers might be in space — at least by 2035, if things go according to plan. xAI will be making its own chips at the Terafab, which will take away some but not all of Nvidia’s pricing power. But none of it changes the basic economics of the neocloud business.

As recently as the February all-hands, xAI had real ambitions in software. That was the presentation that unveiled the orbital data center project, but it also teased significant ambitions in coding (since bolstered by the Cursor partnership) and interesting ideas like leveraging computer use into full-scale digital twins (in the unfortunately named Macrohard project). These are the kind of long-horizon projects that need committed computing resources to succeed. As long as xAI is selling large quantities of compute to its competitors, it’s hard to believe such ambitions have much of a future.