Cloud Oligopoly Shapes LLM Race
Anthropic at $316M ARR
This market is consolidating around cloud-backed camps, not just around model quality. The labs need enormous compute, distribution, and access to enterprise procurement, so they are pairing with the hyperscalers that can supply GPUs or custom chips, list their models in existing cloud catalogs, and pull them into large customer accounts. That makes the LLM race look a lot like the old AWS, Azure, and Google Cloud map, just one layer higher in the stack.
- Anthropic’s Amazon tie-up was not just financing. In September 2023, Amazon agreed to invest up to $4B, Anthropic named AWS its primary cloud provider for mission-critical workloads, and Claude expanded on Amazon Bedrock. Capital, training capacity, and distribution all moved together.
- Microsoft followed the same playbook with Mistral. Their February 2024 partnership put Mistral’s models on Azure and gave Mistral access to Azure infrastructure and Microsoft’s enterprise sales channel. In practice, that lets Mistral show up inside accounts that already buy from Azure.
- Anthropic aligned with Google before Amazon deepened its own relationship. Google backed Anthropic in 2023, later committed up to $2B more, and tied that financing to cloud usage. The result is a multi-cloud posture on paper, but one driven by the same strategic logic as the other cloud-oligopoly alliances.
Going forward, the winners are likely to be the labs that become default model suppliers inside major cloud buying motions. That favors companies that can bundle frontier models with cloud infrastructure, fine-tuning, security, and billing, and it pushes the market toward a small number of deeply entrenched model-and-cloud pairings.