Lambda Labs coopetition with hyperscalers
Hyperscalers are becoming overflow buyers of neocloud capacity, not just dominant rivals. Lambda sells what Microsoft and the other big clouds sometimes cannot produce fast enough on their own: large blocks of Nvidia-based training capacity with the right networking and deployment speed. That lets Lambda compete for AI infrastructure dollars while also plugging directly into hyperscaler demand whenever Azure is supply constrained.
-
The pattern already showed up with CoreWeave. Microsoft became CoreWeave's biggest customer through a $10B compute deal even as Azure competed in the same market. Lambda's November 3, 2025 multibillion-dollar Microsoft agreement has the same frenemy structure: a specialist GPU cloud selling capacity into a larger cloud that needs more AI supply immediately.
-
In practice, Lambda and the hyperscalers often split workloads. One large Lambda customer uses Lambda for reserved training clusters with high-quality InfiniBand networking while running inference and production services on AWS. That is the coopetition in miniature: Lambda wins where custom training clusters and price matter; hyperscalers win where mature storage, uptime, and broader cloud tooling matter.
-
CoreWeave and Lambda occupy different slices of this market. CoreWeave scaled earlier into hyperscaler and large-enterprise contracts, reaching $1.9B in 2024 revenue and more than 250,000 GPUs across 33 facilities by mid-2025. Lambda was smaller, at $505M annualized revenue as of May 2025, but positioned around flexibility, smaller slices of capacity, and developer-friendly cluster workflows.
Going forward, the winners in GPU cloud will increasingly be the companies that can act as both independent cloud and wholesale capacity partner. As AI demand outruns power, data center, and chip deployment cycles, Lambda has room to grow as part supplier to the hyperscalers and part destination for developers and research teams that want Nvidia clusters without buying from Azure, AWS, or Google directly.