
Funding
Total funding: $1.70B (2025); latest valuation: undisclosed.
Nscale raised the largest Series B in European history at $1.1 billion in September 2025, led by Aker ASA with participation from NVIDIA, Dell Technologies, Fidelity Management & Research Company, Nokia, G Squared, Sandton Capital Partners, Blue Owl Managed Funds, Point72, and T.Capital.
In early October 2025, Nscale closed a $433 million pre‑Series C SAFE from NVIDIA, Nokia, Dell, Blue Owl, and other new and existing investors—bringing total funding to over $1.7 billion.
The company previously raised a $155 million Series A in December 2024 and a $30 million seed round in December 2023. The Series B represents a significant step-up in valuation, though the specific post-money valuation has not been disclosed.
Product
Nscale positions itself as an AI-native hyperscaler that owns the entire physical and software stack needed to train, fine-tune, and serve modern AI models.
The company builds GPU-optimized data centers in northern Norway that run on 100% renewable hydroelectric power and benefit from natural cooling. These AI factories combine high-density liquid cooling with modular prefabricated construction, enabling the power densities of more than 100 kW per rack required by H100 and MI300 GPU nodes.
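A rough back-of-the-envelope calculation illustrates why that rack density matters; the ~10 kW draw assumed below for an 8-GPU HGX H100 node is an approximate industry figure used for illustration, not an Nscale specification.

```python
# Rough rack-density illustration (assumed figures, not Nscale specs).
NODE_POWER_KW = 10.0    # approx. draw of one 8-GPU HGX H100 node (assumption)
GPUS_PER_NODE = 8
RACK_BUDGET_KW = 100.0  # the >100 kW/rack density cited in the report

nodes_per_rack = int(RACK_BUDGET_KW // NODE_POWER_KW)
gpus_per_rack = nodes_per_rack * GPUS_PER_NODE
print(f"~{nodes_per_rack} nodes (~{gpus_per_rack} GPUs) per 100 kW rack")
# -> ~10 nodes (~80 GPUs) per rack, versus one or two nodes in a
#    conventional 10-15 kW air-cooled rack.
```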
Nscale offers two primary compute products. AI Private Cloud provides reserved clusters where customers can book bare-metal nodes or full clusters with three control plane options: Slurm scheduler for large batch training, NKS Kubernetes Service for containerized microservices, or raw bare-metal for maximum performance.
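Under the Kubernetes control plane option, GPU capacity is consumed through ordinary Kubernetes resource requests. The sketch below is a generic example using the official kubernetes Python client and the common nvidia.com/gpu device-plugin resource; the pod name, image, and namespace are placeholders and nothing here is drawn from NKS documentation.

```python
from kubernetes import client, config

config.load_kube_config()  # assumes a local kubeconfig pointing at the cluster

# Request one full 8-GPU node for a training container (illustrative only).
pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="gpu-training-job"),
    spec=client.V1PodSpec(
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="trainer",
                image="nvcr.io/nvidia/pytorch:24.05-py3",  # placeholder image
                command=["python", "train.py"],
                resources=client.V1ResourceRequirements(
                    limits={"nvidia.com/gpu": "8"}  # standard device-plugin resource
                ),
            )
        ],
    ),
)
client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
```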
The Serverless Inference platform, launched in April 2025, exposes chat, embeddings, vision, and image-generation endpoints that mirror the OpenAI API. Customers pay only for tokens processed, with no idle GPU costs.
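Because the endpoints mirror the OpenAI API, existing OpenAI client code should work by swapping the base URL. The sketch below uses the standard openai Python SDK; the endpoint URL, environment variable, and model identifier are placeholder assumptions rather than documented Nscale values.

```python
import os
from openai import OpenAI

# Hypothetical endpoint and model name for illustration only.
client = OpenAI(
    base_url="https://inference.nscale.example/v1",  # placeholder URL
    api_key=os.environ["NSCALE_API_KEY"],            # placeholder env var
)

resp = client.chat.completions.create(
    model="llama-3.1-8b-instruct",  # assumed model identifier
    messages=[{"role": "user", "content": "Summarize GDPR in one sentence."}],
)
print(resp.choices[0].message.content)
# Billing is per token processed, so an idle client incurs no GPU cost.
```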
The company provides a browser console, comprehensive documentation, and a cross-platform CLI with shell autocompletion and a Homebrew installer for developer-friendly access. All capacity is marketed as carbon-neutral and located within the European Economic Area for GDPR compliance and sovereign AI workloads.
Business Model
Nscale operates a vertically integrated B2B infrastructure business, owning the entire stack from power generation to GPU clusters. The company builds and operates its own data centers rather than leasing capacity, giving it control over costs and sustainability.
Revenue comes from two primary streams: reserved private cloud clusters sold on a subscription basis and serverless inference charged on consumption. Private cloud customers pay for dedicated GPU nodes or full clusters, while serverless customers pay per token processed.
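A toy calculation shows how the two pricing models trade off for a customer; every price and volume below is a hypothetical assumption for illustration, not an Nscale rate.

```python
# Hypothetical comparison of reserved vs. serverless spend (assumed prices).
RESERVED_GPU_HOUR = 2.50          # assumed $/GPU-hour for a reserved node
SERVERLESS_PER_1M_TOKENS = 0.60   # assumed $ per million tokens

hours_per_month = 730
reserved_monthly = 8 * RESERVED_GPU_HOUR * hours_per_month   # one 8-GPU node

tokens_per_month = 2_000_000_000  # 2B tokens of inference traffic (assumed)
serverless_monthly = tokens_per_month / 1_000_000 * SERVERLESS_PER_1M_TOKENS

print(f"Reserved 8-GPU node:   ${reserved_monthly:,.0f}/month")
print(f"Serverless, 2B tokens: ${serverless_monthly:,.0f}/month")
# Bursty or low-volume workloads favor per-token billing; sustained,
# high-utilization training or serving favors reserved clusters.
```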
The vertical integration strategy allows Nscale to capture more value across the stack while offering competitive pricing. By owning renewable power sources and optimizing data center design, the company can achieve lower operating costs than competitors relying on grid power and leased facilities.
Strategic partnerships amplify the business model's reach. The joint venture with Aker and OpenAI for Stargate Norway provides anchor tenancy and credibility, while alliances with Singtel and Open Innovation AI create distribution channels across Southeast Asia and the Middle East.
The model scales through geographic replication of the hydro-powered data center design across similar Nordic and Canadian locations. Modular construction and standardized designs enable faster deployment compared to traditional data center builds.
Competition
Vertically integrated players
CoreWeave leads this category with contracts for up to 600,000 NVIDIA GPUs and a $6.3 billion take-or-pay agreement that provides downside protection. The company is acquiring Core Scientific to secure 1.2 GW of power capacity.
Crusoe differentiates by bringing compute to stranded energy sources like flare gas and renewable sites, claiming power costs 30% below market rates. IREN is pivoting from Bitcoin mining to AI with 8,500 Blackwell GPUs at a 50 MW Canadian facility.
Nebius positions itself as the first European provider of Blackwell Ultra instances, while Northern Data operates as a GPU landlord to hyperscalers with 250 MW of hydro-powered capacity in Sweden at pricing 20-30% below AWS.
Traditional hyperscalers
AWS, Microsoft Azure, and Google Cloud are rapidly adding Blackwell instances and DGX Cloud capabilities to their existing infrastructure. These incumbents benefit from existing customer relationships and integrated service offerings but face constraints on GPU supply allocation.
The hyperscalers compete on convenience and ecosystem integration rather than raw performance or sustainability credentials. Their challenge is securing sufficient GPU inventory while maintaining margin profiles across diverse workloads.
Specialist GPU clouds
Lambda Labs, Vultr, and Paperspace focus on developer tooling and competitive pricing while leasing or co-locating capacity rather than owning infrastructure. These players differentiate through ease of use and specialized workflows for AI development.
Regional sovereign providers like Scaleway, Domyn, and OVHcloud leverage EU policy support and EuroHPC programs to compete on data residency requirements. They target customers with strict regulatory or sovereignty constraints that rule out US-based alternatives.
TAM Expansion
New products
Nscale launched Fine-tuning-as-a-Service in August 2025, enabling customers to run serverless supervised fine-tuning, direct preference optimization, and generalized reward preference optimization jobs. This moves the company up-stack from raw compute to higher-margin ML tooling.
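For context on what a direct preference optimization (DPO) job optimizes, here is a minimal sketch of the standard DPO loss in PyTorch; this is a generic illustration of the technique, not Nscale's fine-tuning API.

```python
import torch
import torch.nn.functional as F

def dpo_loss(policy_chosen_logps, policy_rejected_logps,
             ref_chosen_logps, ref_rejected_logps, beta=0.1):
    """Standard DPO loss, given summed per-sequence log-probabilities of the
    chosen and rejected responses under the policy and a frozen reference
    model (all tensors of shape [batch])."""
    chosen_ratio = policy_chosen_logps - ref_chosen_logps
    rejected_ratio = policy_rejected_logps - ref_rejected_logps
    # Push the policy to prefer chosen over rejected, relative to the reference.
    return -F.logsigmoid(beta * (chosen_ratio - rejected_ratio)).mean()

# Toy usage with random log-probabilities:
fake = (torch.randn(4), torch.randn(4), torch.randn(4), torch.randn(4))
print(dpo_loss(*fake))
```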
The forthcoming AI Marketplace will create a menu of consumption-based services that capture wallet share beyond initial model training spend. By layering software on top of owned hardware, Nscale follows the hyperscaler playbook of expanding from infrastructure into platform services.
Sovereign cloud and on-premises edge offerings acquired through Kontena allow Nscale to sell modular, high-density GPU pods to government and regulated industry customers that cannot use shared public cloud infrastructure.
Customer base expansion
Telecommunications operators represent an emerging buyer segment through the Singtel alliance, which cross-exposes GPU pools and orchestration platforms across Europe and Southeast Asia. This effectively turns telecom operators into resale channels for Nscale capacity.
Enterprise software platforms like Open Innovation AI embed Nscale GPUs behind the scenes with a joint roadmap targeting 30,000 GPUs for large Middle East enterprises over three years. This B2B2C model expands reach without direct sales efforts.
The Microsoft contract demonstrates Nscale's ability to land hyperscale tenant deals where data sovereignty or green energy requirements favor European providers over US megaclouds.
Geographic expansion
The capacity pipeline totals 1.3 GW, with 250 MW committed by Q4 2026 and over 1 GW by 2029, spread across Norway, the UK, continental Europe, and greenfield US sites. The 60 MW Glomfjord expansion, going live in Q2 2025, showcases the ability to replicate the hydro-powered design.
Partnerships provide market entry advantages in Southeast Asia through Singtel and MENA through Open Innovation AI, reducing friction compared to owning every data center location. This asset-light expansion model preserves capital while accessing new geographies.
Risks
GPU supply constraints: Nscale's growth depends on securing sufficient allocations of NVIDIA H100, B200, and future-generation GPUs in a supply-constrained market where larger competitors like CoreWeave have secured massive take-or-pay agreements. Any shortfall in GPU deliveries could delay capacity expansion and revenue recognition from committed customer contracts.
Power infrastructure dependencies: The business model relies heavily on access to cheap, renewable hydroelectric power in specific Nordic locations. Climate variability affecting hydroelectric generation, regulatory changes in power markets, or infrastructure failures could significantly impact operating costs and the company's core sustainability value proposition.
Hyperscaler competition: As AWS, Microsoft, and Google Cloud rapidly expand their own GPU offerings and secure direct relationships with NVIDIA, they could commoditize AI infrastructure and squeeze margins for independent providers. Their existing customer relationships and integrated service ecosystems create significant competitive advantages that pure-play infrastructure companies struggle to match.
DISCLAIMERS
This report is for information purposes only and is not to be used or considered as an offer or the solicitation of an offer to sell or to buy or subscribe for securities or other financial instruments. Nothing in this report constitutes investment, legal, accounting or tax advice or a representation that any investment or strategy is suitable or appropriate to your individual circumstances or otherwise constitutes a personal trade recommendation to you.
This research report has been prepared solely by Sacra and should not be considered a product of any person or entity that makes such report available, if any.
Information and opinions presented in the sections of the report were obtained or derived from sources Sacra believes are reliable, but Sacra makes no representation as to their accuracy or completeness. Past performance should not be taken as an indication or guarantee of future performance, and no representation or warranty, express or implied, is made regarding future performance. Information, opinions and estimates contained in this report reflect a determination at its original date of publication by Sacra and are subject to change without notice.
Sacra accepts no liability for loss arising from the use of the material presented in this report, except that this exclusion of liability does not apply to the extent that liability arises under specific statutes or regulations applicable to Sacra. Sacra may have issued, and may in the future issue, other reports that are inconsistent with, and reach different conclusions from, the information presented in this report. Those reports reflect different assumptions, views and analytical methods of the analysts who prepared them and Sacra is under no obligation to ensure that such other reports are brought to the attention of any recipient of this report.
All rights reserved. All material presented in this report, unless specifically indicated otherwise is under copyright to Sacra. Sacra reserves any and all intellectual property rights in the report. All trademarks, service marks and logos used in this report are trademarks or service marks or registered trademarks or service marks of Sacra. Any modification, copying, displaying, distributing, transmitting, publishing, licensing, creating derivative works from, or selling any report is strictly prohibited. None of the material, nor its content, nor any copy of it, may be altered in any way, transmitted to, copied or distributed to any other party, without the prior express written permission of Sacra. Any unauthorized duplication, redistribution or disclosure of this report will result in prosecution.