Enterprise LLMs Chosen by Infrastructure Fit

Towaki Takikawa, CEO and co-founder of Outerport, on the rise of DevOps for LLMs

Interview
While the consumer space is dominated by ChatGPT and Claude, enterprise adoption shows a different distribution.

Enterprise LLM buying already behaves like infrastructure procurement, not a consumer-app popularity contest. Large companies usually pick the model layer that fits their cloud contracts, security rules, and cost targets, which pushes usage toward Gemini, Azure AI, Bedrock, and self-hosted models rather than just the consumer leaders. That is why model distribution inside enterprises is broader, more operational, and more tied to deployment plumbing than to brand preference.

  • In practice, many enterprise teams do not standardize on one model. Ramp routes work by job type, using GPT-4 when output quality matters most, Claude for faster synchronous tasks, and local models when speed and cost matter more. That kind of workflow naturally fragments enterprise share across providers.
  • A large part of enterprise demand comes from regulated and private deployments. Cohere built its business around private-cloud and on-premises installs for customers like Oracle, RBC, Fujitsu, and LG, with about 85% of revenue tied to private deployments rather than public API traffic.
  • The tooling stack is also pulling enterprises toward open models and cloud platforms. Outerport describes a market where companies increasingly want direct control over deployment, telemetry, versioning, and GPU memory management for self-hosted models, especially as LLM systems become multi-model and more production-critical.
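The job-type routing pattern described above can be sketched as a simple dispatch table. This is an illustrative assumption, not Ramp's actual configuration: the model names and the `TASK_ROUTES` mapping are hypothetical placeholders for "quality-first", "fast synchronous", and "cheap local" tiers.

```python
# Hypothetical sketch of routing work by job type across model providers.
# TASK_ROUTES and the model identifiers are illustrative, not any
# company's real configuration.

TASK_ROUTES = {
    "quality_critical": "gpt-4",    # highest output quality, slower/costlier
    "synchronous": "claude",        # faster responses for interactive tasks
    "bulk": "local-llm",            # self-hosted model for high-volume, low-cost work
}

def route_model(task_type: str) -> str:
    """Pick a model for a job type, defaulting to the cheap local tier."""
    return TASK_ROUTES.get(task_type, "local-llm")
```

A router like this is why enterprise usage fragments across providers: each job class lands on whichever backend best matches its quality, latency, and cost constraints.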

This points toward an enterprise market where the winning vendors are the ones that fit into existing IT and cloud workflows, or let companies run specialized models on their own infrastructure. As more workloads move from simple chat to multi-step agents and internal automation, enterprise usage should keep shifting toward multi-model stacks, cloud-bundled offerings, and open deployments.