Deployment Control Drives Enterprise LLM Adoption

Towaki Takikawa, CEO and co-founder of Outerport, on the rise of DevOps for LLMs

Interview
Most enterprise adoption comes from either enterprise cloud providers or open-source/on-premises models.

Enterprise buyers are choosing deployment control over raw model brand. In practice that means either buying from AWS, Azure, or Google, where security, identity, billing, and compliance already fit existing IT workflows, or running open models on infrastructure they control so prompts, weights, and customer data stay inside their own boundary. Custom models matter most when a company needs lower inference cost, tighter control over model behavior, or stronger privacy than a shared API can offer.

  • Security and privacy are a big reason custom and self-hosted models persist. Outerport describes the need for encrypted model transport and points to confidential computing, where only approved hardware can decrypt and run a model, as the long-term path for protecting both model weights and sensitive data during execution. NVIDIA already supports this on Hopper-class GPUs such as the H100.
  • On-premises is not just about where the model runs. It is about who controls the full stack, including storage, network access, identity, audit logs, and update policy. That is why enterprise AI platforms like DataRobot and H2O.ai sell self-managed or air-gapped deployments for regulated customers, while Azure and Bedrock emphasize that prompts and fine-tuning data are not used to train base models without permission.
  • Outerport sits one layer below the model provider. It is not a frontier model company or a fine-tuning studio. It is deployment infrastructure for teams running large open or custom models across CPUs and GPUs, focused on reducing cold starts, swapping models in memory, and making multi-model pipelines practical on owned hardware, cloud GPUs, or edge devices.
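The encrypted-transport idea above can be sketched in a few lines. This is a toy illustration, not Outerport's or NVIDIA's actual mechanism: the `attest()` stub stands in for a hardware attestation report check, and the SHA-256 counter-mode keystream stands in for a real cipher such as AES-GCM running inside the enclave. The point is the control flow: weights ship encrypted, and only hardware that passes attestation ever sees a decryption path.

```python
# Toy sketch of attestation-gated model decryption. All names here
# (attest, encrypt_weights, decrypt_weights) are hypothetical; real
# confidential computing does the key release and decryption in hardware.
import hashlib
import hmac
import secrets


def _keystream(key: bytes, length: int) -> bytes:
    # SHA-256 counter-mode keystream -- illustrative only, NOT a
    # production cipher; a real deployment would use AES-GCM.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]


def encrypt_weights(key: bytes, weights: bytes) -> tuple[bytes, bytes]:
    # XOR with the keystream, then authenticate the ciphertext so
    # tampering in transit is detected.
    ct = bytes(a ^ b for a, b in zip(weights, _keystream(key, len(weights))))
    tag = hmac.new(key, ct, hashlib.sha256).digest()
    return ct, tag


def attest(device_id: str) -> bool:
    # Hypothetical stand-in for verifying a hardware attestation report.
    return device_id in {"approved-gpu-0"}


def decrypt_weights(key: bytes, ct: bytes, tag: bytes, device_id: str) -> bytes:
    if not attest(device_id):
        raise PermissionError("hardware not attested; refusing to decrypt")
    if not hmac.compare_digest(tag, hmac.new(key, ct, hashlib.sha256).digest()):
        raise ValueError("ciphertext was tampered with in transit")
    return bytes(a ^ b for a, b in zip(ct, _keystream(key, len(ct))))


key = secrets.token_bytes(32)
weights = b"fake model weights"
ct, tag = encrypt_weights(key, weights)
restored = decrypt_weights(key, ct, tag, "approved-gpu-0")  # round-trips
```

A request from an unapproved device fails before any key material touches the weights, which is the property confidential computing enforces at the hardware level.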

The market is heading toward a split stack. Hyperscalers will keep winning customers that want managed compliance and easy procurement, while open and custom model deployments will grow anywhere latency, cost, or data control matters enough to justify operating infrastructure. That is the ecosystem Outerport is building into, as the control plane for getting those large models loaded, updated, and served reliably across mixed hardware.
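The cold-start and model-swapping problem that such a control plane addresses can be illustrated with a minimal in-memory model pool. This is a sketch of the general technique (an LRU cache over expensive model loads), not Outerport's actual API; the class and hook names are hypothetical.

```python
# Minimal sketch of an in-memory model pool that avoids cold starts on
# repeated requests and hot-swaps models under a memory budget.
# Names (ModelPool, loader) are hypothetical, not a real library API.
from collections import OrderedDict
from typing import Any, Callable


class ModelPool:
    def __init__(self, capacity: int, loader: Callable[[str], Any]):
        self.capacity = capacity          # max models pinned in memory
        self.loader = loader              # expensive cold-start load
        self._cache: OrderedDict[str, Any] = OrderedDict()

    def get(self, name: str) -> Any:
        if name in self._cache:
            self._cache.move_to_end(name)  # warm hit: no reload needed
            return self._cache[name]
        if len(self._cache) >= self.capacity:
            self._cache.popitem(last=False)  # evict least-recently-used model
        model = self.loader(name)            # cold start only on a miss
        self._cache[name] = model
        return model


# Track which loads actually hit the slow path.
loads: list[str] = []
pool = ModelPool(capacity=2, loader=lambda n: loads.append(n) or f"weights:{n}")
pool.get("llama")    # cold load
pool.get("llama")    # warm hit, no reload
pool.get("mistral")  # cold load
pool.get("qwen")     # evicts "llama", cold load
```

Keeping recently used weights resident and evicting by recency is what turns a multi-model pipeline from a sequence of cold starts into mostly warm hits on shared hardware.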