Turing at risk of disintermediation
Turing is trying to become the labor and workflow layer around frontier models, but its biggest buyers are also building the default path enterprises may use to buy AI. Turing grew to an estimated $300M of revenue in 2024 by shifting from remote developer staffing into training data, evaluations, and enterprise AI work for labs including OpenAI, Google, and Anthropic. That makes the same customers who drove Turing's growth the companies most able to bundle models, tools, and delivery partners into a full enterprise stack.
- Turing already sells more than contractor hours. It uses its network of 4M-plus engineers and domain experts to generate coding and reasoning data, run evals, and staff AI pods. That broadens revenue per account, but it also pushes Turing into the same post-training and deployment layer where the labs are expanding their own tooling and services.
- The market is moving toward bundled delivery. OpenAI launched a flagship program with Accenture around ChatGPT Enterprise and enterprise deployment, and Google Cloud positioned Gemini Enterprise with partners including Capgemini and McKinsey to handle planning, deployment, and custom agent development. Bundles like these leave less room for an independent middle-layer vendor.
- Comparable vendors are racing in the same direction. Fleet, Mercor, and Handshake all started from human labor or data supply and are moving up into environments, benchmarks, and enterprise agent workflows. In practice, the winner is less likely to be the cheapest labor marketplace and more likely to be the vendor that turns one-off services work into sticky software, eval infrastructure, and repeatable vertical products.
The next phase is a race to become indispensable before the model labs and global consultancies close the stack. Turing's strongest path is to productize its lab and enterprise work into packaged systems, verifiers, and vertical AI workflows that are hard for OpenAI, Google, Anthropic, or a services incumbent to swap out for a partner bundle.