Workflows, Not Models, Drive Value
Chris Lu, co-founder of Copy.ai, on generative AI in the enterprise
Going multi-model turns the language model from the product into a commodity input and shifts Copy.ai’s real value to workflow orchestration. That matters because enterprise buyers are not paying for access to one model; they are paying for a system that can pick the cheapest or fastest model for each step, pull in outside data, write the output, and push it into tools like Salesforce. It also makes model upgrades an immediate product improvement instead of a platform rewrite.
-
This is a direct response to margin pressure and model volatility. Earlier, Copy.ai depended heavily on OpenAI and paid per generated word, which capped margins. A multi-model layer lets it route workloads across providers as price and quality change, instead of being stuck with one supplier’s costs and latency.
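The routing logic described here can be sketched as a small policy over a provider catalog: filter out models below a quality floor for the task, then pick by price or latency. This is a minimal illustration, not Copy.ai's implementation; the provider names, prices, latencies, and quality scores below are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Model:
    name: str
    cost_per_1k_tokens: float  # USD, illustrative
    latency_ms: int            # typical response time, illustrative
    quality: float             # internal benchmark score 0-1, illustrative

# Hypothetical multi-provider catalog.
CATALOG = [
    Model("provider-a/fast", 0.0005, 300, 0.72),
    Model("provider-b/balanced", 0.0030, 800, 0.85),
    Model("provider-c/frontier", 0.0150, 2000, 0.95),
]

def route(quality_floor: float, prefer: str = "cost") -> Model:
    """Pick the cheapest (or fastest) model that clears the quality floor."""
    candidates = [m for m in CATALOG if m.quality >= quality_floor]
    if not candidates:
        raise ValueError("no model meets the quality floor")
    key = (lambda m: m.cost_per_1k_tokens) if prefer == "cost" else (lambda m: m.latency_ms)
    return min(candidates, key=key)
```

As providers reprice or new models ship, only the catalog changes; the workflows calling `route()` are untouched, which is the decoupling the paragraph above describes.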
-
It also supports Copy.ai’s move from writing app to go-to-market (GTM) automation system. The product now does research, drafts outreach, and writes back into the CRM. In that setup, the durable product is the chain of steps, tool integrations, and business logic, not the underlying model alone.
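The research → draft → CRM write-back chain can be sketched as a plain function pipeline. Every body below is a stand-in (Copy.ai's actual orchestration is not public), but the shape shows why the chain, not any one model call, is the durable asset: the model step is one swappable function among several.

```python
# Hypothetical three-step GTM workflow: research -> draft -> CRM write-back.

def research(account: dict) -> dict:
    # Stand-in for enriching the account with outside data (news, firmographics).
    return {**account, "notes": f"recent funding round at {account['name']}"}

def draft_outreach(enriched: dict) -> str:
    # Stand-in for a model call; any provider's model could be swapped in here.
    return f"Hi {enriched['contact']}, saw the {enriched['notes']}."

def write_back_to_crm(crm: dict, account: dict, message: str) -> None:
    # Stand-in for a Salesforce (or similar) API update.
    crm[account["name"]] = {"last_outreach": message}

def run_workflow(crm: dict, account: dict) -> str:
    enriched = research(account)
    message = draft_outreach(enriched)
    write_back_to_crm(crm, account, message)
    return message
```

Upgrading the model only touches `draft_outreach`; the integrations and business logic around it are where the switching costs live.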
-
The closest comparison is Jasper and the broader AI writing category. Both started by reselling GPT output into templates for marketers, which made switching costs low. The path out of that trap is deeper embedding into enterprise workflows, where the app owns approvals, data sources, and actions across other software.
-
This points toward a market where AI application companies compete less on whose model they use, and more on who owns the workflow and system of record around the model. As open source and frontier models keep improving, the winners will be the products that can swap models underneath while becoming harder to rip out of daily work.