Workflow Drives AI Customer Support Moat
Eoghan McCabe & Des Traynor, CEO and CSO of Intercom, on the AI transformation of customer service
The real moat in AI customer support sits in workflow software, not in the model call. A simple bot can answer from public docs, but a platform bot can decide when to reply, what customer data it is allowed to use, when to hand off to a human, how to trigger an action like checking an order, and how to learn from every resolution inside the same inbox, help center, and reporting stack.
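The decision layer described above can be sketched as a small routing function. This is a hypothetical illustration, not Intercom's actual API: the class, field names, and thresholds are all assumptions chosen to show how reply, permission, handoff, and action decisions live in one place.

```python
# Hypothetical sketch of a platform bot's orchestration layer.
# All names and thresholds below are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Conversation:
    question: str
    customer_tier: str        # e.g. "free", "pro", "enterprise"
    confidence: float         # model's confidence in its draft answer
    needs_account_data: bool  # whether answering requires private customer data

def resolve(conv: Conversation) -> str:
    """Decide what the platform does with one inbound message."""
    # Permissions: only consult private customer data when policy allows it.
    if conv.needs_account_data and conv.customer_tier == "free":
        return "handoff_to_human"  # no private-data access for this tier
    # Confidence gate: low-confidence drafts go to a person, not the customer.
    if conv.confidence < 0.75:
        return "handoff_to_human"
    # Action trigger: order questions call a backend action before replying.
    if "order" in conv.question.lower():
        return "run_order_lookup_then_reply"
    return "auto_reply"
```

The point of the sketch is that none of these branches involve the model call itself; they are workflow software, and each one is a place where a platform accumulates configuration and switching costs.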
Intercom frames chatbot history in three stages. Scripted phone-tree bots handled narrow, high-volume cases. Resolution Bot matched questions to prewritten answers but needed heavy setup. LLM bots remove much of that setup but become durable products only when wrapped in routing, permissions, analytics, and handoff logic.
This is why the product boundary expands fast. Intercom describes tone controls, customer-tier-based behavior, private versus public knowledge access, custom user attributes, and agent assist as core product work. That is standard SaaS engineering around an LLM, but it is also the layer where switching costs and margin can accumulate.
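Tier-based behavior and knowledge access amount to policy configuration around the model. The sketch below shows one way that layer might look; the keys, tier names, and fallback rule are assumptions for illustration, not Intercom's schema.

```python
# Illustrative per-tier bot policy: tone, knowledge sources, and agent assist.
# Keys and values are assumptions, not Intercom's actual configuration schema.
BOT_POLICY = {
    "free":       {"tone": "concise", "knowledge": ["public_docs"],
                   "agent_assist": False},
    "enterprise": {"tone": "formal",  "knowledge": ["public_docs", "private_kb"],
                   "agent_assist": True},
}

def allowed_sources(tier: str) -> list[str]:
    """Return the knowledge sources the bot may consult for this tier."""
    # Unknown tiers fall back to the most restrictive policy.
    return BOT_POLICY.get(tier, BOT_POLICY["free"])["knowledge"]
```

A design note: defaulting unknown tiers to the most restrictive policy is the safe failure mode for a permissions layer, since leaking a private knowledge base is worse than giving a terse answer.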
The competitive split is visible in the market. AI-native agent startups like Sierra and Decagon push hard on automated resolution, while Intercom packages AI inside a full customer service system with ticketing, knowledge base, workflows, reporting, and human copilots. That makes the comparison less about bot quality alone and more about system depth.
The category is heading toward integrated systems that own both the bot and the operating layer around it. As model quality converges and costs fall, value will move toward orchestration, proprietary conversation data, action taking, and usage-based pricing tied to resolved outcomes. The winners are likely to look less like thin AI add-ons and more like full customer service operating systems.