From Setup to Autonomous Orchestration


Mike Knoop, co-founder of Zapier, on Zapier's LLM-powered future

Interview
"Everyone is starting with the setup flow because it's safe; there's a human in the loop."

The setup flow is where AI earns permission to graduate from assistant to operator. In Zapier, that first step means the model can turn a plain-English request into a concrete action, but a person still chooses the app account, approves access, and can lock key fields, such as which Slack channel to post to or whether Gmail should draft or send. That makes errors cheap, visible, and reversible while trust is still being built.
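A minimal sketch of that human-in-the-loop setup flow, in Python. All names here (`ProposedAction`, `execute`, the Slack example) are illustrative assumptions, not Zapier's actual API: the model guesses parameters from text, the user pins key fields, and nothing runs until a human approves.

```python
from dataclasses import dataclass, field

@dataclass
class ProposedAction:
    app: str          # e.g. "slack" -- the account the user chose and authorized
    action: str       # e.g. "send_message"
    params: dict      # model-guessed fields, filled in from plain English
    locked: dict = field(default_factory=dict)  # user-pinned fields the model cannot change
    approved: bool = False                      # nothing executes until a human flips this

    def effective_params(self) -> dict:
        # Locked fields always override model guesses.
        return {**self.params, **self.locked}

    def execute(self, runner) -> str:
        # Without approval, the action is only previewed: cheap, visible, reversible.
        if not self.approved:
            return "PREVIEW: " + str(self.effective_params())
        return runner(self.app, self.action, self.effective_params())

# Usage: the model guesses #general, the user locks #alerts, then approves.
proposal = ProposedAction(
    app="slack",
    action="send_message",
    params={"channel": "#general", "text": "Deploy finished"},  # model's guess
    locked={"channel": "#alerts"},                              # human override
)
print(proposal.execute(lambda app, act, p: f"sent to {p['channel']}"))  # still a preview
proposal.approved = True
print(proposal.execute(lambda app, act, p: f"sent to {p['channel']}"))  # now it runs
```

The design point is that the override and the approval gate sit in front of execution, which is why errors during setup stay cheap.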

  • Zapier designed Natural Language Actions so the hard part happens before execution. The model fills in parameters from text, but users explicitly choose which actions ChatGPT can use, preview actions before they run, and override guessed fields. That product shape is why onboarding moved first, before fully autonomous runs.
  • The deeper opportunity is not chat-based setup; it is safer execution. By 2025, Zapier was describing the winning pattern as deterministic workflows wrapped around LLM steps, where software moves data in fixed ways and only uses models at judgment points. That lowers error rates and gives teams approval checkpoints where needed.
  • This also explains the competitive line between Zapier and the model labs. OpenAI owns the general chat entry point, while Zapier owns the messy operational layer: app auth, action controls, field mapping, and long-tail integrations across thousands of apps. The company with the better safety rails is better positioned to let AI actually do work in the background.
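The "deterministic workflows wrapped around LLM steps" pattern above can be sketched in a few lines. This is an illustrative example, not Zapier's code; `classify_urgency` stands in for the single LLM call, and the ticket-routing names are hypothetical.

```python
def classify_urgency(ticket_text: str) -> str:
    """Stand-in for an LLM call -- the one judgment point in the workflow."""
    return "urgent" if "outage" in ticket_text.lower() else "routine"

def run_ticket_workflow(ticket: dict, approve) -> str:
    # Deterministic: data is assembled in a fixed way, no model involved.
    text = ticket["subject"] + " " + ticket["body"]

    # Judgment point: only here is a model consulted.
    urgency = classify_urgency(text)

    # Approval checkpoint: a human signs off before the risky path runs.
    if urgency == "urgent" and not approve(ticket):
        return "held-for-review"

    # Deterministic again: routing is plain software with fixed outcomes.
    return {"urgent": "page-oncall", "routine": "queue-backlog"}[urgency]

# Usage: an outage ticket reaches the on-call page only after human approval.
print(run_ticket_workflow(
    {"subject": "Prod outage", "body": "site down"},
    approve=lambda t: True,
))  # -> page-oncall
```

Because the model touches only the classification step, errors are bounded: a wrong judgment can be caught at the checkpoint, and everything before and after it behaves the same way every run.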

The next phase is automation that runs mostly out of sight, with humans approving only the risky edges. As models improve, setup becomes less of the product and orchestration becomes more of it, meaning Zapier shifts from helping users describe workflows to supervising fleets of semi-autonomous software workers across the SaaS stack.