Zapier as Chat-to-Action Layer


From an interview with Mike Knoop, co-founder of Zapier, on Zapier's LLM-powered future:

"everyone is going to have sort of [a] chat-based augmented interface, and every search box is going to be powered by embeddings."

This shift pushes software toward two new default entry points: ask the product what to do, or type a fuzzy search and let the system infer intent. For Zapier, that matters because its Natural Language Actions (NLA) API already turns plain language into real app actions, and its design work (auth, parameter filling, safety controls, and payload cleanup) is the missing plumbing that makes chat and embedding-based search actually usable inside products.

  • Zapier built NLA as a universal action layer over thousands of app actions. A user or bot sends one plain-language instruction; Zapier maps it to the right API call, signs it with the user's existing auth, runs the action, then trims the response to a short, human-readable payload that another model can safely use.
  • The point is not that every product becomes a full chat app. The stronger pattern is chat as an onboarding and intent-capture layer, while structured UI still handles repetitive work faster. Knoop explicitly frames chat as additive, not a replacement for point-and-click software like Salesforce.
  • Embedding-powered search means search stops depending on exact keywords and starts matching meaning. That is why AI search products like Perplexity and Exa have gained traction: they return direct answers or semantically relevant documents when users ask long, messy questions instead of typing perfect keywords.
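The action-layer loop in the first bullet (instruction in, mapped action, stored auth, trimmed payload out) can be sketched as a toy. Everything here is illustrative, not Zapier's actual API: the action names, the keyword-based intent mapper (a real system would use an LLM), and the fake credential are all assumptions.

```python
# Toy sketch of an NLA-style action layer: one plain-language instruction
# is mapped to a registered action, run with the user's stored auth, and
# the raw API response is trimmed to fields a model can safely reuse.

ACTIONS = {
    # Hypothetical action: a stub standing in for a real Slack API call.
    "send_slack_message": lambda auth, channel, text: {
        "ok": True, "channel": channel, "text": text,
        "raw_api_metadata": {"ts": "1700000000.0001", "bot_id": "B0"},
    },
}

def run_instruction(instruction: str, auth_token: str) -> dict:
    """Map an instruction to an action, execute it, trim the payload."""
    # 1. Intent mapping (naive keyword match; a real layer would use an LLM).
    if "slack" in instruction.lower():
        action = ACTIONS["send_slack_message"]
    else:
        raise ValueError("no matching action")
    # 2. Parameter filling (naive: everything after 'saying' is the message).
    text = instruction.split("saying", 1)[-1].strip()
    # 3. Execute with the user's stored auth, then trim the response to the
    #    short, human-readable fields, dropping raw API metadata.
    raw = action(auth_token, channel="#general", text=text)
    return {k: raw[k] for k in ("ok", "channel", "text")}

result = run_instruction("Post to Slack saying deploy is done", "tok-123")
print(result)  # {'ok': True, 'channel': '#general', 'text': 'deploy is done'}
```

The trimming step matters most in practice: raw API responses are long and noisy, and a downstream model handles a three-field payload far more reliably than a full JSON dump.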

The likely endpoint is software that starts with natural language, then materializes the right interface (form, dashboard, widget, or workflow) once intent is clear. That favors companies like Zapier that already sit between user intent and app execution, because they can power both the chat front door and the action layer underneath it.
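The "materialize the right interface" step reduces to nearest-neighbor matching in embedding space. A minimal sketch, with hand-made toy vectors standing in for a real embedding model and cosine similarity routing a fuzzy query to the closest interface; the interface names and vectors are invented for illustration:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Each candidate interface is indexed by an embedding of its description
# (toy 3-d vectors here; a real system would embed the descriptions).
INTERFACES = {
    "invoice_dashboard": [0.9, 0.1, 0.0],
    "workflow_builder":  [0.1, 0.9, 0.1],
    "contact_form":      [0.0, 0.1, 0.9],
}

def route(query_embedding):
    """Return the interface whose embedding best matches the query."""
    return max(INTERFACES, key=lambda k: cosine(query_embedding, INTERFACES[k]))

# A messy question about unpaid bills lands near the invoice dashboard
# by meaning, even with zero keyword overlap with "invoice_dashboard".
print(route([0.8, 0.2, 0.1]))  # invoice_dashboard
```

Because matching happens in vector space rather than on keywords, the long, messy questions users actually type still resolve to the right entry point.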