Agent Identity as a Web Primitive
Michael Grinich, CEO of WorkOS, on AI startups getting enterprise-ready at launch
This points to a new control layer for the web, where sites can tell the difference between a user-approved agent and a suspicious bot. The practical shift is away from guesswork, such as IP lists and fingerprinting, toward cryptographic identity. Browserbase is building the browser side of that stack, letting hosted agents prove who they are so a site can allow them through, reduce CAPTCHAs, or apply special rules instead of blocking all automation.
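The control-layer decision can be sketched as a simple policy function. This is an illustrative sketch, not any real Cloudflare or Browserbase API; all names and rules here are hypothetical.

```python
# Hypothetical policy sketch: how a site might route traffic once it can
# distinguish cryptographically verified agents from anonymous automation.
# Function and parameter names are illustrative, not a vendor API.

def route_request(is_automated: bool, has_verified_agent_identity: bool) -> str:
    """Return 'allow', 'challenge', or 'block' for an incoming request."""
    if not is_automated:
        return "allow"       # ordinary human traffic passes through
    if has_verified_agent_identity:
        return "allow"       # user-approved agent: skip CAPTCHAs, apply agent rules
    return "challenge"       # unknown automation: fall back to CAPTCHA or blocking

print(route_request(is_automated=True, has_verified_agent_identity=True))   # allow
print(route_request(is_automated=True, has_verified_agent_identity=False))  # challenge
```

The point of the sketch is that the branch on verified identity replaces the heuristic guesswork the paragraph above describes.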
-
Browserbase is not just an automation framework. It hosts cloud Chromium sessions, gives developers tools like Stagehand, and is positioning Browserbase Identity as a way for remote browsers to be recognized as legitimate agent traffic by Cloudflare-protected sites.
-
Web Bot Auth matters because the old method was fingerprinting known agents after the fact. Stytch describes moving from heuristically recognizing Browserbase, OpenAI, or Anthropic traffic to using cryptographic signatures that can self-validate and be updated more reliably.
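The verify-the-signature flow can be sketched in a few lines. Note the hedge: real Web Bot Auth builds on asymmetric HTTP Message Signatures, so a site can verify against a published public key; this stdlib-only sketch substitutes HMAC as a stand-in to keep the example self-contained, and every name in it is hypothetical.

```python
# Sketch of signature-based agent identity, contrasted with heuristics.
# HMAC (symmetric) stands in here for the asymmetric signatures the real
# scheme uses; the structure of sign-then-verify is the same.
import hashlib
import hmac

def sign_request(key: bytes, method: str, path: str, agent: str) -> str:
    """Agent side: sign the request components that identify the agent."""
    message = f"{method} {path} agent={agent}".encode()
    return hmac.new(key, message, hashlib.sha256).hexdigest()

def verify_request(key: bytes, method: str, path: str, agent: str, signature: str) -> bool:
    """Site side: recompute and compare in constant time; no fingerprinting needed."""
    expected = sign_request(key, method, path, agent)
    return hmac.compare_digest(expected, signature)

key = b"demo-key"
sig = sign_request(key, "GET", "/checkout", "example-agent")
print(verify_request(key, "GET", "/checkout", "example-agent", sig))  # True
print(verify_request(key, "GET", "/checkout", "spoofed-agent", sig))  # False
```

The contrast with fingerprinting is that verification either succeeds or fails deterministically, and rotating a key updates trust without retraining any heuristic.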
-
The bigger market context is that browser agents are becoming the fallback for software with no API. In that stack, APIs are still best when they exist, but Browserbase and similar infrastructure matter because many real workflows still depend on clicking through legacy web apps at scale.
Over the next few years, agent identity will likely become a standard web primitive, much like login and OAuth did for human users and apps. That gives infrastructure companies a new wedge. The winners will be the ones that combine permissioned identity, reliable hosted browsers, and enterprise controls so agents can act on the web without being treated like anonymous scraping traffic.