Jamstack as Edge Application Runtime

Thom Krupa, co-founder of Bejamas, on building dynamic apps on the Jamstack

Interview
You could have the whole stack running at the edge.

This points to Jamstack turning from a static site pattern into a full application runtime. The key shift is that HTML rendering, authentication checks, request routing, personalization, and some data reads can happen in the same edge location near the user, instead of splitting the app between a CDN for pages and a distant origin server for logic. That cuts round trips and makes dynamic apps feel much closer to static ones.

  • The practical win is not just speed. It also removes build bottlenecks. Instead of prebuilding huge numbers of pages and waiting on long deploys, developers can render on request at the edge and still keep global distribution and caching.
  • The stack is only truly at the edge if the data moves too. Vercel notes that edge code still slows down when the database sits far away, which is why edge databases, key-value stores, and globally replicated data layers matter as much as edge functions.
  • This is why Vercel, Netlify, and Cloudflare all converged on similar building blocks. Vercel supports edge runtime functions, Netlify says entire apps can run on Edge Functions, and Cloudflare bundles compute with KV, D1, Durable Objects, and queues on the same network.
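The pattern these platforms converge on can be sketched in one edge handler: the auth check, the data read, and the HTML render all happen in the same location, with no round trip to a distant origin. The sketch below is illustrative, not any platform's real API: `handleAtEdge`, `EdgeRequest`, and the `Map`-backed `EdgeKV` are hypothetical stand-ins for a runtime's request object and a replicated store like Workers KV or an edge database.

```typescript
// Hypothetical edge request handler: auth, a nearby data read, and
// rendering in one place. EdgeKV simulates a replicated edge store
// (e.g. Workers KV) with a plain Map -- an assumption for this sketch.
type EdgeKV = Map<string, string>;

interface EdgeRequest {
  path: string;
  cookies: Record<string, string>;
}

interface EdgeResponse {
  status: number;
  body: string;
}

function handleAtEdge(req: EdgeRequest, kv: EdgeKV): EdgeResponse {
  // 1. Auth check runs at the edge, not at a distant origin server.
  const session = req.cookies["session"];
  if (!session) {
    return { status: 401, body: "<p>Please log in</p>" };
  }

  // 2. Data read from the nearby replicated store.
  const name = kv.get(`user:${session}`) ?? "guest";

  // 3. Personalized HTML is rendered on request, no prebuilt page needed.
  return { status: 200, body: `<h1>Welcome back, ${name}</h1>` };
}

// Example: a session cookie resolves against a user record at the edge.
const kv: EdgeKV = new Map([["user:abc123", "Thom"]]);
const res = handleAtEdge({ path: "/", cookies: { session: "abc123" } }, kv);
console.log(res.status, res.body); // 200 <h1>Welcome back, Thom</h1>
```

Because every step uses data that is already nearby, the whole response can be produced in one edge location; the latency argument in the bullets above is exactly that step 2 stays fast only when the store is replicated alongside the compute.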

The direction is toward frontend platforms becoming lightweight full-stack clouds. The winners will be the platforms that make global compute and nearby data feel like the default path, so developers can build personalized, always-dynamic apps without managing separate frontend hosting, backend servers, and data infrastructure.