Live Video as Interactive Infrastructure

Adam Brown, co-founder of Mux, on the future of video infrastructure

Interview
"Where live gets a lot more interesting is when you talk about interactivity."

This is the point where live video stops being a cheaper TV channel and starts becoming application infrastructure. Mux is not just trying to move video faster; it is trying to make live streams responsive enough that chat, audience actions, and stream metadata can shape what happens on screen while the event is still underway. That matters because once video becomes two-way, developers need APIs and monitoring, not just a basic broadcast tool.

  • Mux built live and recorded video on the same underlying stack, with the same workers handling transcoding. That makes low latency and interactive features easier to add across both modes, instead of treating live as a separate product with separate engineering and operations.
  • The practical threshold here is not zero delay; it is getting latency down to a few seconds, so a viewer can send a chat message, vote, or response and have it still feel connected to the broadcaster. Mux said its LL-HLS rollout was delivering roughly 4 to 7 seconds in major regions, which is close enough for Q&A, creator chat, and guided live events.
  • This sits in a broader shift where video infrastructure vendors compete on abstraction and developer control, while higher-layer products like Wistia package finished workflows. Adjacent platforms like Cloudflare also pushed LL-HLS for chat and Q&A, which suggests low latency was becoming table stakes, while differentiation moved to what developers could build on top.
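The latency figures above come from LL-HLS, which cuts delay by letting the player ask the server to hold a playlist request open until the next partial segment exists, instead of polling on a timer. A minimal sketch of how a player builds such a blocking playlist-reload URL, using the standard LL-HLS query parameters `_HLS_msn` and `_HLS_part` from the HLS specification (the playlist URL itself is hypothetical, not a real Mux endpoint):

```python
from urllib.parse import urlencode, urlsplit, urlunsplit

def blocking_reload_url(playlist_url: str, next_msn: int, next_part: int) -> str:
    """Build an LL-HLS blocking playlist-reload URL.

    The server holds this request open until the media playlist contains
    Media Sequence Number `next_msn` and Partial Segment `next_part`, so
    the player learns about new parts with minimal extra delay.
    """
    scheme, netloc, path, query, frag = urlsplit(playlist_url)
    params = dict(p.split("=", 1) for p in query.split("&") if p)
    params["_HLS_msn"] = str(next_msn)   # next Media Sequence Number wanted
    params["_HLS_part"] = str(next_part)  # next Partial Segment within it
    return urlunsplit((scheme, netloc, path, urlencode(params), frag))

# Hypothetical live playlist; the query parameters are the standard ones.
url = blocking_reload_url(
    "https://example.com/live/stream.m3u8", next_msn=273, next_part=2
)
print(url)  # https://example.com/live/stream.m3u8?_HLS_msn=273&_HLS_part=2
```

The point of the blocking reload is that delivery latency stops being bounded by the player's polling interval, which is what pushes glass-to-glass delay down into the few-second range discussed above.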

The next step is live video becoming event-driven software. The winning platforms will bundle delivery, analytics, identity, and metadata so developers can build auctions, shopping, classes, creator tools, and AI-guided experiences where the stream reacts in real time, instead of just playing back smoothly.
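One way to picture "event-driven" here: viewer actions arrive as events, the application reduces them into state, and the broadcaster's on-screen overlay reads that state while the stream is still live. A toy sketch of that loop for the auction and voting cases mentioned above (all names are hypothetical, not a Mux API):

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class LiveSession:
    """Toy event-driven state for an interactive live stream.

    Viewer events (votes, bids) mutate state while the stream runs;
    the broadcast overlay renders a snapshot of it each tick.
    """
    votes: Counter = field(default_factory=Counter)
    high_bid: int = 0

    def handle(self, event: dict) -> None:
        # Dispatch on event type; a real system would also authenticate,
        # validate, and rate-limit these events.
        if event["type"] == "vote":
            self.votes[event["choice"]] += 1
        elif event["type"] == "bid" and event["amount"] > self.high_bid:
            self.high_bid = event["amount"]

    def overlay(self) -> dict:
        # Snapshot of the state the on-screen overlay would render.
        leader = self.votes.most_common(1)
        return {
            "leading_choice": leader[0][0] if leader else None,
            "high_bid": self.high_bid,
        }

session = LiveSession()
for ev in [
    {"type": "vote", "choice": "A"},
    {"type": "vote", "choice": "B"},
    {"type": "vote", "choice": "A"},
    {"type": "bid", "amount": 50},
    {"type": "bid", "amount": 30},  # ignored: below the current high bid
]:
    session.handle(ev)

print(session.overlay())  # {'leading_choice': 'A', 'high_bid': 50}
```

This only becomes meaningful when stream latency is low enough (the 4 to 7 seconds above) that the overlay's state still matches what viewers are reacting to.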