Experimentation as Core Development Workflow
Joe Zeng, software engineer at Statsig
Experimentation has become a core product development workflow, not a nice-to-have analytics add-on. At companies like Facebook, Netflix, Uber, and Spotify, teams ship features behind flags, expose them to only a fraction of traffic, and measure whether retention, conversion, or engagement actually improves before rolling out broadly. Statsig is built around bringing that big-tech loop (release, measure, iterate) into a packaged tool for mainstream product teams.
-
The practical job to be done is combining three steps in one place: engineers wrap new code in feature flags, product teams define an experiment, and the system reads event data to show which version won. Statsig explicitly bundles flags, experiments, analytics, and session replay around that workflow.
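To make the flag-expose-measure loop concrete, here is a minimal sketch of how such a system can work under the hood, using deterministic hash-based bucketing. The function names, experiment key, and event store are hypothetical illustrations, not Statsig's actual SDK:

```python
import hashlib

VARIANTS = ("control", "treatment")

def assign_variant(user_id: str, experiment: str) -> str:
    """Deterministically bucket a user: hashing user + experiment name
    gives every user a stable variant without storing assignments."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

def new_checkout_enabled(user_id: str) -> bool:
    # Hypothetical feature flag: only the treatment group sees the new code path.
    return assign_variant(user_id, "new_checkout_experiment") == "treatment"

events: list[dict] = []

def log_event(user_id: str, name: str) -> None:
    # Each event is tagged with the user's variant, so the analysis layer
    # can later compare conversion or retention between the two groups.
    events.append({
        "user": user_id,
        "event": name,
        "variant": assign_variant(user_id, "new_checkout_experiment"),
    })
```

Because assignment is a pure function of user and experiment, any service can compute it locally and the analytics side can reconstruct group membership from logged events alone; that is the property that lets flagging, exposure, and measurement share one data layer.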
-
The competitive lines are converging. Optimizely comes from experimentation and now ties feature flags to tests. Amplitude comes from analytics and lets teams launch experiments from the same behavioral data. Datadog moved in from observability by acquiring Eppo, which shows experimentation is getting pulled into broader developer and data platforms.
-
That shift creates a tailwind for Statsig because more companies now expect product changes to be measured like ad spend or sales campaigns. It also explains the positioning: Statsig is not trying to out-monitor Datadog on infrastructure or out-chart Amplitude on standalone analytics; it is trying to own the decision loop around product changes.
The category is heading toward unified product stacks where every release is gated, measured, and explained on the same data layer. That favors platforms like Statsig that start from experimentation and expand outward, because the winning product will be the one that makes shipping and learning feel like a single workflow instead of separate tools stitched together.