OpenAI API Enabled Rapid AI Apps
Dave Rogenmoser, CEO and co-founder of Jasper, on the generative AI opportunity
OpenAI turned frontier models from an infrastructure project into a consumable input, and that is what created the modern AI application layer. Instead of training a model, renting GPUs, and operating inference servers, a startup like Jasper could call an endpoint, pay per generation, and spend its time on prompts, templates, editing flows, and distribution. That shift is why AI writing apps could launch fast and reach real revenue almost immediately.
-
OpenAI launched its first commercial API in June 2020 and said it had already received tens of thousands of applications for access by September 2020. A general text-in, text-out interface meant developers could test many use cases without building model infrastructure first, which sharply widened the number of teams able to ship products.
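To make the "text in, text out" point concrete, here is a minimal sketch of what a pay-per-generation call looked like from the app side: compose a plain-text prompt, build a small JSON body, and POST it to a hosted endpoint. The endpoint path and field names follow the shape of the 2020-era Completions API; the prompt and parameter values are illustrative, not any product's actual request.

```python
import json

# Hosted endpoint an app would POST to (no GPUs or inference servers to run).
API_URL = "https://api.openai.com/v1/completions"

def build_completion_request(prompt: str, max_tokens: int = 64) -> dict:
    """Assemble the JSON body for a single pay-per-generation call."""
    return {
        "model": "davinci",        # original GPT-3 base model name
        "prompt": prompt,          # text in
        "max_tokens": max_tokens,  # caps the text out (and the bill)
        "temperature": 0.7,        # higher = more varied output
    }

# An app would send this with any HTTP client, e.g.:
#   requests.post(API_URL, json=body,
#                 headers={"Authorization": f"Bearer {api_key}"})
body = build_completion_request("Write a tagline for a coffee shop:")
print(json.dumps(body, indent=2))
```

Everything model-specific lives behind that one call, which is why a small team could iterate on prompts and product instead of infrastructure.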
-
Jasper is a clear example of the unlock. It started on vanilla GPT-3, then improved output with prompt engineering, templates, customer ratings, and later fine-tuned models hosted by OpenAI. Jasper reached $42.5M ARR in its first 12 months and was expected to cross $75M in 2022, showing how much value could be created above the model layer.
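The application-layer value described above can be sketched as reusable templates: a few structured inputs from the user get expanded into a carefully engineered prompt before anything reaches the model. The template text and field names below are hypothetical, not Jasper's actual prompts.

```python
# Illustrative template a writing app might maintain and refine over time
# based on customer ratings of the output it produces.
PRODUCT_DESCRIPTION_TEMPLATE = (
    "You are an expert copywriter.\n"
    "Write a {tone} product description for {product}, "
    "aimed at {audience}. Highlight: {benefits}.\n"
    "Description:"
)

def render_prompt(template: str, **fields: str) -> str:
    """Fill a template; the result is what actually gets sent to the model."""
    return template.format(**fields)

prompt = render_prompt(
    PRODUCT_DESCRIPTION_TEMPLATE,
    tone="friendly",
    product="a cold-brew coffee maker",
    audience="busy professionals",
    benefits="speed, low acidity",
)
print(prompt)
```

The user never sees the template, only a form with a few fields, which is how the same base model can power dozens of distinct "tools" inside one product.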
-
The second unlock was open models. Stability AI says Stable Diffusion was publicly released on August 22, 2022, letting developers download and build on top of text-to-image models for free. That changed the market from one dominant hosted provider into a much broader ecosystem of apps, model hosts, and developer tools.
This spread of models is what pushed AI companies toward owning workflow, data, and distribution instead of just model access. As foundation models become easier to buy or download, the durable winners are the ones that sit inside daily work, collect feedback, and turn generic model output into something tuned to a specific job and customer.