Friend Absorbs AI Inference Costs
The company absorbs ongoing LLM inference costs instead of passing them to customers, fixing user costs at the purchase price.
This pricing choice turns Friend into an insurer of its own AI usage. Every extra conversation, check-in, or supportive nudge creates compute cost with no matching revenue after the initial $99 to $129 sale. The business therefore only works if Friend keeps prompts cheap, keeps usage bounded, and uses the no-subscription promise to win buyers away from rivals that meter access through monthly plans.
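The insurer framing can be made concrete with a small break-even sketch: how many months of inference can a one-time sale absorb before the unit loses money? All figures below (hardware cost, per-user inference spend) are illustrative assumptions, not Friend's reported numbers.

```python
def breakeven_months(price: float, hardware_cost: float,
                     monthly_inference_cost: float) -> float:
    """Months of inference a one-time sale can absorb before the unit
    turns unprofitable. All inputs are hypothetical."""
    if monthly_inference_cost <= 0:
        raise ValueError("monthly inference cost must be positive")
    margin = price - hardware_cost
    return margin / monthly_inference_cost

# Assumed: $99 price, $40 build cost, $1.50/month of LLM inference
# per active user. None of these are confirmed figures.
months = breakeven_months(99, 40, 1.50)
print(f"Break-even horizon: {months:.1f} months")
```

The sketch makes the sensitivity obvious: halve the assumed inference cost and the runway doubles, while a heavy user who triples it can push a unit underwater in barely a year of ownership.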
- The cleanest comparison is Limitless, which also sold a $99 pendant but kept a software paywall: free users were capped, and Pro users paid $20 per month for unlimited AI. That model lets heavy users fund their own inference, while Friend takes that burden onto its own P&L.
- In wearables more broadly, Oura uses the opposite structure. It sells the device, then charges $6 per month for deeper insights, using recurring software revenue to fund ongoing product and AI development. Friend is giving up that annuity stream in exchange for a simpler consumer pitch.
- The market signal so far is that standalone AI pendants have struggled when hardware economics and ongoing cloud costs collide. Humane's Ai Pin was discontinued, Limitless was absorbed by Meta, and Friend itself later shifted toward software, showing how hard it is to support an always-on AI device from one-time hardware revenue alone.
The path forward is to make the fixed-price model a wedge, not the whole business. The strongest version of Friend uses low-friction hardware to acquire users, then adds higher-margin software, premium hardware tiers, or wellness features that raise lifetime revenue faster than inference costs rise.