Ramp's AI Flywheel for Extraction
Geoff Charles, VP of Product at Ramp, on Ramp's AI flywheel
The strategic point is that LLMs turned document understanding from a specialized vendor cost center into a cheap native feature inside Ramp. Instead of paying outside OCR vendors to read invoices and receipts, Ramp can use general-purpose models to pull out vendor names, amounts, line items, and contract terms at close to vendor-grade accuracy, then bundle that extraction into bill pay, expense review, and vendor management, where the economic upside is much larger.
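A minimal sketch of what "general models pulling out vendor names, amounts, and line items" can look like in practice: prompt the model for structured JSON, then validate the response before it enters the workflow. The schema, field names, and prompt here are illustrative assumptions, not Ramp's actual data model, and the model call is simulated.

```python
import json
from dataclasses import dataclass

# Hypothetical target schema for invoice extraction (illustrative only).
@dataclass
class LineItem:
    description: str
    amount_cents: int

@dataclass
class Invoice:
    vendor: str
    total_cents: int
    line_items: list

EXTRACTION_PROMPT = """Extract the vendor name, total, and line items from the
invoice text below. Respond with JSON only:
{"vendor": str, "total_cents": int,
 "line_items": [{"description": str, "amount_cents": int}]}

Invoice:
"""

def parse_model_response(raw: str) -> Invoice:
    """Validate the model's JSON and check that line items sum to the total."""
    data = json.loads(raw)
    items = [LineItem(**item) for item in data["line_items"]]
    invoice = Invoice(data["vendor"], data["total_cents"], items)
    if sum(i.amount_cents for i in invoice.line_items) != invoice.total_cents:
        raise ValueError("line items do not sum to total; route to human review")
    return invoice

# Simulated model output, standing in for a real LLM call.
raw = ('{"vendor": "Acme Cloud", "total_cents": 12500, "line_items": '
       '[{"description": "Compute", "amount_cents": 10000}, '
       '{"description": "Storage", "amount_cents": 2500}]}')
invoice = parse_model_response(raw)
print(invoice.vendor, invoice.total_cents)  # Acme Cloud 12500
```

The arithmetic check is the point: because extraction feeds bill pay directly, a cheap consistency test decides whether the output is trustworthy or needs a second look.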
-
The real comparison is not just model accuracy but workflow economics. Veryfi charges around $0.16 per invoice on its starter plan, and AWS's per-task pricing for human review shows how quickly labeling costs stack up in older human-in-the-loop pipelines. Ramp argues generic LLMs are now good enough that this specialized extraction layer no longer needs a dedicated vendor markup.
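The economics can be sketched as a back-of-envelope break-even. Only the $0.16-per-invoice Veryfi starter-plan figure comes from the text; the LLM cost per invoice and the annual volume below are illustrative assumptions.

```python
# From the text: Veryfi's starter plan charges ~$0.16 per invoice.
VENDOR_COST_PER_INVOICE = 0.16

def annual_savings(invoices_per_year: int, llm_cost_per_invoice: float) -> float:
    """Savings from replacing a per-invoice vendor fee with in-house LLM extraction."""
    return invoices_per_year * (VENDOR_COST_PER_INVOICE - llm_cost_per_invoice)

# Assumed: 1M invoices/year and ~$0.01 of model tokens per invoice.
print(round(annual_savings(1_000_000, 0.01), 2))
```

The exact token cost matters less than the direction: as model prices fall, the spread between the vendor fee and in-house extraction only widens.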
-
That matters because extraction is the gateway step to higher-value products. Once Ramp can read a contract or invoice cheaply, it can flag minibar charges, surface renewal dates, unify card and AP vendors, and draft negotiation emails. The customer is not buying OCR; they are buying fewer manual reviews and more savings on spend.
-
The closest parallel is AI bookkeeping and AP automation more broadly. Truewind describes the same shift: reading invoices and contracts automatically so humans only verify edge cases. In that model, margins move away from labor-heavy review toward software, and the winning product is the one that captures the workflow after extraction, not just the raw text.
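The "humans verify edge cases" pattern reduces to a confidence-threshold router: auto-approve high-confidence extractions, queue the rest for review. The threshold value and confidence scores here are illustrative assumptions, not any vendor's actual settings.

```python
# Illustrative review threshold; real systems would tune this per document type.
REVIEW_THRESHOLD = 0.95

def route(extractions):
    """Split (doc_id, confidence) pairs into auto-approved and human-review queues."""
    auto, review = [], []
    for doc_id, confidence in extractions:
        (auto if confidence >= REVIEW_THRESHOLD else review).append(doc_id)
    return auto, review

auto, review = route([("inv-001", 0.99), ("inv-002", 0.71), ("inv-003", 0.97)])
print(auto, review)  # ['inv-001', 'inv-003'] ['inv-002']
```

The margin story lives in this split: every document that clears the threshold is software cost instead of labor cost, so small gains in model accuracy shrink the review queue directly.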
Going forward, this pushes spend management toward a platform battle where document extraction is table stakes and proprietary customer data becomes the moat. As models keep getting cheaper, the advantage shifts to companies like Ramp that sit inside the payment flow, see the contract, the invoice, and the card swipe together, and can turn that full context into automated decisions.