LLMs Unlock No-Code Document Automation
Thilo Huellmann, CTO of Levity, on using no-code AI for workflow automation
This is the key unlock that can turn Levity from a classifier builder into a tool for automating messy back-office decisions. The bottleneck in custom document workflows is usually not model architecture; it is collecting and labeling enough examples for every niche document type. Large language models matter because they can pull structure from rare, weird files with far less setup, which makes long-tail use cases economically viable for mid-sized teams.
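A minimal sketch of what that setup reduction looks like: instead of labeling thousands of examples per document type, the schema goes into the prompt. The `call_llm` function below is a hypothetical stand-in for any LLM API, stubbed with a canned response so the sketch is self-contained; the field names are illustrative, not Levity's.

```python
import json

# Hypothetical stand-in for a real LLM API call. A deployed version would
# send the prompt over the network; here it returns a canned response so
# the sketch runs without credentials.
def call_llm(prompt: str) -> str:
    return json.dumps({"patient_name": "Jane Doe", "test_date": "2022-03-14",
                       "left_ear_db": 25, "right_ear_db": 30})

def extract_fields(document_text: str, schema: dict) -> dict:
    """Ask the model to pull a fixed schema out of free-form document text.
    The schema in the prompt replaces a per-document-type training set."""
    prompt = (
        "Extract the following fields as JSON from the document below.\n"
        f"Fields: {json.dumps(schema)}\n"
        f"Document:\n{document_text}"
    )
    return json.loads(call_llm(prompt))

fields = extract_fields(
    "Audiometry report for Jane Doe, 2022-03-14. Left ear: 25 dB ...",
    {"patient_name": "string", "test_date": "date",
     "left_ear_db": "number", "right_ear_db": "number"},
)
```

The point is the economics: one prompt template covers a document type that would otherwise need its own labeled dataset.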
-
Levity already handles the full path from raw files to action. A user can pull old emails, PDFs, or Drive files into Levity, run OCR and preprocessing, train a model, then route the result into the workflow. LLM-based extraction fits naturally into that existing pipeline because it removes the most manual step: labeling thousands of examples.
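The ingest-to-action path described above can be sketched as a few composed steps. All four functions are stubs standing in for the real thing (email/Drive connectors, an OCR engine, a trained or LLM-backed model, and downstream app integrations); the names and labels are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class Document:
    name: str
    raw_bytes: bytes

# Stub: a real pipeline would run an OCR engine on scanned pages.
def run_ocr(doc: Document) -> str:
    return doc.raw_bytes.decode("utf-8", errors="ignore")

# Stub: a real pipeline would call a trained model or an LLM here.
def classify(text: str) -> str:
    return "invoice" if "invoice" in text.lower() else "other"

# Route the labeled document to a downstream destination.
def route(doc: Document, label: str, destinations: dict) -> str:
    return destinations.get(label, destinations["other"])

destinations = {"invoice": "accounting-queue", "other": "manual-review"}
doc = Document("scan_001.pdf", b"Invoice #42 from Acme GmbH")
label = classify(run_ocr(doc))
target = route(doc, label, destinations)  # "accounting-queue"
```

Swapping the `classify` stub for an LLM extraction call leaves the rest of the pipeline untouched, which is why the fit is natural.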
-
The competitive gap is between broad automation tools and narrow document AI vendors. Zapier is strong at moving data between apps but weak at understanding ambiguous content inside a file. Vertical document tools work well for common forms like invoices, but not for hearing-test PDFs or lab images that only a few dozen companies care about. That white space is where Levity sits.
-
This also helps explain Levity's infrastructure focus. The product has to support many customer-specific models at low price points, often with spiky usage. AWS made SageMaker Serverless Inference generally available in April 2022, and Levity was already using it that year to avoid paying for always-on endpoints, which matched its need to keep custom automation affordable.
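The shape of such an endpoint configuration can be sketched as a plain dict, built locally so it runs without AWS credentials. The `ServerlessConfig` keys (`MemorySizeInMB`, `MaxConcurrency`) are from the SageMaker `CreateEndpointConfig` API; the model name and sizing values are illustrative assumptions, not Levity's actual settings.

```python
# In production this dict would be passed to boto3's SageMaker client:
#   client.create_endpoint_config(**config)
def serverless_endpoint_config(model_name: str, memory_mb: int = 2048,
                               max_concurrency: int = 5) -> dict:
    return {
        "EndpointConfigName": f"{model_name}-serverless",
        "ProductionVariants": [{
            "VariantName": "AllTraffic",
            "ModelName": model_name,
            # With ServerlessConfig there is no always-on instance: the
            # endpoint scales to zero between requests, so a spiky,
            # low-volume customer-specific model costs nothing while idle.
            "ServerlessConfig": {
                "MemorySizeInMB": memory_mb,
                "MaxConcurrency": max_concurrency,
            },
        }],
    }

config = serverless_endpoint_config("customer-123-doc-classifier")
```

One such config per customer model keeps per-tenant costs proportional to actual usage rather than to the number of deployed models.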
-
The next step is systems that read mixed inputs (text, tables, images) and then make workflow decisions without a custom training project for each task. If that works, no-code AI moves from tagging and routing into true operations software for the long tail of industry-specific processes that still live in inboxes, PDFs, and spreadsheets.