Federated Models as No-Code Moat
Thilo Huellmann, CTO of Levity, on using no-code AI for workflow automation
The strategic point is that shared model improvement can become the moat for a no-code AI platform serving fragmented industries. Levity is describing a setup where each small operator keeps its own raw emails, PDFs, or images while the platform aggregates learning across many similar workflows, so a 200-person freight broker can benefit from patterns seen across dozens of peers instead of relying only on its own limited history.
-
This matters most in messy classification jobs. Levity is built for unstructured inputs like customer emails, scanned PDFs, and images, where each small company may only have a thin training set. Combining learning across many customers can make the model better at edge cases because the examples come from different senders, formats, and writing styles.
-
The technical pattern behind this is federated learning. In federated systems, participants keep training data local and send model updates or aggregated signals back to a shared system, which lets a global model improve without centralizing any company's raw records. That is the concrete meaning of pooling data without direct data access.
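To make the mechanics concrete, here is a minimal sketch of the federated averaging (FedAvg) pattern: each simulated client trains a small logistic-regression model on its own local data, and only the resulting weights travel back to the server, which averages them. This is an illustrative toy, not Levity's actual implementation; the function names and the logistic-regression choice are assumptions for the example.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's round: gradient steps on local data only.
    Raw X and y never leave the client; only weights are returned."""
    w = weights.copy()
    for _ in range(epochs):
        preds = 1 / (1 + np.exp(-X @ w))       # logistic predictions
        grad = X.T @ (preds - y) / len(y)       # cross-entropy gradient
        w -= lr * grad
    return w

def federated_average(global_w, client_datasets):
    """Server round: collect each client's locally trained weights and
    average them, weighted by local dataset size (FedAvg)."""
    updates = [local_update(global_w, X, y) for X, y in client_datasets]
    sizes = np.array([len(y) for _, y in client_datasets], dtype=float)
    return np.average(updates, axis=0, weights=sizes)

if __name__ == "__main__":
    # Simulate three small companies whose data stays local.
    rng = np.random.default_rng(0)
    true_w = np.array([1.0, -2.0, 0.5])
    clients = []
    for _ in range(3):
        X = rng.normal(size=(50, 3))
        y = (X @ true_w > 0).astype(float)
        clients.append((X, y))

    w = np.zeros(3)
    for _ in range(10):                         # ten communication rounds
        w = federated_average(w, clients)
    print("global weights after 10 rounds:", w)
```

Real deployments add secure aggregation and differential privacy on top of this loop so the server cannot reconstruct any single client's data from its update, but the division of labor is the same: training happens where the data lives, and only model deltas are shared.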
-
If Levity can make this work, retention rises for a very practical reason. The customer is not just buying a workflow tool for routing Gmail messages or classifying documents. It is plugging into a network whose models improve as more similar companies use it, which is a stronger lock-in than ordinary automation software.
The next step is that no-code AI vendors will try to turn many small customer datasets into one shared performance engine. In verticals like logistics, that shifts the advantage away from whoever has the biggest internal dataset and toward whoever builds the best privacy-preserving collaboration layer and embeds it deepest into daily workflows.