dbt's path to multi-cloud workflows
Tristan Handy, CEO of dbt Labs, on dbt’s multi-cloud tailwinds
This marks a shift from forcing monetization through migration to growing by becoming the shared workflow layer for mixed teams. In practice, large companies already have analysts in dbt Cloud for browser-based development and governance, while engineers keep parts of the project on dbt Core, in Git and local tooling. The win is not getting every seat onto one SKU at once; it is letting those teams ship one project, one set of tests, and one set of business definitions without friction.
-
dbt Core is the open-source framework where teams write SQL, tests, and metadata. dbt Cloud sells the surrounding workflow: IDE, CI, scheduler, docs hosting, semantic layer, and governance. That makes Cloud easiest to buy when a company wants reliability and collaboration, but it also means some users will rationally stay on Core.
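As a rough sketch of what "SQL, tests, and metadata" means in Core: a model is a plain SQL `SELECT` in a file like `models/orders.sql`, while tests and documentation live alongside it in YAML. The model and column names below are illustrative, not from any real project:

```yaml
# models/schema.yml -- hypothetical example; model and column names are illustrative
version: 2
models:
  - name: orders                        # pairs with models/orders.sql, a plain SELECT
    description: "One row per order"    # metadata surfaced in dbt docs
    columns:
      - name: order_id
        description: "Primary key"
        tests:                          # built-in dbt schema tests
          - unique
          - not_null
```

Because this file travels with the project in Git, the same tests and definitions run whether a teammate invokes them from dbt Core locally or from a dbt Cloud job.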
-
This is especially true in enterprises, where one data org can have many teams and more than one warehouse. dbt has described expansion as starting with one team, then spreading across other teams and even across Snowflake and Databricks inside the same company. A forced full Cloud migration would fight how these orgs actually work.
-
The broader product direction makes the bridge more valuable. As dbt adds orchestration, observability, cataloging, and a control plane above the warehouse, it needs Core users and Cloud users to share metadata and workflows cleanly. Multi-cloud adoption and Iceberg support strengthen that layer above any single warehouse, instead of pushing customers deeper into one closed stack.
Going forward, dbt is likely to grow by turning mixed Core and Cloud deployments into a smooth upgrade path, not a hard switch. If it becomes the place where teams manage shared logic, schedules, lineage, and quality across Snowflake, Databricks, and Iceberg-based data, Cloud becomes the operating layer companies add around Core, rather than a replacement they must swallow all at once.