Semantic Layer Centralizes BI Metrics

George Xing, co-founder and CEO of Supergrain, on the future of business intelligence

Unbundling shifted power in BI from the tool that shows charts to the layer that defines the numbers. Older systems bundled storage, transformation, metric logic, and dashboards in one product. Modern stacks split those jobs across a warehouse like Snowflake or BigQuery, transformation tools, and front ends like Looker or Tableau. That made analytics more flexible, but it also created a new problem: a metric like revenue or conversion rate can now be calculated differently in each surface unless one shared semantic layer sits above them all.
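A toy sketch makes the consistency problem concrete. Here two hypothetical tools read the same warehouse table but embed their own definition of "conversion rate" and report different numbers; the data and both definitions are invented for illustration.

```python
# Two tools query the same sessions table but carry their own SQL-style
# logic for "conversion rate" — so they disagree on the same data.
sessions = [
    {"session_id": 1, "ordered": True,  "bounced": False},
    {"session_id": 2, "ordered": False, "bounced": True},
    {"session_id": 3, "ordered": True,  "bounced": False},
    {"session_id": 4, "ordered": False, "bounced": False},
]

# Dashboard tool's definition: orders / all sessions.
dashboard_rate = sum(s["ordered"] for s in sessions) / len(sessions)

# Growth notebook's definition: orders / non-bounced sessions only.
engaged = [s for s in sessions if not s["bounced"]]
notebook_rate = sum(s["ordered"] for s in engaged) / len(engaged)

print(dashboard_rate)  # 0.5
print(notebook_rate)   # 0.666... — same warehouse, different "conversion"
```

Neither number is wrong in isolation; the disagreement comes from each tool owning its own metric logic instead of consuming one shared definition.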

  • In the old model, one vendor owned the whole workflow, from storing data to drawing the chart. In the new model, BI tools query data that already lives in the warehouse, which is why Looker is documented as working on top of BigQuery and other analytical stores rather than replacing them.
  • The practical tradeoff is flexibility versus consistency. Teams can use notebooks, reverse ETL, planning tools, and dashboards against the same warehouse, but each tool can carry its own SQL logic. That is how finance, product, and growth teams end up with different answers to the same basic metric.
  • The market response has been to rebuild the missing middle layer. Looker has been opening its semantic layer to Tableau and other tools, while Snowflake now offers semantic models and semantic views. Both moves show where value is heading: toward the system that defines metrics once and lets many apps consume them.
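The "define once, consume everywhere" pattern behind these products can be sketched in a few lines. This is not the API of Looker, Snowflake, or any real semantic layer; `MetricRegistry` and the metric names are illustrative, and the rendered SQL is deliberately minimal.

```python
# Minimal sketch of a shared semantic layer: metric logic lives in one
# registry, and every consumer (dashboard, notebook, planning tool) asks
# the registry to render SQL instead of writing its own.
class MetricRegistry:
    def __init__(self):
        self._metrics = {}

    def define(self, name, sql_expression, source_table):
        """Register a metric's canonical SQL expression and source table."""
        self._metrics[name] = (sql_expression, source_table)

    def render_sql(self, name, group_by=None):
        """Render the one canonical query, optionally grouped by a dimension."""
        expr, table = self._metrics[name]
        select = f"{expr} AS {name}"
        if group_by:
            return (f"SELECT {group_by}, {select} FROM {table} "
                    f"GROUP BY {group_by}")
        return f"SELECT {select} FROM {table}"

registry = MetricRegistry()
registry.define(
    "conversion_rate",
    "COUNT(DISTINCT order_id) * 1.0 / COUNT(DISTINCT session_id)",
    "events",
)

# Every front end gets the same definition, so finance and product agree.
print(registry.render_sql("conversion_rate"))
print(registry.render_sql("conversion_rate", group_by="channel"))
```

The design point is that the registry, not the consuming tool, is the control surface: changing one definition changes the number everywhere it appears.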

The next step is a stack where the warehouse remains the data home, but metric definitions become a separate control point shared across BI, AI, planning, and operational tools. Companies that own that semantic layer will shape how data is consumed across the organization, even if they do not own the dashboard where the number appears.