dbt™ Semantic Layer vs. Snowflake Semantic Views: A Complete Technical Comparison

Where should governed metrics live — in your dbt™ project or in Snowflake itself? With Snowflake Semantic Views now GA and the dbt™ Semantic Layer expanding fast, analytics teams finally have a real architectural choice — and a hybrid path that gives you both.

Fabio Di Leta · Mar 18, 2026 · 6 min read


If your team runs on Snowflake and debates where to define governed metrics — in dbt™ YAML or Snowflake DDL — you now have a third option that combines both. Snowflake Semantic Views reached GA in mid-2025, with standard SQL querying going GA on March 2, 2026. The dbt™ Semantic Layer, powered by the now open-source MetricFlow engine, continues expanding integrations and recently overhauled its YAML spec (January 2026). Meanwhile, a hybrid approach using the dbt_semantic_view package lets teams define Snowflake Semantic Views as dbt™ models — version-controlled, lineage-tracked, and deployed via CI/CD. This post breaks down all three approaches with the depth analytics engineers need and the strategic context data leaders want.

How the dbt™ Semantic Layer Actually Works Under the Hood

The dbt™ Semantic Layer is a centralized metrics framework that lets teams define business logic once in YAML and query it from any integrated tool. Its engine, MetricFlow, was built by Transform (acquired by dbt™ Labs in February 2023), went GA at Coalesce 2023, and was open-sourced under Apache 2.0 in October 2025 — a major milestone for the ecosystem.

The architecture has four layers. MetricFlow and the dbt™ Semantic Interfaces (both open source) handle metric definitions and SQL compilation. The proprietary Service Layer and Semantic Layer APIs (JDBC via Arrow Flight SQL plus GraphQL) live in dbt™ Cloud and coordinate query execution. When a BI tool requests a metric, MetricFlow builds a semantic graph from your YAML definitions, resolves optimal join paths between tables via entity relationships, and compiles warehouse-specific SQL. The generated query executes directly against your data platform; results flow back through the API gateway. Data passes through dbt™ Labs servers ephemerally (encrypted, never persisted).

Semantic models map one-to-one with dbt™ models, declaring three building blocks:

  • Entities — primary, foreign, unique, or natural keys

  • Dimensions — categorical or time-based attributes

  • Measures — aggregations like sum, count_distinct, avg, median

Metrics then reference measures with five supported types: simple, derived, ratio, cumulative, and conversion.

Here's how an e-commerce orders model looks in the legacy YAML spec:

semantic_models:
  - name: order_items
    model: ref('order_items')
    defaults:
      agg_time_dimension: ordered_at
    entities:
      - name: order_item
        type: primary
        expr: order_item_id
      - name: order_id
        type: foreign
      - name: customer
        type: foreign
    measures:
      - name: revenue
        agg: sum
        expr: product_price
      - name: order_count
        agg: count_distinct
        expr: order_id
    dimensions:
      - name: ordered_at
        type: time
        type_params:
          time_granularity: day
      - name: channel
        type: categorical


Metrics are then defined separately:

metrics:
  - name: revenue
    type: simple
    type_params:
      measure: revenue
  - name: average_order_value
    type: derived
    type_params:
      expr: revenue / order_count
      metrics:
        - name: revenue
        - name: order_count

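Ratio and cumulative metrics follow the same pattern. A minimal sketch in the legacy spec, building on the measures above (the 7-day window is an illustrative choice):

```yaml
metrics:
  - name: average_order_value_ratio
    type: ratio
    type_params:
      numerator: revenue        # measure defined in the semantic model
      denominator: order_count
  - name: trailing_week_revenue
    type: cumulative
    type_params:
      measure: revenue
      window: 7 days            # rolling window over the agg_time_dimension
```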

The modernized spec (January 2026, available now on the Fusion engine and coming to dbt™ Core™ v1.12) collapses this structure significantly. Measures fold into metrics, semantic annotations embed directly in models: YAML entries, and the separate semantic_models: top-level key disappears. A migration tool (dbt-autofix deprecations --semantic-layer) handles the transition.
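For illustration only — the field layout below is hypothetical, inferred from the description above rather than from the published spec — the collapsed structure might look roughly like:

```yaml
# Hypothetical sketch of the modernized spec; names are illustrative
models:
  - name: order_items
    config:
      agg_time_dimension: ordered_at   # semantic annotation embedded in the model entry
    dimensions:
      - name: channel
        type: categorical

metrics:
  - name: revenue      # measure folded directly into the metric definition
    agg: sum
    expr: product_price
```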

Querying happens through multiple channels. The CLI (dbt sl query --metrics revenue --group-by metric_time__month) works for development. The JDBC API (Arrow Flight SQL) and GraphQL API serve downstream tools. Native BI integrations include Tableau (GA, via Tableau Exchange connector), Google Sheets, Microsoft Excel, Hex, Mode, Klipfolio PowerMetrics, and Push.ai. Power BI entered private beta in March 2025. Any non-integrated tool can use dbt™'s "exports" feature to materialize metrics as physical tables or views.
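Exports hang off saved queries in the project YAML. A minimal sketch, with names illustrative and the dimension reference assuming the `order_item` entity from the earlier example:

```yaml
saved_queries:
  - name: revenue_by_channel
    query_params:
      metrics:
        - revenue
      group_by:
        - Dimension('order_item__channel')
    exports:
      - name: revenue_by_channel
        config:
          export_as: table   # materialize as a physical table for non-integrated tools
```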

The Semantic Layer supports Snowflake, BigQuery, Databricks, Redshift, Postgres, and Trino — making it the only multi-warehouse option in this comparison.

Key constraints

  • The full Semantic Layer (APIs, integrations, caching) requires dbt™ Cloud on a Starter plan ($100/dev seat/month) or higher. dbt™ Core™ users can define metrics and query locally via the mf CLI, but cannot access the hosted APIs or integrations.

  • Fan-out joins are rejected to prevent row duplication.

  • Measure names must be globally unique across all semantic models.

  • The modernized YAML spec is live on Fusion but the legacy spec still works — migration is opt-in.

Snowflake Semantic Views: Metrics as Native Database Objects

Snowflake Semantic Views take a fundamentally different approach: they are schema-level database objects that store business semantics (metrics, dimensions, facts, relationships) directly in Snowflake's metadata catalog. Unlike regular views, they don't contain a query definition — they declaratively describe the business meaning of your physical tables, and Snowflake's query engine resolves the SQL at query time.

GA timeline

| Milestone | Date |
| --- | --- |
| Preview launch | April 17, 2025 |
| Core DDL generally available | Snowflake Summit, June 2025 |
| Derived metrics GA | September 30, 2025 |
| Standard SQL querying GA | March 2, 2026 |
| Semi-additive metrics (NON ADDITIVE BY) | Preview, March 5, 2026 |
| Tableau TDS export | Preview, March 5, 2026 |
| Semantic View Autopilot | GA, March 2026 |

Semantic Views are available to all Snowflake editions (Standard and above) at no additional cost.

Defining a Semantic View

The DDL syntax uses five declarative blocks — TABLES, RELATIONSHIPS, FACTS, DIMENSIONS, and METRICS — in that exact order:

CREATE OR REPLACE SEMANTIC VIEW ecommerce_analysis

  TABLES (
    customers AS analytics.ecommerce.dim_customers
      PRIMARY KEY (customer_id),
    orders AS analytics.ecommerce.fct_orders
      PRIMARY KEY (order_id),
    order_items AS analytics.ecommerce.fct_order_items
      PRIMARY KEY (order_item_id)
  )

  RELATIONSHIPS (
    orders(customer_id) REFERENCES customers,
    order_items(order_id) REFERENCES orders
  )

  FACTS (
    order_items.product_price AS product_price,
    orders.order_total AS o_totalprice
  )

  DIMENSIONS (
    customers.customer_name AS c_name
      WITH SYNONYMS = ('buyer', 'purchaser')
      COMMENT = 'Full customer name',
    customers.signup_channel AS c_channel,
    orders.order_date AS o_orderdate
  )

  METRICS (
    orders.total_revenue AS SUM(orders.order_total)
      COMMENT = 'Gross revenue across all orders',
    orders.order_count AS COUNT(order_id),
    orders.avg_order_value AS AVG(orders.order_total),
    customers.customer_count AS COUNT(customers.customer_id),
    -- Derived metric (view-scoped, no table prefix):
    revenue_per_customer AS orders.total_revenue / customers.customer_count
  );


Querying Semantic Views

Two query methods are now GA. The original SEMANTIC_VIEW() clause:

SELECT * FROM SEMANTIC_VIEW(
  ecommerce_analysis
  METRICS orders.total_revenue, orders.order_count
  DIMENSIONS customers.signup_channel
  WHERE orders.order_date >= '2025-01-01'
)


The newer standard SQL approach (GA March 2, 2026):

SELECT
  signup_channel,
  AGG(total_revenue),
  AGG(order_count)
FROM ecommerce_analysis
GROUP BY signup_channel
HAVING AGG(total_revenue) > 10000


The standard SQL approach is the real unlock for BI compatibility. Any SQL-capable tool — Sigma, Tableau, Power BI, Mode, Hex, or a Python notebook — can query semantic views without special connectors.

Key limitations

  • No FILTERS clause in the DDL — pre-defined row-level filters exist only in the YAML spec, not SQL syntax.

  • Derived metrics are view-scoped (no table alias prefix) and cannot be referenced by regular metrics.

  • You cannot mix FACTS and METRICS in the same SEMANTIC_VIEW() clause.

  • Semantic views cannot be replicated across Snowflake accounts through standard replication.

  • The Cortex Analyst integration imposes a 2 MB size limit on the semantic model definition.

  • Snowflake recommends no more than 10 tables and 50 columns per view for performance.

Cortex Analyst integration

Semantic views are the recommended foundation for Snowflake's natural-language querying. Cortex Analyst reads the semantic view definition — including synonyms, descriptions, and verified query examples — and generates correct SQL. The AI_SQL_GENERATION and AI_QUESTION_CATEGORIZATION parameters in the DDL let teams provide custom instructions to the underlying LLM. Snowflake reports approximately 85–90% query accuracy with semantic views vs. ~40% without.

The Hybrid Path: Managing Snowflake Semantic Views Through dbt™

The dbt_semantic_view package from Snowflake Labs introduces a semantic_view materialization type that bridges both worlds. Define Snowflake Semantic Views as .sql files in your dbt™ project, deploy them with dbt run, and track them in the dbt™ DAG — the resulting object is a fully native Snowflake Semantic View queryable via standard SQL.

Installation

# packages.yml -- repository path assumed from the package's Snowflake Labs home; verify before pinning
packages:
  - git: "https://github.com/Snowflake-Labs/dbt_semantic_view"
    revision: v1.0.3


Then run dbt deps.

Model definition

-- models/semantic/ecommerce_metrics.sql
{{ config(materialized='semantic_view') }}

TABLES (
  customers AS {{ ref('dim_customers') }}
    PRIMARY KEY (customer_id)
    WITH SYNONYMS = ('buyers', 'accounts')
    COMMENT = 'Customer dimension table',
  orders AS {{ ref('fct_orders') }}
    PRIMARY KEY (order_id)
    COMMENT = 'Order fact table'
)
RELATIONSHIPS (
  orders(customer_id) REFERENCES customers
)
DIMENSIONS (
  customers.signup_channel AS c_channel,
  orders.order_date AS o_orderdate
)
METRICS (
  orders.total_revenue AS SUM(orders.order_total),
  orders.order_count AS COUNT(order_id)
)

The {{ ref() }} and {{ source() }} calls resolve table references and register the semantic view as a node in the dbt™ DAG. When dbt run executes, the package wraps the body in CREATE OR REPLACE SEMANTIC VIEW DDL. The result is a native Snowflake Semantic View — queryable via SEMANTIC_VIEW(), standard SQL with AGG(), or Cortex Analyst — defined in version-controlled code with full CI/CD, PR review, and lineage visibility.
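Once deployed, the object behaves like any other semantic view. For example, using the standard SQL syntax shown earlier against the model defined above:

```sql
-- Query the dbt-managed semantic view like any native one
SELECT
  signup_channel,
  AGG(total_revenue)
FROM ecommerce_metrics
GROUP BY signup_channel;
```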

Package caveats

  • persist_docs does not work — documentation must use inline COMMENT = '...' syntax in the DDL.

  • It is a Snowflake Labs community package, not native to the dbt-snowflake adapter (an open feature request exists at dbt-labs/dbt-adapters#1260).

  • For dbt™ Fusion users, set static_analysis: off for semantic view files.

  • Current version: v1.0.3 (released October 2025, Apache 2.0 license).

Key Technical Differences

The differences between dbt™ Semantic Layer and Snowflake Semantic Views go deeper than syntax.

Query architecture

The dbt™ Semantic Layer operates as a middleware proxy — every metric query routes through dbt™ Cloud, where MetricFlow compiles it to SQL, executes it against your warehouse, and returns results. This adds a network hop and latency but provides a uniform API regardless of warehouse.

Snowflake Semantic Views execute entirely within Snowflake's query engine with zero external dependencies. Any SQL client works immediately; no special drivers or tokens are needed.

Relationship handling

Snowflake uses explicit DDL relationships — you declare orders(customer_id) REFERENCES customers and Snowflake resolves join paths. MetricFlow uses entity-based implicit joins — you declare entity types (primary, foreign) and the engine infers optimal join paths through a semantic graph. MetricFlow handles multi-hop joins automatically; Snowflake's explicit declarations give more direct control.

Metric type coverage

| Metric type | dbt™ Semantic Layer | Snowflake Semantic Views |
| --- | --- | --- |
| Simple aggregations | ✅ | ✅ |
| Derived metrics | ✅ | ✅ (GA Sep 2025) |
| Ratio metrics | ✅ (native type) | ✅ (via derived) |
| Cumulative / running totals | ✅ (native type) | ⚠️ (manual window functions) |
| Conversion funnel metrics | ✅ (native type) | ⚠️ (manual) |
| Semi-additive metrics | ⚠️ (workaround) | ✅ (Preview, Mar 2026) |
| Window function metrics | ⚠️ (workaround) | ✅ (SQL expressions) |
| Automatic time grains | ✅ (date__month, date__year) | ❌ (manual dimensions) |

Full comparison

| Dimension | dbt™ Semantic Layer | Snowflake Semantic Views |
| --- | --- | --- |
| Query interface | MetricFlow API (JDBC/GraphQL) via dbt™ Cloud | Native SQL (AGG() or SEMANTIC_VIEW()) |
| Warehouse support | Snowflake, BigQuery, Databricks, Redshift, Postgres, Trino | Snowflake only |
| Infrastructure cost | dbt™ Cloud required (Starter+, ~$100/seat/mo) | Zero additional cost (included with Snowflake) |
| BI tool access | Native integrations or JDBC | Any SQL tool — no connectors needed |
| Relationship model | Entity-based implicit joins | Explicit DDL foreign key references |
| Cumulative / conversion metrics | Native types | Manual (window functions / complex SQL) |
| AI / NL querying | dbt™ MCP Server, GraphQL for agents | Cortex Analyst, Cortex Agents, Autopilot |
| Version control | Native (YAML in Git) | Via hybrid dbt™ package or manual YAML export |
| GA status | GA since October 2023 | Core DDL GA mid-2025; SQL querying GA March 2026 |

When to Use Which

Choose the dbt™ Semantic Layer if…

  • Your data platform spans multiple warehouses (e.g., Snowflake + BigQuery or Databricks).

  • You need conversion funnel or cumulative metrics as first-class types.

  • Your BI stack includes tools with native dbt™ Semantic Layer integrations (Tableau via Exchange, Hex, Mode, Google Sheets).

  • You want automatic time grain shortcuts across all metrics without defining extra dimensions.

  • Metric governance and lifecycle management (versioning, deprecation, changelog) via Git is a priority.

Choose Snowflake Semantic Views if…

  • Your data lives exclusively in Snowflake and you want zero external dependencies.

  • You're investing in Cortex Analyst and Snowflake Intelligence — semantic views are the native context layer.

  • You need every SQL client and BI tool to work immediately without special connectors or proxy setup.

  • You want a no-cost semantic layer included with your existing Snowflake contract.

  • Your team works SQL-first and prefers DDL to YAML.

Choose the hybrid (dbt_semantic_view) if…

  • You run dbt™ on Snowflake (e.g., via a platform like Paradime) and want Git-based governance over native Snowflake objects.

  • You want dbt run, dbt test, PR-based review, and CI/CD — but the resulting semantic objects need to be queryable by any SQL tool without a proxy.

  • You're building for Cortex Analyst but need traceability back to your dbt™ data models.

Conclusion

The semantic layer landscape has shifted from a single-vendor story to a genuine architectural choice. The dbt™ Semantic Layer offers warehouse portability and the richest metric type system — conversion funnels, cumulative windows, automatic time grains — at the cost of a dbt™ Cloud dependency and proxy-based query routing. Snowflake Semantic Views deliver zero-infrastructure simplicity and universal SQL access, with the March 2026 GA of standard SQL querying removing the last major adoption barrier. The dbt_semantic_view package creates a genuinely useful middle path that preserves dbt™ governance while producing native Snowflake objects.

The emerging Open Semantic Interchange (OSI) initiative — co-founded by dbt™ Labs, Snowflake, and Salesforce — signals that these two approaches may converge toward a vendor-neutral standard. Until then, the choice is practical: multi-warehouse teams lean dbt™, Snowflake-native teams lean Semantic Views, and teams wanting both governance and native queryability should explore the hybrid package today.


Copyright © 2026 Paradime Labs, Inc.

Made with ❤️ in San Francisco ・ London

*dbt® and dbt Core® are federally registered trademarks of dbt Labs, Inc. in the United States and various jurisdictions around the world. Paradime is not a partner of dbt Labs. All rights therein are reserved to dbt Labs. Paradime is not a product or service of or endorsed by dbt Labs, Inc.
