SQL Keywords

DATE_TRUNC / TRUNC

Feb 23, 2026 · 5 min read

Category: Date/Time Functions

Platform Support:

✅ Snowflake (DATE_TRUNC) | ✅ BigQuery (DATE_TRUNC, TIMESTAMP_TRUNC) | ✅ Databricks (DATE_TRUNC)

Description

Truncates a date or timestamp to a specified precision level (year, month, week, day, hour, etc.). This is essential for time-based grouping, aggregations, and date comparisons. It "rounds down" dates to the beginning of the specified time unit.
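The "rounds down" behavior can be sketched in plain Python. This illustrative helper (not any engine's actual implementation) mirrors what `DATE_TRUNC('MONTH', ts)` does to a timestamp:

```python
from datetime import datetime

def trunc_month(ts: datetime) -> datetime:
    # "Round down" to the first instant of the month,
    # mirroring DATE_TRUNC('MONTH', ts)
    return ts.replace(day=1, hour=0, minute=0, second=0, microsecond=0)

print(trunc_month(datetime(2024, 3, 15, 14, 35, 42)))  # 2024-03-01 00:00:00
```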

Syntax by Platform

Snowflake:

DATE_TRUNC(unit, date_or_timestamp)
-- Units: YEAR, QUARTER, MONTH, WEEK, DAY, HOUR, MINUTE, SECOND

BigQuery:

DATE_TRUNC(date_expression, date_part)
TIMESTAMP_TRUNC(timestamp_expression, timestamp_part)
-- Parts: YEAR, QUARTER, MONTH, WEEK, DAY, HOUR, MINUTE, SECOND

Databricks:

DATE_TRUNC(unit, date_or_timestamp)
-- Units: YEAR, QUARTER, MONTH, WEEK, DAY, HOUR, MINUTE, SECOND, MILLISECOND, MICROSECOND

Platform-Specific Notes

Snowflake:

  • Single function for both dates and timestamps

  • Week starts on Monday by default

  • Week truncation behavior is governed by the WEEK_START session parameter

  • Return type matches the input type (DATE in, DATE out; TIMESTAMP in, TIMESTAMP out)

BigQuery:

  • Separate functions: DATE_TRUNC for dates, TIMESTAMP_TRUNC for timestamps

  • Week starts on Sunday by default

  • Can specify WEEK(MONDAY) for different start day

  • TIMESTAMP_TRUNC accepts an optional time zone argument (defaults to UTC)

Databricks:

  • Works with DATE and TIMESTAMP types

  • Supports millisecond and microsecond precision

  • Week starts on Monday by default

  • Compatible with Spark SQL
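The differing week-start defaults are the most common source of cross-platform discrepancies. This Python sketch (illustrative only, not engine code) shows how the same Friday truncates to two different week starts:

```python
from datetime import datetime, timedelta

def trunc_week_monday(ts: datetime) -> datetime:
    # Snowflake/Databricks default: the week begins on Monday (weekday() == 0)
    start = ts - timedelta(days=ts.weekday())
    return start.replace(hour=0, minute=0, second=0, microsecond=0)

def trunc_week_sunday(ts: datetime) -> datetime:
    # BigQuery default: the week begins on Sunday
    start = ts - timedelta(days=(ts.weekday() + 1) % 7)
    return start.replace(hour=0, minute=0, second=0, microsecond=0)

ts = datetime(2024, 3, 15, 14, 35, 42)   # a Friday
print(trunc_week_monday(ts).date())  # 2024-03-11 (Monday)
print(trunc_week_sunday(ts).date())  # 2024-03-10 (Sunday)
```

The same query run unchanged on BigQuery and Snowflake can therefore place a login in two different weeks unless the week start is pinned explicitly (e.g. `WEEK(MONDAY)` in BigQuery).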

Example 1: Group Sales by Month

Snowflake:

SELECT 
    DATE_TRUNC('MONTH', order_date) as month,
    COUNT(*) as order_count,
    SUM(order_total) as monthly_revenue
FROM orders
GROUP BY DATE_TRUNC('MONTH', order_date)
ORDER BY month

BigQuery:

SELECT 
    DATE_TRUNC(order_date, MONTH) as month,
    COUNT(*) as order_count,
    SUM(order_total) as monthly_revenue
FROM orders
GROUP BY DATE_TRUNC(order_date, MONTH)
ORDER BY month

Databricks:

SELECT 
    DATE_TRUNC('MONTH', order_date) as month,
    COUNT(*) as order_count,
    SUM(order_total) as monthly_revenue
FROM orders
GROUP BY DATE_TRUNC('MONTH', order_date)
ORDER BY month

Sample Data (orders table):

| order_id | order_date | order_total |
|----------|------------|-------------|
| 101      | 2024-01-05 | 250.00      |
| 102      | 2024-01-15 | 180.00      |
| 103      | 2024-01-28 | 320.00      |
| 104      | 2024-02-03 | 450.00      |
| 105      | 2024-02-14 | 290.00      |
| 106      | 2024-03-07 | 510.00      |
| 107      | 2024-03-22 | 380.00      |

Result (all platforms):

| month      | order_count | monthly_revenue |
|------------|-------------|-----------------|
| 2024-01-01 | 3           | 750.00          |
| 2024-02-01 | 2           | 740.00          |
| 2024-03-01 | 2           | 890.00          |

Example 2: Multiple Truncation Levels

Snowflake/Databricks:

SELECT 
    event_timestamp,
    DATE_TRUNC('YEAR', event_timestamp) as year_start,
    DATE_TRUNC('QUARTER', event_timestamp) as quarter_start,
    DATE_TRUNC('MONTH', event_timestamp) as month_start,
    DATE_TRUNC('WEEK', event_timestamp) as week_start,
    DATE_TRUNC('DAY', event_timestamp) as day_start,
    DATE_TRUNC('HOUR', event_timestamp) as hour_start
FROM events
LIMIT 1

BigQuery:

SELECT 
    event_timestamp,
    TIMESTAMP_TRUNC(event_timestamp, YEAR) as year_start,
    TIMESTAMP_TRUNC(event_timestamp, QUARTER) as quarter_start,
    TIMESTAMP_TRUNC(event_timestamp, MONTH) as month_start,
    TIMESTAMP_TRUNC(event_timestamp, WEEK) as week_start,
    TIMESTAMP_TRUNC(event_timestamp, DAY) as day_start,
    TIMESTAMP_TRUNC(event_timestamp, HOUR) as hour_start
FROM events
LIMIT 1

Sample Input:

event_timestamp: 2024-03-15 14:35:42

Result:

| Truncation    | Result                        |
|---------------|-------------------------------|
| year_start    | 2024-01-01 00:00:00           |
| quarter_start | 2024-01-01 00:00:00           |
| month_start   | 2024-03-01 00:00:00           |
| week_start    | 2024-03-11 00:00:00 (Monday)  |
| day_start     | 2024-03-15 00:00:00           |
| hour_start    | 2024-03-15 14:00:00           |

Example 3: Active Users by Week

All Platforms (Snowflake/Databricks syntax):

SELECT 
    DATE_TRUNC('WEEK', login_timestamp) as week_start,
    COUNT(DISTINCT user_id) as weekly_active_users,
    COUNT(*) as total_logins
FROM user_logins
WHERE login_timestamp >= '2024-01-01'
GROUP BY DATE_TRUNC('WEEK', login_timestamp)
ORDER BY week_start

Sample Data (user_logins table):

| user_id | login_timestamp     |
|---------|---------------------|
| 101     | 2024-01-02 09:15:00 |
| 102     | 2024-01-02 10:30:00 |
| 101     | 2024-01-03 08:45:00 |
| 103     | 2024-01-05 14:20:00 |
| 101     | 2024-01-08 09:00:00 |
| 102     | 2024-01-10 11:15:00 |
| 104     | 2024-01-10 13:30:00 |

Result:

| week_start | weekly_active_users | total_logins |
|------------|---------------------|--------------|
| 2024-01-01 | 3                   | 4            |
| 2024-01-08 | 3                   | 3            |
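The weekly rollup can be cross-checked outside SQL. This Python sketch (illustrative only) applies the same Monday-based WEEK truncation to the sample rows and reproduces the counts above:

```python
from collections import defaultdict
from datetime import datetime, timedelta

# (user_id, login_timestamp) rows from the sample user_logins table
logins = [
    (101, datetime(2024, 1, 2, 9, 15)), (102, datetime(2024, 1, 2, 10, 30)),
    (101, datetime(2024, 1, 3, 8, 45)), (103, datetime(2024, 1, 5, 14, 20)),
    (101, datetime(2024, 1, 8, 9, 0)),  (102, datetime(2024, 1, 10, 11, 15)),
    (104, datetime(2024, 1, 10, 13, 30)),
]

weeks = defaultdict(list)
for user_id, ts in logins:
    week_start = (ts - timedelta(days=ts.weekday())).date()  # Monday truncation
    weeks[week_start].append(user_id)

for week_start in sorted(weeks):
    users = weeks[week_start]
    # distinct users = weekly_active_users, row count = total_logins
    print(week_start, len(set(users)), len(users))
# 2024-01-01 3 4
# 2024-01-08 3 3
```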

Example 4: Hourly Event Distribution

Snowflake/Databricks:

SELECT 
    DATE_TRUNC('HOUR', event_time) as hour_bucket,
    event_type,
    COUNT(*) as event_count,
    COUNT(DISTINCT user_id) as unique_users
FROM events
WHERE DATE(event_time) = '2024-03-15'
GROUP BY DATE_TRUNC('HOUR', event_time), event_type
ORDER BY hour_bucket, event_type

Sample Data (events table for 2024-03-15):

| event_time          | event_type | user_id |
|---------------------|------------|---------|
| 2024-03-15 09:15:22 | page_view  | 101     |
| 2024-03-15 09:23:45 | click      | 101     |
| 2024-03-15 09:45:10 | page_view  | 102     |
| 2024-03-15 10:05:33 | click      | 103     |
| 2024-03-15 10:30:18 | purchase   | 101     |
| 2024-03-15 10:55:42 | page_view  | 102     |

Result:

| hour_bucket         | event_type | event_count | unique_users |
|---------------------|------------|-------------|--------------|
| 2024-03-15 09:00:00 | click      | 1           | 1            |
| 2024-03-15 09:00:00 | page_view  | 2           | 2            |
| 2024-03-15 10:00:00 | click      | 1           | 1            |
| 2024-03-15 10:00:00 | page_view  | 1           | 1            |
| 2024-03-15 10:00:00 | purchase   | 1           | 1            |

Truncation Units Reference

| Unit    | Truncates To         | Example: 2024-03-15 14:35:42 |
|---------|----------------------|------------------------------|
| YEAR    | January 1            | 2024-01-01 00:00:00          |
| QUARTER | First day of quarter | 2024-01-01 00:00:00          |
| MONTH   | First day of month   | 2024-03-01 00:00:00          |
| WEEK    | Monday of week       | 2024-03-11 00:00:00          |
| DAY     | Midnight             | 2024-03-15 00:00:00          |
| HOUR    | Start of hour        | 2024-03-15 14:00:00          |
| MINUTE  | Start of minute      | 2024-03-15 14:35:00          |
| SECOND  | Start of second      | 2024-03-15 14:35:42          |
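Each truncation level in the reference table can be checked with a short Python sketch (illustrative helpers assuming a Monday week start, not engine code):

```python
from datetime import datetime, timedelta

ts = datetime(2024, 3, 15, 14, 35, 42)

trunc = {
    "YEAR":    ts.replace(month=1, day=1, hour=0, minute=0, second=0),
    # map month 1-3 -> 1, 4-6 -> 4, 7-9 -> 7, 10-12 -> 10
    "QUARTER": ts.replace(month=((ts.month - 1) // 3) * 3 + 1,
                          day=1, hour=0, minute=0, second=0),
    "MONTH":   ts.replace(day=1, hour=0, minute=0, second=0),
    # weekday() == 0 on Monday, so subtracting it lands on the week start
    "WEEK":    (ts - timedelta(days=ts.weekday())).replace(hour=0, minute=0, second=0),
    "DAY":     ts.replace(hour=0, minute=0, second=0),
    "HOUR":    ts.replace(minute=0, second=0),
    "MINUTE":  ts.replace(second=0),
    "SECOND":  ts,
}
for unit, value in trunc.items():
    print(f"{unit:8} {value}")
```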

Common Use Cases

  • Time-series aggregation: Group sales by day/week/month

  • Cohort analysis: Group users by signup month

  • Report periods: Define reporting boundaries

  • Date comparisons: Compare same period across different times

  • Data partitioning: Organize data by time buckets

  • Deduplication: Keep one record per time period
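The deduplication use case reduces to keeping the first row per (key, truncated period). A minimal Python sketch, using hypothetical sample data and DAY truncation:

```python
from datetime import datetime

# hypothetical (user_id, timestamp) events with a same-day duplicate
events = [
    (1, datetime(2024, 3, 15, 9, 0)),
    (1, datetime(2024, 3, 15, 17, 30)),  # same user, same day -> dropped
    (1, datetime(2024, 3, 16, 8, 0)),
]

seen = set()
deduped = []
for user_id, ts in events:
    key = (user_id, ts.date())  # truncate to DAY to define the period
    if key not in seen:
        seen.add(key)
        deduped.append((user_id, ts))

print(len(deduped))  # 2
```

In SQL the same effect is usually achieved with `ROW_NUMBER()` partitioned by the DATE_TRUNC expression, keeping row number 1.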

Best Practices

  1. Use in GROUP BY: Essential for time-based aggregations

  2. Index truncated columns: Consider materialized columns for frequent queries

  3. Consistent units: Use same truncation level throughout query

  4. Time zones: Be aware of time zone handling in BigQuery

  5. Week start day: Know your platform's default (Monday vs Sunday)

  6. Combine with date functions: Often used with DATE_ADD, DATEDIFF

Related Functions

  • DATEADD / DATE_ADD - Add interval to date

  • DATEDIFF / DATE_DIFF - Calculate difference between dates

  • EXTRACT - Get specific date part

  • DATE_PART - Similar to EXTRACT

  • LAST_DAY - Get last day of month


Copyright © 2026 Paradime Labs, Inc.

Made with ❤️ in San Francisco ・ London

*dbt® and dbt Core® are federally registered trademarks of dbt Labs, Inc. in the United States and various jurisdictions around the world. Paradime is not a partner of dbt Labs. All rights therein are reserved to dbt Labs. Paradime is not a product or service of or endorsed by dbt Labs, Inc.
