SQL Keywords

CREATE TABLE / ALTER TABLE / DROP TABLE

Feb 23, 2026 · 5 min read

Category: Data Definition Language (DDL)

Platform Support: ✅ Snowflake | ✅ BigQuery | ✅ Databricks

Description

CREATE TABLE defines a new table structure, ALTER TABLE modifies an existing table, and DROP TABLE removes a table. These DDL operations define and manage the schema and structure of database objects.

CREATE TABLE Syntax

CREATE TABLE table_name (
    column1 datatype [constraints],
    column2 datatype [constraints],
    ...
    [table_constraints]
);

-- Create from query
CREATE TABLE new_table AS
SELECT columns
FROM existing_table;

ALTER TABLE Syntax

-- Add column
ALTER TABLE table_name
ADD COLUMN column_name datatype;

-- Drop column
ALTER TABLE table_name
DROP COLUMN column_name;

-- Rename column
ALTER TABLE table_name
RENAME COLUMN old_name TO new_name;

-- Modify column
ALTER TABLE table_name
MODIFY COLUMN column_name datatype;

DROP TABLE Syntax

-- Drop table
DROP TABLE table_name;

-- Drop if exists (safer)
DROP TABLE IF EXISTS table_name;

Platform-Specific Notes

Snowflake:

  • Supports transient and temporary tables

  • Time Travel available (can undrop tables)

  • Supports clustering keys

  • Schema evolution with ALTER

  • CLONE for zero-copy cloning
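
As a sketch, zero-copy cloning snapshots a table instantly without duplicating storage, and combines with Time Travel to clone a past state (table names here are illustrative):

```sql
-- Clone a table without copying its underlying storage
CREATE TABLE customers_backup CLONE customers;

-- Clone the table as it existed 24 hours ago (Time Travel)
CREATE TABLE customers_yesterday CLONE customers
    AT (OFFSET => -86400);  -- offset in seconds
```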

BigQuery:

  • Automatically managed storage

  • Partitioned and clustered tables

  • External tables for data in GCS

  • Table expiration settings

  • Schema changes are atomic
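
For external tables over GCS, a minimal sketch looks like the following (dataset name and bucket path are hypothetical):

```sql
CREATE EXTERNAL TABLE staging.raw_events
OPTIONS (
    format = 'PARQUET',
    uris = ['gs://my-bucket/events/*.parquet']  -- hypothetical bucket
);
```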

Databricks:

  • Delta Lake tables recommended

  • Supports managed and external tables

  • Schema evolution with MERGE

  • Liquid clustering

  • Unity Catalog for governance
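
For example, Delta Lake can evolve a target table's schema during a MERGE when automatic schema merging is enabled for the session (table names are illustrative):

```sql
-- Allow MERGE to add columns present in the source but missing from the target
SET spark.databricks.delta.schema.autoMerge.enabled = true;

MERGE INTO sales_data AS target
USING sales_updates AS source
ON target.sale_id = source.sale_id
WHEN MATCHED THEN UPDATE SET *
WHEN NOT MATCHED THEN INSERT *;
```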

Example 1: CREATE TABLE - Basic

All Platforms:

CREATE TABLE customers (
    customer_id INT PRIMARY KEY,
    first_name VARCHAR(50) NOT NULL,
    last_name VARCHAR(50) NOT NULL,
    email VARCHAR(100) UNIQUE,
    phone VARCHAR(20),
    signup_date DATE DEFAULT CURRENT_DATE,
    status VARCHAR(20) DEFAULT 'Active',
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP
)


Result: New table created with specified schema

Example 2: CREATE TABLE with Constraints

All Platforms:

CREATE TABLE orders (
    order_id INT IDENTITY(1,1) PRIMARY KEY,  -- Auto-increment
    customer_id INT NOT NULL,
    order_date DATE NOT NULL DEFAULT CURRENT_DATE,
    total DECIMAL(10,2) NOT NULL CHECK (total >= 0),
    status VARCHAR(20) DEFAULT 'Pending' CHECK (status IN ('Pending', 'Shipped', 'Delivered', 'Cancelled')),
    shipping_address VARCHAR(200),
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    
    -- Foreign key constraint
    CONSTRAINT fk_customer
        FOREIGN KEY (customer_id)
        REFERENCES customers(customer_id)
        ON DELETE CASCADE,
    
    -- Check constraint
    CONSTRAINT check_order_date
        CHECK (order_date >= '2020-01-01')
)


Example 3: CREATE TABLE AS SELECT (CTAS)

All Platforms:

-- Create table from query results
CREATE TABLE high_value_customers AS
SELECT 
    c.customer_id,
    c.first_name,
    c.last_name,
    c.email,
    COUNT(o.order_id) as total_orders,
    SUM(o.total) as lifetime_value,
    MAX(o.order_date) as last_order_date
FROM customers c
INNER JOIN orders o ON c.customer_id = o.customer_id
GROUP BY c.customer_id, c.first_name, c.last_name, c.email
HAVING SUM(o.total) > 1000


Result: New table with data from SELECT query

Example 4: Platform-Specific CREATE TABLE

Snowflake - Clustered Table:

CREATE TABLE sales_data (
    sale_id INT,
    sale_date DATE,
    product_id INT,
    quantity INT,
    revenue DECIMAL(10,2)
)
CLUSTER BY (sale_date)


BigQuery - Partitioned and Clustered:

CREATE TABLE sales_data (
    sale_id INT64,
    sale_date DATE,
    product_id INT64,
    quantity INT64,
    revenue NUMERIC(10,2)
)
PARTITION BY sale_date
CLUSTER BY product_id;

Databricks - Delta Table:

CREATE TABLE sales_data (
    sale_id INT,
    sale_date DATE,
    product_id INT,
    quantity INT,
    revenue DECIMAL(10,2)
)
USING DELTA
PARTITIONED BY (sale_date)


Example 5: ALTER TABLE - Add Columns

All Platforms:

-- Add single column
ALTER TABLE customers
ADD COLUMN loyalty_points INT DEFAULT 0;

-- Add multiple columns (exact syntax varies: Snowflake accepts a comma list,
-- Databricks uses ADD COLUMNS (...), BigQuery repeats ADD COLUMN per column)
ALTER TABLE customers
ADD COLUMN preferred_contact VARCHAR(20) DEFAULT 'email',
    marketing_opt_in BOOLEAN DEFAULT FALSE,
    last_login TIMESTAMP;

customers table BEFORE:

customer_id | first_name | last_name | email
----------- | ---------- | --------- | --------------
1           | John       | Smith     | john@email.com

customers table AFTER:

customer_id | first_name | last_name | email          | loyalty_points | preferred_contact | marketing_opt_in | last_login
----------- | ---------- | --------- | -------------- | -------------- | ----------------- | ---------------- | ----------
1           | John       | Smith     | john@email.com | 0              | email             | FALSE            | NULL

Example 6: ALTER TABLE - Modify Columns

Snowflake:

-- Rename column
ALTER TABLE products
RENAME COLUMN product_desc TO description;

-- Change data type
ALTER TABLE products
MODIFY COLUMN price DECIMAL(12,2);

-- Drop column
ALTER TABLE products
DROP COLUMN old_column;

BigQuery:

-- Add column
ALTER TABLE products
ADD COLUMN IF NOT EXISTS description STRING;

-- Drop column
ALTER TABLE products
DROP COLUMN IF EXISTS old_column;

-- Rename column
ALTER TABLE products
RENAME COLUMN product_desc TO description;

Databricks:

-- Rename column
ALTER TABLE products
RENAME COLUMN product_desc TO description;

-- Change data type (Delta supports limited type changes, e.g. widening)
ALTER TABLE products
ALTER COLUMN price TYPE DECIMAL(12,2);

-- Drop column (requires Delta column mapping to be enabled)
ALTER TABLE products
DROP COLUMN old_column;

Example 7: DROP TABLE

All Platforms:

-- Drop table (error if it doesn't exist)
DROP TABLE temp_analysis;

-- Drop table if exists (safer)
DROP TABLE IF EXISTS temp_analysis;

-- Snowflake, BigQuery, and Databricks drop one table per statement
DROP TABLE IF EXISTS temp_table1;
DROP TABLE IF EXISTS temp_table2;

Snowflake - Undrop:

-- Snowflake can recover dropped tables within the Time Travel retention period
UNDROP TABLE temp_analysis;

Example 8: Temporary and Transient Tables

Snowflake:

-- Temporary table (session-only)
CREATE TEMPORARY TABLE session_data (
    session_id VARCHAR(100),
    user_id INT,
    event_data VARIANT
);

-- Transient table (no fail-safe, lower cost)
CREATE TRANSIENT TABLE staging_data (
    id INT,
    data VARIANT,
    loaded_at TIMESTAMP
)


BigQuery:

-- Temporary table
CREATE TEMP TABLE session_data AS
SELECT * FROM events WHERE session_id = @session_id;

-- Table with expiration
CREATE TABLE staging_data (
    id INT64,
    data STRING
)
OPTIONS(
    expiration_timestamp=TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
)


Databricks:

-- Temporary view
CREATE TEMPORARY VIEW session_data AS
SELECT * FROM events WHERE session_id = current_session_id();

-- Global temporary view
CREATE GLOBAL TEMPORARY VIEW shared_staging AS
SELECT * FROM events;

Data Types by Platform

Category  | Snowflake            | BigQuery         | Databricks
--------- | -------------------- | ---------------- | ---------------
Integer   | INT, BIGINT          | INT64            | INT, BIGINT
Decimal   | DECIMAL(p,s), NUMBER | NUMERIC, DECIMAL | DECIMAL(p,s)
String    | VARCHAR, STRING      | STRING           | STRING, VARCHAR
Boolean   | BOOLEAN              | BOOL             | BOOLEAN
Date      | DATE                 | DATE             | DATE
Timestamp | TIMESTAMP            | TIMESTAMP        | TIMESTAMP
JSON      | VARIANT, OBJECT      | JSON             | STRING
Array     | ARRAY                | ARRAY            | ARRAY

Common Constraints

Constraint  | Purpose               | Example
----------- | --------------------- | -----------------------------------------------------------
PRIMARY KEY | Unique identifier     | customer_id INT PRIMARY KEY
FOREIGN KEY | Referential integrity | FOREIGN KEY (customer_id) REFERENCES customers(customer_id)
UNIQUE      | No duplicates         | email VARCHAR(100) UNIQUE
NOT NULL    | Required value        | name VARCHAR(50) NOT NULL
CHECK       | Value validation      | CHECK (age >= 18)
DEFAULT     | Default value         | status VARCHAR(20) DEFAULT 'Active'

Note: these cloud warehouses generally treat PRIMARY KEY, FOREIGN KEY, and UNIQUE as informational metadata rather than enforced constraints; NOT NULL is enforced.

Common Use Cases

  1. CREATE TABLE: Define new data structures, staging tables

  2. ALTER TABLE: Add columns, schema evolution, rename

  3. DROP TABLE: Clean up temporary tables, remove obsolete data

  4. CTAS: Create summary/aggregation tables, backups

Best Practices

  1. Use appropriate data types: Choose smallest type that fits data

  2. Add constraints: Ensure data quality at schema level

  3. Document tables: Add comments describing purpose

  4. Use IF EXISTS: Safer DROP operations

  5. Plan for growth: Consider partitioning for large tables

  6. Version control DDL: Track schema changes

  7. Test in dev first: Validate changes before production

  8. Backup before ALTER: Safety first for production tables

  9. Use descriptive names: Clear table and column names

  10. Match physical design to queries: These platforms use clustering and partitioning in place of traditional indexes
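
Practice 3 (documenting tables) can be done directly in DDL. As a sketch in Snowflake syntax (Databricks accepts a similar COMMENT clause; column names are illustrative):

```sql
CREATE TABLE customers (
    customer_id INT COMMENT 'Surrogate key',
    email VARCHAR(100) COMMENT 'Primary contact address'
)
COMMENT = 'One row per registered customer';
```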

Interested to Learn More?
Try Out the Free 14-Day Trial


Copyright © 2026 Paradime Labs, Inc.

Made with ❤️ in San Francisco ・ London

*dbt® and dbt Core® are federally registered trademarks of dbt Labs, Inc. in the United States and various jurisdictions around the world. Paradime is not a partner of dbt Labs. All rights therein are reserved to dbt Labs. Paradime is not a product or service of or endorsed by dbt Labs, Inc.
