# Snowflake

## Overview

Birdie integrates with Snowflake using read-only access. Two integration models are supported:

* **Reader Account (Snowflake Data Sharing)** — recommended for shared-data environments
* **Direct OAuth Connection** — recommended when Birdie connects directly to your main Snowflake account

Birdie can import data from Snowflake for analytics, enrichment, and reporting purposes, including:

* Customers and accounts
* Operational and reference tables
* Feedback-related datasets exposed by the client

Because each Snowflake installation is unique, Birdie's team will work with your IT administrators to finalize details. This article explains how your Snowflake admin can generate and provide the credentials needed to enable the integration.

***

## Schema requirements

All Birdie database connectors (Snowflake, Databricks, BigQuery, etc.) require one table or view per feedback type, following the Birdie schema definition.

Examples of feedback types:

* `nps`
* `csat`
* `review`
* `social_media_post`
* `support_ticket`

Each feedback type must exist as a table or a view.

Birdie provides the detailed schema reference separately (same model as the [S3 schema documentation](https://ask.birdie.ai/integrations-and-data-ingestion/how-to-integrate-with.../s3-azure-gcs)).
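If feedback of several types lives in a single source table, one common pattern is to expose each feedback type as its own view. The sketch below is illustrative only; `FEEDBACK_RAW` and its columns are placeholder names, and the actual required columns come from the Birdie schema reference:

{% code title="Example: one view per feedback type" %}

```sql
-- Hypothetical source table FEEDBACK_RAW with a FEEDBACK_TYPE discriminator.
-- Follow the Birdie schema reference for the actual required columns.
CREATE VIEW ANALYTICS.PUBLIC.NPS_FEEDBACK AS
SELECT *
FROM ANALYTICS.PUBLIC.FEEDBACK_RAW
WHERE FEEDBACK_TYPE = 'nps';

CREATE VIEW ANALYTICS.PUBLIC.CSAT_FEEDBACK AS
SELECT *
FROM ANALYTICS.PUBLIC.FEEDBACK_RAW
WHERE FEEDBACK_TYPE = 'csat';
```

{% endcode %}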

***

### Incremental Partition Column

Birdie uses a partition column to determine which rows to ingest on each run. Typical queries look like:

```sql
SELECT *
FROM <DATABASE>.<SCHEMA>.<TABLE>
WHERE DATE(<partition_column>) BETWEEN :start AND :end;
```

#### Requirements

The partition column **must reflect when a row was added or last modified in the table**, not when the original event occurred. For example:

* `LOAD_TIMESTAMP` or `UPDATED_AT` (ETL/pipeline timestamp) — **recommended**
* `INSERTED_AT` (row creation timestamp) — acceptable if the table is append-only

#### Why `POSTED_AT` is not sufficient

Using a business event timestamp like `POSTED_AT` can cause **data gaps**:

1. **Historical updates missed**: If a past row is updated (e.g., a ticket status changes), its `POSTED_AT` remains in the past and Birdie will not re-ingest it.
2. **Pipeline timing gaps**: If your ETL runs at specific times (e.g., every 6 hours), rows loaded between Birdie's query window and the ETL run may not appear until the next cycle. An `UPDATED_AT` or `LOAD_TIMESTAMP` column tied to the ETL timestamp ensures these rows are captured.

#### Recommendation

Add a `LOAD_TIMESTAMP` or `UPDATED_AT` column to your table that is set to `CURRENT_TIMESTAMP()` whenever a row is inserted or modified. Use this column as the `partition_column` when configuring the Birdie integration.

If your table is append-only and never updated, `POSTED_AT` is acceptable as the partition column.
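As an illustrative sketch (table and column names here are placeholders, not part of the Birdie schema), the column can be set on insert via a default and refreshed explicitly on updates:

{% code title="Example: maintaining LOAD_TIMESTAMP" %}

```sql
-- Placeholder table: LOAD_TIMESTAMP defaults to the insert time.
CREATE TABLE MY_DB.MY_SCHEMA.NPS_FEEDBACK (
  TICKET_ID      VARCHAR,
  SCORE          NUMBER,
  POSTED_AT      TIMESTAMP_NTZ,
  LOAD_TIMESTAMP TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
);

-- When modifying an existing row, refresh the timestamp so that
-- Birdie's incremental window picks the row up again.
UPDATE MY_DB.MY_SCHEMA.NPS_FEEDBACK
SET SCORE = 9,
    LOAD_TIMESTAMP = CURRENT_TIMESTAMP()
WHERE TICKET_ID = '12345';
```

{% endcode %}

If your pipeline uses `MERGE` or a dedicated ETL timestamp, setting `LOAD_TIMESTAMP` in the same statement achieves the same effect.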

***

## Reader Account method

### Overview

Birdie connects to a Snowflake Reader Account created via Snowflake Data Sharing. Birdie runs `SELECT` queries against databases instantiated from shared objects in that reader account.

This model is common when:

* Your organization already uses provider-to-consumer data sharing
* You want strict isolation between your production Snowflake account and external consumers such as Birdie

#### Requirements

Before starting:

* Your account supports provider [data sharing](https://docs.snowflake.com/en/user-guide/data-share-intro)
* You have `ACCOUNTADMIN` (or equivalent) access on the reader account
* The provider account already exposes the necessary shares with the data Birdie will use

### Setup in Snowflake

{% stepper %}
{% step %}

### Log into the Reader Account

Use SnowSQL, the Snowflake Web UI, or another supported client:

{% code title="Set a role" %}

```sql
USE ROLE ACCOUNTADMIN;
```

{% endcode %}
{% endstep %}

{% step %}

### (Optional) Create a dedicated role for Birdie

{% code title="Create a dedicated role" %}

```sql
CREATE ROLE BIRDIE_READ;
```

{% endcode %}
{% endstep %}

{% step %}

### Create the Birdie user inside the reader account

{% code title="Create user" %}

```sql
CREATE USER BIRDIE_USER
  PASSWORD = '<STRONG_PASSWORD>'
  DEFAULT_ROLE = BIRDIE_READ
  MUST_CHANGE_PASSWORD = FALSE;
```

{% endcode %}

{% code title="Grant user" %}

```sql
GRANT ROLE BIRDIE_READ TO USER BIRDIE_USER;
```

{% endcode %}
{% endstep %}

{% step %}

### (Optional) Create a resource monitor

{% code title="Create monitor" %}

```sql
CREATE RESOURCE MONITOR BIRDIE_MONITOR WITH CREDIT_QUOTA = 1000;
ALTER WAREHOUSE BIRDIE_WH SET RESOURCE_MONITOR = BIRDIE_MONITOR;
```

{% endcode %}

You can adapt the name and quota as needed. Note that the warehouse `BIRDIE_WH` is created in the next step; run the `ALTER WAREHOUSE` statement after the warehouse exists.
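Resource monitors also support triggers. A variant that notifies at 90% of the quota and suspends the warehouse at 100% could look like this (quota and thresholds are examples, adjust to your budget):

{% code title="Monitor with triggers" %}

```sql
CREATE RESOURCE MONITOR BIRDIE_MONITOR WITH
  CREDIT_QUOTA = 1000
  TRIGGERS
    ON 90 PERCENT DO NOTIFY
    ON 100 PERCENT DO SUSPEND;
```

{% endcode %}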
{% endstep %}

{% step %}

### Create a virtual warehouse in the reader account

{% code title="Create warehouse" %}

```sql
CREATE WAREHOUSE BIRDIE_WH
  WAREHOUSE_SIZE = SMALL
  AUTO_SUSPEND = 60
  AUTO_RESUME = TRUE;
```

{% endcode %}
{% endstep %}

{% step %}

### Instantiate databases from provider shares

{% code title="Create shares" %}

```sql
CREATE DATABASE SHARED_DB_1 FROM SHARE <PROVIDER_ACCOUNT>.<SHARE_NAME_1>;
CREATE DATABASE SHARED_DB_2 FROM SHARE <PROVIDER_ACCOUNT>.<SHARE_NAME_2>;
```

{% endcode %}
{% endstep %}

{% step %}

### Grant privileges to the Birdie role

Warehouse usage:

{% code title="Grant role" %}

```sql
GRANT USAGE ON WAREHOUSE BIRDIE_WH TO ROLE BIRDIE_READ;
```

{% endcode %}

Imported privileges on shared databases:

{% code title="Grant role" %}

```sql
GRANT IMPORTED PRIVILEGES ON DATABASE SHARED_DB_1 TO ROLE BIRDIE_READ;
GRANT IMPORTED PRIVILEGES ON DATABASE SHARED_DB_2 TO ROLE BIRDIE_READ;
```

{% endcode %}

Depending on how the share was configured, you may instead need to grant a shared database role to `BIRDIE_READ`. See: [Snowflake GRANT ROLE](https://docs.snowflake.com/en/sql-reference/sql/grant-role)
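In that case, the grant would look something like the following, where `<SHARED_DB_ROLE>` is a placeholder for whatever database role your provider included in the share:

{% code title="Grant a shared database role" %}

```sql
-- <SHARED_DB_ROLE> is defined by the provider, not by Birdie.
GRANT DATABASE ROLE SHARED_DB_1.<SHARED_DB_ROLE> TO ROLE BIRDIE_READ;
```

{% endcode %}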
{% endstep %}
{% endstepper %}

## Integration via Direct OAuth Connection

### Overview

In this model, Birdie connects directly to your main Snowflake account using OAuth instead of static passwords.

This is recommended when:

* You prefer SSO or OAuth-based access
* You do not use Snowflake Data Sharing
* You want Birdie to access native production schemas directly

### Requirements

* `ACCOUNTADMIN` (or equivalent) privileges in the main Snowflake account
* Access to the Snowflake web console or CLI
* The Snowflake connector has been enabled on the Birdie side

### Setup in Snowflake

{% stepper %}
{% step %}

### Create the OAuth Security Integration

{% code title="Create an integration" %}

```sql
CREATE SECURITY INTEGRATION BIRDIE_OAUTH_INTEGRATION
   TYPE = OAUTH
   ENABLED = TRUE
   OAUTH_CLIENT = CUSTOM
   OAUTH_CLIENT_TYPE = 'CONFIDENTIAL'
   OAUTH_REDIRECT_URI = 'https://api.birdie.ai/oauth/callback'
   OAUTH_ISSUE_REFRESH_TOKENS = TRUE
   OAUTH_REFRESH_TOKEN_VALIDITY = 7776000;
```

{% endcode %}

{% code title="Verify" %}

```sql
DESC SECURITY INTEGRATION BIRDIE_OAUTH_INTEGRATION;
```

{% endcode %}
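To retrieve the OAuth client ID and client secret that Birdie needs for this integration, Snowflake provides a system function:

{% code title="Retrieve client credentials" %}

```sql
SELECT SYSTEM$SHOW_OAUTH_CLIENT_SECRETS('BIRDIE_OAUTH_INTEGRATION');
```

{% endcode %}

The result includes the client ID and secret as a JSON string; share them with Birdie only through a secure channel.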
{% endstep %}

{% step %}

### Create a role and user for Birdie

{% code title="Create role" %}

```sql
CREATE ROLE BIRDIE_READ;
```

{% endcode %}

{% code title="Grant role" %}

```sql
GRANT USAGE ON WAREHOUSE <WAREHOUSE_NAME> TO ROLE BIRDIE_READ;
GRANT USAGE ON DATABASE <DATABASE_NAME> TO ROLE BIRDIE_READ;
GRANT USAGE ON SCHEMA <SCHEMA_NAME> TO ROLE BIRDIE_READ;
GRANT SELECT ON ALL TABLES IN SCHEMA <SCHEMA_NAME> TO ROLE BIRDIE_READ;
```

{% endcode %}
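If new tables will be added to the schema over time, you can optionally also grant on future tables so the Birdie role keeps read access without re-running grants:

{% code title="Optional future grants" %}

```sql
GRANT SELECT ON FUTURE TABLES IN SCHEMA <SCHEMA_NAME> TO ROLE BIRDIE_READ;
```

{% endcode %}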

{% code title="Create user" %}

```sql
CREATE USER BIRDIE_INTEGRATION_USER
  PASSWORD = '<SECURE_PASSWORD>'
  DEFAULT_ROLE = BIRDIE_READ
  MUST_CHANGE_PASSWORD = FALSE;
```

{% endcode %}

{% code title="Grant role" %}

```sql
GRANT ROLE BIRDIE_READ TO USER BIRDIE_INTEGRATION_USER;
```

{% endcode %}
{% endstep %}

{% step %}

### Generate and test an OAuth token

{% code title="Example using SnowSQL" %}

```shellscript
snowsql \
  -a <ACCOUNT_IDENTIFIER> \
  -u BIRDIE_INTEGRATION_USER \
  -o authenticator=oauth \
  -o oauthAccessToken="<ACCESS_TOKEN>" \
  -r BIRDIE_READ \
  -w <WAREHOUSE_NAME> \
  -d <DATABASE_NAME> \
  -s <SCHEMA_NAME>

{% endcode %}
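Once connected, a quick sanity check confirms the session is using the expected role and can read the target table (names below are placeholders):

{% code title="Verify the session" %}

```sql
SELECT CURRENT_ROLE(), CURRENT_WAREHOUSE();
SELECT COUNT(*) FROM <DATABASE_NAME>.<SCHEMA_NAME>.<TABLE_NAME>;
```

{% endcode %}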
{% endstep %}
{% endstepper %}

***

### Share Connection Details with Birdie

To configure the integration, securely provide Birdie with the following information.

**For Reader Account (Password auth):**

```json
{
  "connection_details": {
    "account": "your-account-identifier",
    "warehouse": "BIRDIE_WH",
    "role": "BIRDIE_READ",
    "auth": {
      "method": "password",
      "username": "BIRDIE_USER",
      "password": "your-secure-password"
    }
  },
  "data_source": {
    "database": "SHARED_DB_1",
    "schema": "PUBLIC",
    "table": "NPS_FEEDBACK",
    "partition_column": "LOAD_TIMESTAMP"
  },
  "metadata": {
    "data_kind": "Support Ticket, NPS, Account"
  }
}
```

**For Direct OAuth:**

```json
{
  "connection_details": {
    "account": "your-account-identifier",
    "warehouse": "BIRDIE_WH",
    "role": "BIRDIE_READ",
    "auth": {
      "method": "oauth",
      "username": "BIRDIE_INTEGRATION_USER",
      "password": "your-secure-password",
      "oauth_client_id": "your-oauth-client-id",
      "oauth_client_secret": "your-oauth-client-secret"
    }
  },
  "data_source": {
    "database": "MARKETING_DB",
    "schema": "PUBLIC",
    "table": "NPS_FEEDBACK",
    "partition_column": "LOAD_TIMESTAMP"
  },
  "metadata": {
    "data_kind": "Support Ticket, NPS, Account"
  }
}
```

<table><thead><tr><th width="213">Section</th><th>Field</th><th width="131">Required</th><th>Description</th></tr></thead><tbody><tr><td>connection_details</td><td>account</td><td>Yes</td><td>Snowflake account identifier, e.g. <code>abc12345.us-east-1</code>.</td></tr><tr><td>connection_details</td><td>warehouse</td><td>Yes</td><td>Snowflake warehouse name, e.g. <code>BIRDIE_WH</code>.</td></tr><tr><td>connection_details</td><td>role</td><td>Yes</td><td>Snowflake role, e.g. <code>BIRDIE_READ</code>.</td></tr><tr><td>connection_details.auth</td><td>method</td><td>Yes</td><td>Authentication mode: <code>password</code> (default) or <code>oauth</code>.</td></tr><tr><td>connection_details.auth</td><td>username</td><td>Yes</td><td>Snowflake username, e.g. <code>BIRDIE_USER</code>.</td></tr><tr><td>connection_details.auth</td><td>password</td><td>Password</td><td>Password for the Birdie user.</td></tr><tr><td>connection_details.auth</td><td>oauth_client_id</td><td>OAuth</td><td>OAuth client ID from the security integration.</td></tr><tr><td>connection_details.auth</td><td>oauth_client_secret</td><td>OAuth</td><td>OAuth client secret from the security integration.</td></tr><tr><td>data_source</td><td>database</td><td>Yes</td><td>Snowflake database name.</td></tr><tr><td>data_source</td><td>schema</td><td>Yes</td><td>Snowflake schema name.</td></tr><tr><td>data_source</td><td>table</td><td>Yes</td><td>Table or view name.</td></tr><tr><td>data_source</td><td>partition_column</td><td>No</td><td>Column for date filtering. Defaults to <code>posted_at</code>.</td></tr><tr><td>metadata</td><td>data_kind</td><td>Yes</td><td>The type(s) of data in the table, e.g. <code>Support Ticket</code>, <code>NPS</code>, <code>CSAT</code>, <code>Review</code>, <code>Account</code>.</td></tr></tbody></table>

[Share credentials securely](https://ask.birdie.ai/integrations-and-data-ingestion/securely-sharing-credentials-for-data-integration) with the Birdie team. Never send credentials via email or unencrypted channels.

***

## Security notes

* Birdie never requires admin privileges for day-to-day operations
* Access granted to Birdie is read-only, unless a specific export pipeline is explicitly configured
* All credentials must be shared via a secure channel
* OAuth is preferred over static passwords whenever possible

## Data in scope

Once enabled, the Snowflake connector can import data from Snowflake into Birdie (for example, customer, account, or operational tables you expose to the Birdie role) and export Birdie data such as feedback records, areas, opportunities, and metrics back into Snowflake.

## References

* [Snowflake Data Sharing](https://docs.snowflake.com/en/user-guide/data-share-intro)
* [Snowflake Security & Roles](https://docs.snowflake.com/en/user-guide/security-access-control-overview)
* [Snowflake OAuth](https://docs.snowflake.com/en/user-guide/oauth-intro)
* [Securely Sharing Credentials](https://ask.birdie.ai/integrations-and-data-ingestion/securely-sharing-credentials-for-data-integration)
