S3 / Azure / GCS
Overview
This article describes how to configure your S3-compatible storage (AWS S3, GCS, Azure Blob) to allow Birdie to export data to a dedicated bucket.
Birdie can export processed data (e.g., enriched feedback, areas, opportunities) to your S3-compatible storage. We recommend creating:
A dedicated bucket or prefix for Birdie exports
Credentials with write access to the export location
This isolates write access from your existing data.
Step 1: Create a dedicated bucket or prefix for exports
Create a dedicated bucket or prefix where Birdie will write export files:
s3://your-bucket/birdie-exports/
Step 2: Configure write permissions
Birdie uses the same authentication mechanisms for export as for data ingestion. The credentials must have write access to the export bucket/prefix.
For detailed instructions on generating credentials for each provider, refer to the corresponding section in the Data Ingestion with S3 documentation:
AWS S3: Access Keys or a cross-account IAM Role
GCS: HMAC Keys
Azure Blob Storage: SAS Token (recommended) or Shared Key
Once you have generated the credentials, grant write access to your bucket to that user or role:
AWS: Grant read and write access for the specific Bucket and Objects.
GCS: Grant the Storage Object User and Storage Bucket Viewer roles for the specific bucket.
Azure Blob Storage: Grant the Storage Blob Data Contributor role.
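For AWS, the read/write access described above can be expressed as an IAM policy scoped to the export location. The sketch below builds such a policy as JSON; the bucket name and prefix are placeholders for your own values, and you should adapt the statement list to your security requirements.

```python
import json

def birdie_export_policy(bucket: str, prefix: str) -> dict:
    """Illustrative IAM policy granting read/write under the export prefix only."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                # Allow listing the bucket, restricted to the export prefix
                "Effect": "Allow",
                "Action": ["s3:ListBucket"],
                "Resource": f"arn:aws:s3:::{bucket}",
                "Condition": {"StringLike": {"s3:prefix": [f"{prefix}*"]}},
            },
            {
                # Allow reading and writing objects under the export prefix
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:PutObject"],
                "Resource": f"arn:aws:s3:::{bucket}/{prefix}*",
            },
        ],
    }

print(json.dumps(birdie_export_policy("your-bucket", "birdie-exports/"), indent=2))
```

Scoping `Resource` and the `s3:prefix` condition to the export prefix keeps Birdie's write access isolated from the rest of your bucket, as recommended above.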
Step 3: Share export configuration with Birdie
Share the following details securely with Birdie:
Export bucket name
Region
Credentials with write access (Access Key ID / Secret Key)
Other optional parameters (External ID, Role ARN, S3 endpoint url, etc.)
Birdie will configure the export connector and confirm once the integration is active.
If you don't know how to generate/obtain any of these items, please refer to the S3 data ingestion setup.
How the Export Works
Birdie delivers daily exports via folders in your bucket:
Each folder contains the standard set of export files:
feedbacks.csv
areas.csv
opportunities.csv
etc.
The complete set of files is documented in the data forwarding documentation.
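As a rough sketch of how a consumer might discover these daily folders: the folder naming is not specified in this article, so date-named folders (YYYY-MM-DD) and the object keys below are assumptions purely for illustration.

```python
from collections import defaultdict

# Hypothetical object keys listed from the export bucket; the actual folder
# naming is defined by Birdie's export connector, not by this sketch.
keys = [
    "birdie-exports/2024-05-01/feedbacks.csv",
    "birdie-exports/2024-05-01/areas.csv",
    "birdie-exports/2024-05-02/feedbacks.csv",
    "birdie-exports/2024-05-02/opportunities.csv",
]

def group_by_folder(keys: list) -> dict:
    """Group export files by their daily folder."""
    folders = defaultdict(list)
    for key in keys:
        prefix, _, filename = key.rpartition("/")
        folders[prefix].append(filename)
    return dict(folders)

folders = group_by_folder(keys)
latest = max(folders)  # lexicographic order works for YYYY-MM-DD folder names
print(latest, folders[latest])
```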
Some of the exported files are incremental. This means you can't rely on the latest file alone for the full picture of the data you have inside Birdie.
We recommend using a Data Warehouse or SQL engine of your choice to consolidate and deduplicate these entries so you can use them analytically.
You are responsible for keeping an up-to-date, deduplicated copy of each of these tables.
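A minimal consolidation sketch, using SQLite in place of your warehouse: each daily file is upserted into a table keyed by record id, and the newest row wins. The column names (`id`, `updated_at`, `text`) are assumptions for illustration; use the actual schemas from the data forwarding documentation.

```python
import csv
import io
import sqlite3

# SQLite stands in for your Data Warehouse / SQL engine of choice.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE feedbacks (id TEXT PRIMARY KEY, updated_at TEXT, text TEXT)"
)

def load_incremental(conn, csv_content: str) -> None:
    """Upsert rows from one daily feedbacks.csv; the newest row wins per id."""
    for row in csv.DictReader(io.StringIO(csv_content)):
        conn.execute(
            """INSERT INTO feedbacks (id, updated_at, text)
               VALUES (:id, :updated_at, :text)
               ON CONFLICT(id) DO UPDATE SET
                 updated_at = excluded.updated_at,
                 text = excluded.text
               WHERE excluded.updated_at > feedbacks.updated_at""",
            row,
        )

# Two daily exports; feedback f1 is re-exported with updated content.
load_incremental(conn, "id,updated_at,text\nf1,2024-05-01,Slow app\n")
load_incremental(
    conn,
    "id,updated_at,text\n"
    "f1,2024-05-02,Slow app on iOS\n"
    "f2,2024-05-02,Great support\n",
)

rows = conn.execute("SELECT id, text FROM feedbacks ORDER BY id").fetchall()
print(rows)
```

The same upsert pattern (`MERGE`, `INSERT ... ON CONFLICT`) is available in most warehouses, so this logic ports directly to your own ETL pipeline.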
Exported Data Format
The exported data content will match the schemas described in our data forwarding documentation.
You can also check this documentation to better understand how to query the consolidated copies of the exported data.
If the schema evolves over time, you'll need to update your table definitions to include the new columns.
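One way to handle such schema evolution is to compare the header of a newly received export file against your table definition and add any missing columns. This is a sketch against SQLite; the `sentiment` column is a hypothetical addition used only for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE feedbacks (id TEXT PRIMARY KEY, text TEXT)")

def sync_columns(conn, table: str, csv_header: list) -> list:
    """Add any columns present in the export header but missing from the table."""
    existing = {row[1] for row in conn.execute(f"PRAGMA table_info({table})")}
    added = [col for col in csv_header if col not in existing]
    for col in added:
        # New columns default to TEXT here; map types per the real schema.
        conn.execute(f'ALTER TABLE {table} ADD COLUMN "{col}" TEXT')
    return added

# A newer export adds a hypothetical "sentiment" column to feedbacks.csv.
added = sync_columns(conn, "feedbacks", ["id", "text", "sentiment"])
print(added)
```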
You also need a Warehouse / Database system and an ETL pipeline in which to consolidate and deduplicate the data. More details on how to do this are given in the data forwarding documentation.