[Beta]{class="badge informative"}
# Snowflake Streaming connection {#snowflake-destination}
## Overview {#overview}
Use the Snowflake destination connector to export data to Adobe's Snowflake instance, which Adobe then shares with your instance through private listings.
Read the following sections to understand how the Snowflake destination works and how data is transferred between Adobe and Snowflake.
## How Snowflake data sharing works {#data-sharing}
This destination uses a Snowflake data share, which means that no data is physically exported or transferred to your own Snowflake instance. Instead, Adobe grants you read-only access to a live table hosted within Adobe's Snowflake environment. You can query this shared table directly from your Snowflake account, but you do not own the table and cannot modify or retain it beyond the specified retention period. Adobe fully manages the lifecycle and structure of the shared table.
The first time you share data from Adobe's Snowflake instance to yours, you are prompted to accept the private listing from Adobe.
## Data retention and Time-to-Live (TTL) {#ttl}
All data shared through this integration has a fixed Time-to-Live (TTL) of seven days. Seven days after the last export, the shared table automatically expires and becomes inaccessible, regardless of whether the dataflow is still active. If you need to retain the data for longer than seven days, you must copy the contents into a table that you own in your own Snowflake instance before the TTL expires.
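For example, before the seven-day TTL expires, you could snapshot the shared table into a table you own with a `CREATE TABLE ... AS SELECT` statement. The following is a minimal sketch using the snowflake-connector-python package; all connection parameters and object names are hypothetical placeholders, not values provided by this destination.

```python
# Sketch only: persist the Adobe-shared table into a table you own before
# the seven-day TTL expires. Connection parameters and object names are
# hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="ORGNAME-ACCOUNTNAME",  # your Snowflake account identifier
    user="YOUR_USER",
    password="YOUR_PASSWORD",
    warehouse="YOUR_WAREHOUSE",
)
try:
    cur = conn.cursor()
    # Copy a snapshot of the shared table into a database you own.
    cur.execute(
        "CREATE OR REPLACE TABLE MY_DB.MY_SCHEMA.AUDIENCE_SNAPSHOT AS "
        "SELECT * FROM ADOBE_SHARE_DB.PUBLIC.AUDIENCE_TABLE"
    )
finally:
    conn.close()
```

Because the snapshot lives in your own database, it is unaffected by the share's expiration.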
## Audience update behavior {#audience-update-behavior}
If your audience is evaluated in batch mode, the data in the shared table is refreshed every 24 hours. This means there may be a delay of up to 24 hours between changes in audience membership and when those changes are reflected in the shared table.
## Incremental export logic {#incremental-export}
When a dataflow runs for an audience for the first time, it performs a backfill and shares all currently qualified profiles. After this initial backfill, only incremental updates (profiles added to or removed from the audience) are reflected in the shared table. This approach ensures efficient updates and keeps the shared table up to date.
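As an illustration of this model (not Adobe's actual implementation, and not the shared-table schema), the sketch below shows how a one-time backfill combined with add/remove deltas yields the current audience membership; the event shape is hypothetical.

```python
# Illustrative sketch only: models a backfill followed by incremental updates.
# The event structure is hypothetical, not the actual shared-table schema.
def apply_updates(backfill_ids, incremental_events):
    """Return current audience membership from a backfill plus add/remove deltas."""
    members = set(backfill_ids)  # first run: all currently qualified profiles
    for event in incremental_events:  # subsequent runs: only the deltas
        if event["action"] == "add":
            members.add(event["profile_id"])
        elif event["action"] == "remove":
            members.discard(event["profile_id"])
    return members

# Example: two profiles backfilled, then one removed and one added.
print(apply_updates(
    ["p1", "p2"],
    [{"action": "remove", "profile_id": "p2"},
     {"action": "add", "profile_id": "p3"}],
))  # {'p1', 'p3'}
```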
## Prerequisites {#prerequisites}
Before configuring your Snowflake connection, make sure you meet the following prerequisites:
- You have access to a Snowflake account.
- Your Snowflake account is subscribed to private listings. You or someone in your company who has account administrator privileges on Snowflake can configure this.
## Supported audiences {#supported-audiences}
This section describes which types of audiences you can export to this destination, based on audience origin and the profile types included in the audience.
In addition to audiences generated through the Segmentation Service, this connector supports audiences from other origins. Read about the various audience origins. Some examples include:
- custom upload audiences imported into Experience Platform from CSV files
- look-alike audiences
- federated audiences
- audiences generated in other Experience Platform apps, such as Adobe Journey Optimizer
- and more
## Export type and frequency {#export-type-frequency}
This destination is a streaming audience export destination: audience updates are reflected in the shared table on an ongoing basis, while batch-evaluated audiences are refreshed every 24 hours, as described in the Audience update behavior section above.
## Connect to the destination {#connect}
To connect to this destination, follow the steps described in the destination configuration tutorial. In the configure destination workflow, fill in the fields listed in the two sections below.
### Authenticate to destination {#authenticate}
To authenticate to the destination, select **Connect to destination**.
### Fill in destination details {#destination-details}
To configure details for the destination, fill in the required and optional fields below. An asterisk next to a field in the UI indicates that the field is required.
- **Name**: A name by which you will recognize this destination in the future.
- **Description**: A description that will help you identify this destination in the future.
- **Snowflake Account ID**: Your Snowflake account ID. Use the following Account ID format, depending on whether your account is linked to an organization:
    - If your account is linked to an organization: `OrganizationName.AccountName`.
    - If your account is not linked to an organization: `AccountName`.
- **Account acknowledgment**: Toggle on the **Snowflake Account ID** acknowledgment to confirm that your Account ID is correct and that it belongs to you.
Special characters in your destination name and sandbox name are replaced with underscores (`_`) in Snowflake. To avoid confusion, do not use any special characters in your destination and sandbox name.

### Enable alerts {#enable-alerts}
You can enable alerts to receive notifications on the status of the dataflow to your destination. Select an alert from the list to subscribe to receive notifications on the status of your dataflow. For more information on alerts, read the guide on subscribing to destinations alerts using the UI.
When you are finished providing details for your destination connection, select Next.
## Activate audiences to this destination {#activate}
- To activate data, you need the **View Destinations**, **Activate Destinations**, **View Profiles**, and **View Segments** access control permissions. Read the access control overview or contact your product administrator to obtain the required permissions.
- To export identities, you need the **View Identity Graph** access control permission.
Read Activate profiles and audiences to streaming audience export destinations for instructions on activating audiences to this destination.
## Map attributes {#map}
The Snowflake destination supports the mapping of profile attributes to custom attributes.
The target attributes are automatically created in Snowflake using the attribute name that you provide in the **Attribute name** field.
## Exported data / Validate data export {#exported-data}
Check your Snowflake account to verify that the data was exported correctly.
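One way to spot-check the export from your own account is to describe the shared table (to confirm the columns created from your mapped attribute names) and preview a few rows. This is a sketch using the snowflake-connector-python package; connection parameters and object names are hypothetical placeholders that you would replace with the share's actual database and table names.

```python
# Sketch only: verify the shared table from your own Snowflake account.
# Connection parameters and object names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="ORGNAME-ACCOUNTNAME",
    user="YOUR_USER",
    password="YOUR_PASSWORD",
    warehouse="YOUR_WAREHOUSE",
)
try:
    cur = conn.cursor()

    # Confirm the columns auto-created from your mapped attribute names.
    cur.execute("DESCRIBE TABLE ADOBE_SHARE_DB.PUBLIC.AUDIENCE_TABLE")
    for name, col_type, *rest in cur.fetchall():
        print(name, col_type)

    # Preview a few exported rows.
    cur.execute("SELECT * FROM ADOBE_SHARE_DB.PUBLIC.AUDIENCE_TABLE LIMIT 10")
    for row in cur.fetchall():
        print(row)
finally:
    conn.close()
```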
## Data usage and governance {#data-usage-governance}
All Adobe Experience Platform destinations are compliant with data usage policies when handling your data. For detailed information on how Adobe Experience Platform enforces data governance, read the Data Governance overview.