
Configure the Azure Blob Destination

Learn how to set up a connection and send data to the Azure Blob Storage destination in Real-Time Customer Data Platform. This destination supports exporting datasets and audiences, and allows you to customize the file headers and data attributes.

For more information, see the Azure Blob Storage destination documentation.


Transcript
In this video, I’ll show you how to configure a connection to the Microsoft Azure Blob destination and then send data using the Experience Platform user interface. Let’s get started with establishing the connection first. I’m logged into Experience Platform and have selected Destinations under Connections. This opens the Catalog view. Next, I’ll scroll to the Categories section and select Cloud Storage. Here is the destination card for Azure Blob Storage. Since I don’t have any previous connections saved, I’ll select Set up. This opens the workflow, which begins with configuring the connection and ends with sending data to the storage account.

The first thing I need to do is enter the connection string. I’ll quickly switch to a slide that shows you where to get this from the Azure portal once you have a storage account configured. When you open the storage account, select Access Keys under Security & Networking, show the connection string, and copy it. It starts with DefaultEndpointsProtocol; make sure you copy the entire string. Now I’ll paste this string into the text field in the workflow. Attaching an encryption key to your exported files is optional. I won’t be doing this for my demonstration, but I highly recommend you do. You can install open-source tools that take care of creating public and private keys, but it’s the public key value that’s copied here. To finish the connection, I’ll click Connect to destination. The connection is now confirmed. You’d receive an error message if it were unsuccessful.

Additional fields now appear on this configuration view. These are the storage account destination details. This destination supports sending datasets, prospects, and audiences from Experience Platform. There’s a separate video about sending datasets if you’re interested in learning about that. I’ll be sending an audience for my demo, so I’ll keep that selection. Then I’ll fill in the name, description, folder path, and container fields. The last two fields specify the destination folder and container in the storage account; you can get these values from the Azure portal. For the file type, I have two options, JSON and Parquet. I’ll choose JSON. Toward the bottom, I can choose a compression format, which I’ll do now. The available choices depend on the file type selected above. If you’re working with a production sandbox, you’ll have the option to set up alerts. I’m finished with these inputs, so I’ll select Next in the top right to move forward.

This step of Configure New Destination prompts me to select the marketing action appropriate for this connection. I’ll choose Email Targeting from the list and then Next at the top. Because I chose audiences in the destination details step, I’m presented with the list of audiences in my Experience Platform sandbox. If you have a lot of audiences in your sandbox, you can use the search bar to filter the list. Since I’m using a development sandbox, I don’t have many. I’ll choose the Luma customers with level gold or above, then I’ll select Next to move forward.

On the Scheduling step, I can specify whether I’m exporting new audiences or audiences that have already been activated. Let’s review the scheduling options. Here you can decide whether you want to export the full file as a one-time operation, or export incremental files that contain new data for people who become part of the audience over time. Next, I can choose a frequency setting and a start time. You can specify a precise time or send the data after Experience Platform finishes segment evaluation. This option ensures that the most up-to-date audience profiles are exported after the daily batch segmentation job finishes. I could customize the start date as well. You can also modify the file name sent to the storage account. The first item is a file name preview; below that, there’s a lot of flexibility to append settings to the file name, choose a different date and time format to append, or add custom text. I’m going to keep the standard file name for my demo, so I’ll cancel out of this dialog. Once I’m done with the scheduling settings, I’ll go to the next step.

The Mapping step lets you customize the data going out. It presents you with recommendations for mapping each source field, which is the XDM field from the schema, to a target attribute field. You can add a calculated field as well; I encourage you to explore all of the options available to you there. You can also add a new mapping. If you want to send additional attributes beyond the recommended field mappings, you do that by opening this modal and choosing the fields. I’m going to keep most of the recommended fields, except the last one, which I’ll remove by clicking the delete mapping icon next to it. Mandatory attributes ensure that exported profiles contain specific attributes, like email address, and only profiles with those attributes are exported. Deduplication keys, on the other hand, help identify and handle duplicate records, letting you specify the fields used to identify duplicates so that only unique records are exported. I won’t make any further changes, so I’ll select Next above.

This is the review step, where I can verify my settings. Once everything looks good, I’ll select Finish in the upper right corner. That’s it. I receive a confirmation that my destination has been successfully saved. After this, you would connect to the Azure storage account to confirm that you see the files there. This concludes the demo for configuring and sending Experience Platform data to the Azure Blob destination. Thanks for watching!
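For reference, here is what that final verification step can look like outside the Azure portal. This is a minimal sketch using the azure-storage-blob Python SDK to list the exported files; the storage account, container, and folder path values below are hypothetical placeholders for whatever you entered in the destination workflow, and the connection string follows the same DefaultEndpointsProtocol format shown in the video.

```python
# Minimal sketch: list exported files in the destination container.
# Assumes the azure-storage-blob package is installed (pip install azure-storage-blob).
from azure.storage.blob import BlobServiceClient

# The same connection string pasted into the destination workflow,
# copied from Access Keys in the Azure portal.
CONNECTION_STRING = (
    "DefaultEndpointsProtocol=https;"
    "AccountName=<your-storage-account>;"   # placeholder
    "AccountKey=<your-account-key>;"        # placeholder
    "EndpointSuffix=core.windows.net"
)

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
container = service.get_container_client("your-container")  # container field from the workflow

# List files under the folder path configured in the destination details.
for blob in container.list_blobs(name_starts_with="your/folder/path/"):
    print(blob.name, blob.size, blob.last_modified)
```

If the destination has run at least once, the exported JSON or Parquet files should appear under the configured folder path with the file name pattern chosen in the scheduling step.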