Azure Blob Storage Staging Bucket
Some sources and destinations do not provide built-in staging resources and require a staging bucket to transfer or ingest data efficiently.
Create storage account
- In the Azure portal, navigate to the Storage accounts service and click + Create.
- In the "Basics" tab of the "Create a storage account" form, fill in the required details.
- In the "Advanced" settings, under "Security" make sure Enable storage account key access is turned on. You may turn off (deselect) "Allow enabling public access on containers". Under "Data Lake Storage Gen2", select Enable hierarchical namespace.
- In the "Networking" settings, you may limit "Network access" to Enable public access from select virtual networks and IP addresses. All other settings can use the default selections.
- In the "Data protection" settings, you must turn off Enable soft delete for blobs, Enable soft delete for containers, and Enable soft delete for file shares.
- Once the remaining options have been configured to your preference, click Create.
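If you prefer to script this step, here is a minimal sketch using the Azure SDK for Python (the `azure-identity` and `azure-mgmt-storage` packages). The subscription ID, resource group, account name, and region are placeholder values, and the soft-delete settings from the "Data protection" tab are configured separately on the blob service properties rather than in this call.

```python
# Sketch: create a storage account with hierarchical namespace enabled
# (Data Lake Storage Gen2), shared key access on, and public blob access off.
# Placeholder values: subscription ID, "my-resource-group",
# "mystagingaccount", and "eastus" must be replaced with your own.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import StorageAccountCreateParameters, Sku

subscription_id = "<your-subscription-id>"
client = StorageManagementClient(DefaultAzureCredential(), subscription_id)

poller = client.storage_accounts.begin_create(
    resource_group_name="my-resource-group",
    account_name="mystagingaccount",
    parameters=StorageAccountCreateParameters(
        sku=Sku(name="Standard_LRS"),
        kind="StorageV2",
        location="eastus",
        is_hns_enabled=True,             # "Enable hierarchical namespace"
        allow_shared_key_access=True,    # "Enable storage account key access"
        allow_blob_public_access=False,  # public access on containers disabled
    ),
)
account = poller.result()  # long-running operation; wait for completion
print(f"Created storage account: {account.name}")
# Note: soft delete for blobs and containers is a blob service property and
# is disabled separately (or via the portal's "Data protection" tab).
```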
Create bucket and access token
- In the Azure portal, navigate to the Storage accounts service and click on the account that was created in the previous step.
- In the navigation pane, under "Data storage", click Containers. Click + Container, choose a name for the container, and click Create.
- In the navigation pane, under "Security + networking", click Shared access signature.
- In the "Allowed services" list, select Blob and File. In the "Allowed resource types" list, select Container and Object. In the "Allowed permissions" list, select Read, Write, Delete, List, Add, Create, and Permanently Delete.
- Select a "Start and expiry date/time" based on your security posture, and click Generate SAS and connection string.
- Make a note of the SAS token that is generated.
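The container and SAS token can also be created with the `azure-storage-blob` package; the sketch below assumes placeholder values for the account name, account key, and container name. Note that this helper issues a SAS for the Blob service only; if your destination also needs the File service, generate that SAS separately (for example with `azure-storage-file-share`) or use the portal steps above.

```python
# Sketch: create the staging container and generate an account-level SAS
# token, roughly matching the portal steps above. Account name, account key,
# container name, and expiry are placeholders to adjust.
from datetime import datetime, timedelta

from azure.storage.blob import (
    AccountSasPermissions,
    BlobServiceClient,
    ResourceTypes,
    generate_account_sas,
)

account_name = "mystagingaccount"
account_key = "<storage-account-key>"
account_url = f"https://{account_name}.blob.core.windows.net"

service = BlobServiceClient(account_url, credential=account_key)
service.create_container("staging")  # the "bucket" the destination will use

sas_token = generate_account_sas(
    account_name=account_name,
    account_key=account_key,
    resource_types=ResourceTypes(container=True, object=True),
    permission=AccountSasPermissions(
        read=True, write=True, delete=True, list=True, add=True, create=True
        # Depending on SDK version, a permanent-delete permission may also
        # be available to mirror the portal's "Permanently Delete" option.
    ),
    expiry=datetime.utcnow() + timedelta(days=365),  # match your security posture
)
print(sas_token)  # record this value for the destination setup
```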
You're done!
Use this Azure Blob Storage staging bucket when configuring the connection for your preferred data destination.
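Before entering the SAS token in your destination's configuration, you can optionally confirm it works with a quick smoke test like the sketch below, which reuses the placeholder account and container names from the previous step.

```python
# Sketch: smoke test of the SAS token against the staging container.
from azure.storage.blob import BlobServiceClient

account_url = "https://mystagingaccount.blob.core.windows.net"
sas_token = "<generated-sas-token>"

service = BlobServiceClient(account_url, credential=sas_token)
container = service.get_container_client("staging")

blob = container.get_blob_client("connectivity-check.txt")
blob.upload_blob(b"ok", overwrite=True)           # exercises Write/Create
print([b.name for b in container.list_blobs()])   # exercises List
blob.delete_blob()                                # exercises Delete
```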