
Snowflake SNOWPIPE: Building a Continuous Data Ingestion Service w/ Azure Blob Storage Using Portal

Written By Joyce Kay Avila on Sunday, Jan 31, 2021 | 08:53 PM

 
Automating Snowpipe for Azure Blob Storage from beginning to end for novice (first-time) Azure and Snowflake users. Create a fully scalable, serverless data pipeline between Azure storage and a destination Snowflake table in 25 steps in only 25 minutes. This step-by-step tutorial is a must for anyone studying for the Snowflake Advanced Architect certification exam, as well as for anyone with a special interest in this topic.

#Snowflake #Snowpipe #DataStreaming #Azure #Architect

1 - Gain Access to Azure (Azure)
2 - Obtain Tenant ID (Azure)
3 - Activate Your Free Trial Account (Azure)
4 - Create a Resource Group (Azure)
5 - Create a Storage Account within the Resource Group (Azure)
6 - Create a Container (Azure)
7 - Create a Queue (Azure)
8 - Create an Event Subscription (Azure)
9 - Get the Queue URL (Azure)
10 - Set Up Snowflake Trial Account (Snowflake)
11 - Change Your Role to ACCOUNTADMIN (Snowflake)
12 - Create New Database (Snowflake)
13 - Create New Table (Snowflake)
14 - Create Notification Integration (Snowflake)
15 - Get AZURE CONSENT URL (Snowflake)
16 - Add Role Assignment (Azure)
17 - Get Endpoint (Azure)
18 - Generate Shared Access Signature (Azure)
19 - Create External Stage (Snowflake)
20 - Create Snowpipe (Snowflake)
21 - Create CSV Files
22 - Upload Files to Azure Blob Storage (Azure)

OPTIONAL:
23 - Check Status of Pipe (Snowflake)
24 - Confirm Data Is in the Snowflake Table (Snowflake)
25 - Look at the Load History (Snowflake)

An illustrative sketch of the Snowflake-side SQL for the steps marked (Snowflake) follows the links below.

Find me on LinkedIn: linkedin.com/n/JoyceKayAvila
Find me on Twitter: twitter.com/JoyceKayAvila
Find me on YouTube: youtube.com/c/JoyceKayAvila
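For reference, here is a minimal sketch of the Snowflake-side SQL behind steps 11-15, 19-20, and the optional steps 23-25. Every object name (SNOWPIPE_DEMO, RAW_CSV_DATA, AZURE_BLOB_NOTIFICATION, AZURE_CSV_STAGE, LOAD_CSV_PIPE), the three example columns, and every angle-bracket placeholder is an assumed, illustrative value rather than something taken from the tutorial; substitute the queue URI, tenant ID, container URL, and SAS token you collected in the Azure steps.

-- All object names and <angle-bracket> values below are illustrative placeholders.

-- Step 11: work as ACCOUNTADMIN (required for CREATE NOTIFICATION INTEGRATION).
USE ROLE ACCOUNTADMIN;

-- Step 12: create a database and use its public schema.
CREATE DATABASE IF NOT EXISTS SNOWPIPE_DEMO;
USE SCHEMA SNOWPIPE_DEMO.PUBLIC;

-- Step 13: create a destination table matching the CSV layout (columns are illustrative).
CREATE OR REPLACE TABLE RAW_CSV_DATA (
  COL1 STRING,
  COL2 STRING,
  COL3 STRING
);

-- Step 14: create a notification integration pointing at the Azure storage queue
-- from steps 7-9; the queue URI and tenant ID come from the Azure portal.
CREATE NOTIFICATION INTEGRATION AZURE_BLOB_NOTIFICATION
  ENABLED = TRUE
  TYPE = QUEUE
  NOTIFICATION_PROVIDER = AZURE_STORAGE_QUEUE
  AZURE_STORAGE_QUEUE_PRIMARY_URI = 'https://<storage_account>.queue.core.windows.net/<queue_name>'
  AZURE_TENANT_ID = '<tenant_id>';

-- Step 15: read AZURE_CONSENT_URL from the output, visit it, then add the role
-- assignment for the Snowflake service principal in Azure (step 16).
DESC NOTIFICATION INTEGRATION AZURE_BLOB_NOTIFICATION;

-- Step 19: create an external stage over the blob container (step 17 endpoint)
-- using the shared access signature generated in step 18.
CREATE OR REPLACE STAGE AZURE_CSV_STAGE
  URL = 'azure://<storage_account>.blob.core.windows.net/<container_name>'
  CREDENTIALS = (AZURE_SAS_TOKEN = '<sas_token>')
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- Step 20: create the pipe; AUTO_INGEST = TRUE makes Snowpipe load files
-- whenever the notification integration reports a new blob.
CREATE OR REPLACE PIPE LOAD_CSV_PIPE
  AUTO_INGEST = TRUE
  INTEGRATION = 'AZURE_BLOB_NOTIFICATION'
  AS
  COPY INTO RAW_CSV_DATA
  FROM @AZURE_CSV_STAGE
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- Step 23 (optional): check that the pipe is RUNNING and watching the queue.
SELECT SYSTEM$PIPE_STATUS('LOAD_CSV_PIPE');

-- Step 24 (optional): confirm rows have landed after uploading files in step 22.
SELECT COUNT(*) FROM RAW_CSV_DATA;

-- Step 25 (optional): review the recent load history for the destination table.
SELECT *
FROM TABLE(INFORMATION_SCHEMA.COPY_HISTORY(
  TABLE_NAME => 'RAW_CSV_DATA',
  START_TIME => DATEADD(HOUR, -1, CURRENT_TIMESTAMP())));

After step 22 (uploading CSV files to the container), the Event Grid subscription pushes a message to the storage queue, the notification integration picks it up, and the pipe runs its COPY INTO statement automatically; the three optional queries at the end let you verify that this happened.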