Azure Blob to Azure SQL Database

Configure an Azure Data Factory pipeline to copy data from Azure Blob Storage to an Azure SQL database.

Written by Daniel Norén
Updated over 7 months ago

Overview

If you would like to access your Funnel data in an Azure SQL database, you can do so by setting up an Azure Blob Storage export and piping the data into the database. Funnel's Data Warehouse connector can export all of your data to Blob Storage, so once you have a file in your container, all you need to do is build a Data Factory pipeline to move the data.

Information on how to configure an Azure Blob export can be found here.

We recommend connecting Power BI to a database rather than directly to Blob Storage.

You can read more about setting up an Azure SQL Database here.

Create a table in your database

The first thing you need to do is create a table in your database where the data exported from Funnel should be sent.

Note: A metadata file containing a SQL CREATE TABLE statement is sent to Blob Storage together with the exported data. Open this file in a text editor, copy the statement, and run it against your database.
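For reference, the statement in the metadata file will look something along these lines. This is only a sketch with made-up column names and types; the actual columns depend on the fields included in your export.

```sql
-- Hypothetical example only; your metadata file contains the real columns.
CREATE TABLE funnel_data (
    [Date]        DATE,
    [Connection]  NVARCHAR(255),
    [Impressions] BIGINT,
    [Clicks]      BIGINT,
    [Cost]        DECIMAL(18, 2)
);
```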

Create a Copy Data pipeline in Data Factory

The pipeline consists of multiple components: two linked services, two datasets, and a Copy Data activity that actually copies the data.

Create two linked services


Create a linked service for Azure Blob Storage and select the storage account where your Funnel data is located. 

Create another linked service for Azure SQL Database and select the database containing the table you created in the first step.
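Each linked service can also be viewed as JSON via the 'Code' button in Data Factory. As a rough sketch, the two definitions would look something like the following, where the service names are made up and the connection strings are placeholders for your own values:

```json
{
  "name": "FunnelBlobStorage",
  "properties": {
    "type": "AzureBlobStorage",
    "typeProperties": {
      "connectionString": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>"
    }
  }
}
```

```json
{
  "name": "FunnelSqlDatabase",
  "properties": {
    "type": "AzureSqlDatabase",
    "typeProperties": {
      "connectionString": "Server=tcp:<server>.database.windows.net,1433;Database=<database>;User ID=<user>;Password=<password>;Encrypt=true;"
    }
  }
}
```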

Create two new Datasets

Note: Click 'Actions' -> 'New dataset' to create a new dataset.

Create a new dataset for Azure Blob Storage with the file format "DelimitedText".

Select the linked service created above and enter the path to the folder containing the Funnel data.

In the next step, click the Connection tab and set the compression type and column delimiter to match how your Funnel data is exported.

Click the Schema tab and make sure the schema is imported.
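Viewed as JSON, the dataset might look like the sketch below, assuming a comma-delimited, gzip-compressed export with a header row; the dataset name, container and folder path are placeholders:

```json
{
  "name": "FunnelBlobData",
  "properties": {
    "type": "DelimitedText",
    "linkedServiceName": {
      "referenceName": "FunnelBlobStorage",
      "type": "LinkedServiceReference"
    },
    "typeProperties": {
      "location": {
        "type": "AzureBlobStorageLocation",
        "container": "<container>",
        "folderPath": "<folder-with-funnel-data>"
      },
      "columnDelimiter": ",",
      "compressionCodec": "gzip",
      "firstRowAsHeader": true
    }
  }
}
```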

Create a new dataset for Azure SQL Database.

Select the linked service created above and enter credentials for the Azure SQL server. In the next step, select the database table that you created in the first step.
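This dataset is a small JSON definition along these lines, again a sketch where "funnel_data" stands in for whatever you named your table in the first step:

```json
{
  "name": "FunnelSqlTable",
  "properties": {
    "type": "AzureSqlTable",
    "linkedServiceName": {
      "referenceName": "FunnelSqlDatabase",
      "type": "LinkedServiceReference"
    },
    "typeProperties": {
      "schema": "dbo",
      "table": "funnel_data"
    }
  }
}
```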

Copy data pipeline

Create a new pipeline and drag the "Copy data" activity onto the canvas.


Select the Azure Blob dataset as the 'source' and the Azure SQL Database dataset as the 'sink' in the Copy Data activity. Map the schemas by clicking "Import schemas".

The last step is to add a "DELETE FROM [table_name]" pre-copy script on the sink to avoid duplicate data.
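Put together, the Copy data activity in the pipeline JSON would look roughly like this, with the pre-copy script set on the sink (the dataset and table names follow the hypothetical sketches above):

```json
{
  "name": "CopyFunnelData",
  "type": "Copy",
  "inputs": [
    { "referenceName": "FunnelBlobData", "type": "DatasetReference" }
  ],
  "outputs": [
    { "referenceName": "FunnelSqlTable", "type": "DatasetReference" }
  ],
  "typeProperties": {
    "source": { "type": "DelimitedTextSource" },
    "sink": {
      "type": "AzureSqlSink",
      "preCopyScript": "DELETE FROM dbo.funnel_data"
    }
  }
}
```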

Configure a trigger

It's also possible to configure triggers to run the pipeline. The most common options are a schedule trigger or an event trigger that fires when a file is created. By default, Funnel sends a summary file once the entire export has completed, and this file can be used to trigger the pipeline.
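For the event-based option, a storage event trigger that fires on the summary file could be sketched as follows. The path filters, the file name suffix and the pipeline name are assumptions; check your container for the actual name of the summary file and adjust accordingly:

```json
{
  "name": "FunnelExportCompleted",
  "properties": {
    "type": "BlobEventsTrigger",
    "typeProperties": {
      "blobPathBeginsWith": "/<container>/blobs/",
      "blobPathEndsWith": "summary.json",
      "events": [ "Microsoft.Storage.BlobCreated" ],
      "scope": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<account>"
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "<your-pipeline-name>",
          "type": "PipelineReference"
        }
      }
    ]
  }
}
```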
