Let’s suppose I have a csv file in Azure Blob Storage and I want to import it to my local machine as a csv file and into my local SQL Server.

We will use the SQL Server Integration Services (SSIS) data flow component named Azure Blob Source, which is installed with the Azure Feature Pack for SSIS, to connect to Azure.

Requirements:

i. SQL Server 2016.
ii. SQL Server Data Tools (SSDT) installed.
iii. An Azure Subscription.
iv. Internet Access.
v. Microsoft Azure Storage Explorer (MASE) installed.
vi. The Azure Feature Pack for SSIS installed.
vii. An Azure Storage Account and a Blob Container created (see the optional check after this list).
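
Before building the package, it can help to confirm that the storage account and container from requirement vii are reachable. The sketch below does that with the azure-storage-blob Python package (not one of the requirements above); the connection string is a placeholder to replace with your own, and the container name matches the one used later in this walkthrough.

    from azure.storage.blob import BlobServiceClient

    # Placeholder connection string; copy the real one from the Azure portal.
    conn_str = "<your storage account connection string>"

    service = BlobServiceClient.from_connection_string(conn_str)
    container = service.get_container_client("myblobtestazure")
    print("Container reachable:", container.exists())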


Getting Started:

1. Create a text file named Product_data.txt with the following sample content:





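The original sample content is not reproduced here; a purely hypothetical Product_data.txt, with made-up column names and values, might look like this:

    ProductID,ProductName,Price
    1,Bicycle,250.00
    2,Helmet,35.50
    3,Water Bottle,8.99
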
2. Using MASE, upload the file from step 1 to the Azure Blob container myblobtestazure.
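
If you prefer to script the upload rather than use MASE, a minimal sketch with the azure-storage-blob Python package could look like the following; the connection string is a placeholder, and Product_data.txt is assumed to sit in the current directory.

    from azure.storage.blob import BlobServiceClient

    # Placeholder connection string; copy the real one from the Azure portal.
    conn_str = "<your storage account connection string>"

    service = BlobServiceClient.from_connection_string(conn_str)
    blob = service.get_blob_client(container="myblobtestazure", blob="Product_data.txt")

    # Upload the local file created in step 1.
    with open("Product_data.txt", "rb") as data:
        blob.upload_blob(data, overwrite=True)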




3. Using SSDT, create an SSIS project. Drag and drop a Data Flow Task onto the design pane:




4. Double click the Data Flow Task and drag and drop an Azure Blob Source. This is a new component included in the Azure Feature Pack for SSIS installed as part of the prerequisites, and it will be used to connect to the csv file in Azure:




Double click the Azure Blob Source and fill in the required information.
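
To double-check what the Azure Blob Source will read, you can download the blob and inspect its header row outside of SSIS. This is an optional sketch using the azure-storage-blob Python package; the connection string is a placeholder.

    import csv
    import io
    from azure.storage.blob import BlobServiceClient

    conn_str = "<your storage account connection string>"  # placeholder

    service = BlobServiceClient.from_connection_string(conn_str)
    blob = service.get_blob_client(container="myblobtestazure", blob="Product_data.txt")

    # Download the blob and print the header row, i.e. the columns the
    # Azure Blob Source will expose to the data flow.
    text = blob.download_blob().readall().decode("utf-8")
    print(next(csv.reader(io.StringIO(text))))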

5. Drag and drop a Flat File Destination and connect it to the Azure Blob Source:




Double click the Flat File Destination and fill in the required information, then use the Mappings tab to map the source columns to the destination columns:
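
Conceptually, the Mappings tab pairs each source column with a destination column. The sketch below mimics a simple one-to-one mapping in plain Python; the column names follow the hypothetical sample from step 1, and Product_out.csv is an assumed output file name.

    import csv

    # Hypothetical column names (matching the sample sketched in step 1),
    # mapped one-to-one, much as the Mappings tab would do.
    column_map = {"ProductID": "ProductID", "ProductName": "ProductName", "Price": "Price"}

    with open("Product_data.txt", newline="") as src, \
         open("Product_out.csv", "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=list(column_map.values()))
        writer.writeheader()
        for row in reader:
            writer.writerow({column_map[k]: row[k] for k in column_map})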

6. In the same way, we can also load the data from Azure into an on-premises SQL Server table; to do that, we need a Multicast transformation to distribute the data to both destinations.
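
For illustration only, loading the same file into a local table outside of SSIS might look like the sketch below, using the pyodbc Python package; the driver name, server, database TestDB, and table dbo.Product_data are all assumptions, not values from this walkthrough.

    import csv
    import pyodbc

    # Hypothetical driver, server, database, and table names.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=localhost;DATABASE=TestDB;Trusted_Connection=yes;"
    )
    cursor = conn.cursor()

    with open("Product_data.txt", newline="") as f:
        rows = list(csv.reader(f))[1:]  # skip the header row

    cursor.executemany(
        "INSERT INTO dbo.Product_data (ProductID, ProductName, Price) VALUES (?, ?, ?)",
        rows,
    )
    conn.commit()
    conn.close()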




7. Run the package to copy the data from the csv file in Azure to the local file and to the SQL Server database table. To view the data at runtime, we can enable the data viewer on the Multicast tool.






8. Here is the Multicast output at runtime, showing the data flow and the data.




9. If everything is OK, a new file will be created in the specified path and the table will be loaded in the database.
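
As an optional final check, a short script can confirm both outputs; the output path, database, and table name below are the same assumed values used in the earlier sketches.

    import os
    import pyodbc

    # Hypothetical output path and table name from the sketches above.
    print("Flat file created:", os.path.exists(r"C:\output\Product_out.csv"))

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=localhost;DATABASE=TestDB;Trusted_Connection=yes;"
    )
    count = conn.cursor().execute("SELECT COUNT(*) FROM dbo.Product_data").fetchone()[0]
    print("Rows loaded:", count)
    conn.close()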