How to read / write from Azure Data Lake Storage Gen2 in SSIS

If you would like to read data stored in Azure Data Lake Storage Gen2 using SSIS, you can use the ZappySys SSIS Azure Storage Source for CSV, JSON, or XML format.

Useful SSIS Components to read / write data from Azure Data Lake Storage Gen1 or Gen2

These components can do several useful things that you won't find in other components, such as:

  • Read GZip compressed files
  • Read multiple files using wildcard pattern
  • Recursive Scan (Read many files from deep folder structure)
  • Intelligent Metadata scan
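To make the first three capabilities concrete, here is a minimal Python sketch of the same idea against local files (the SSIS component does this against Azure Storage; the function name and pattern below are illustrative, not part of any ZappySys API):

```python
import gzip
from pathlib import Path

def read_matching_files(root: str, pattern: str = "*.csv.gz"):
    """Recursively scan a folder tree for files matching a wildcard
    pattern and yield (filename, decompressed text) pairs."""
    for path in Path(root).rglob(pattern):
        # gzip.open transparently decompresses GZip files
        with gzip.open(path, "rt", encoding="utf-8") as f:
            yield path.name, f.read()
```

The wildcard pattern plus the recursive `rglob` walk corresponds to "read multiple files using wildcard pattern" and "recursive scan" above; `gzip.open` corresponds to reading GZip-compressed files.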

Read the article below to see how to read files from Azure Storage using SSIS.

Verifying if your Azure Storage Account is Gen 2 or Gen 1
Open Azure Storage Explorer and check the Account Kind. If it says StorageV2, the account is Gen2; if it says Storage, it is Gen1.
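The same check can be expressed as a tiny lookup (you can also see the account kind from the Azure CLI with `az storage account show --query kind`). The function below is just an illustration of the rule stated above, not part of any SDK:

```python
def storage_generation(account_kind: str) -> str:
    """Map the Account Kind shown in Azure Storage Explorer to a
    Data Lake generation, following the rule: StorageV2 -> Gen2,
    Storage -> Gen1. Any other kind is reported as unknown."""
    kinds = {"StorageV2": "Gen2", "Storage": "Gen1"}
    return kinds.get(account_kind, "unknown")
```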

Loading data from Azure Data Lake Storage Gen 2 using SSIS (CSV / XML / JSON Files)

Write data to Azure Data Storage using SSIS Azure Blob Destination (CSV)
Here is the component that can write data to Azure Blob Storage in CSV format:
Azure Blob CSV File Destination

Write data to Azure Data Storage using SSIS Azure Blob Destination (XML or JSON)
If you would like to write data in XML or JSON format, you can use a workaround like the one below.

  1. First, generate JSON or XML using the JSON Generator Transform or XML Generator Transform.
  2. Then use the Azure Blob CSV Destination.
  3. Select the Runtime Connection.
  4. Select the Input Column from the JSON / XML Generator.
  5. Set the following properties:

FilePath = some-path (E.g. /myfolder/myfile.json)
FirstRowHasColumnNames=false
QuotesAroundValue=false
RowDelimiter={blank}
ColumnDelimiter={blank}

  6. That's it! Now run the data flow and you will see a new JSON or XML file created.
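The steps above work because a CSV destination with a blank header, no quoting, and blank row/column delimiters passes the single pre-built JSON (or XML) column through verbatim. A minimal Python sketch of that idea (a toy model, not the actual SSIS component; all names here are hypothetical):

```python
import json

def csv_destination(rows, out_path, first_row_has_column_names=False,
                    quotes_around_value=False, row_delimiter="",
                    column_delimiter=""):
    """Toy model of a CSV file destination. With the defaults shown
    (no header, no quoting, blank delimiters), any pre-generated JSON
    or XML text in the input column is written through unchanged."""
    with open(out_path, "w", encoding="utf-8") as f:
        if first_row_has_column_names:
            f.write(column_delimiter.join(rows[0].keys()) + row_delimiter)
        for row in rows:
            values = [f'"{v}"' if quotes_around_value else str(v)
                      for v in row.values()]
            f.write(column_delimiter.join(values) + row_delimiter)

# "JSON Generator Transform" equivalent: one column holding the whole document
doc = json.dumps({"orders": [{"id": 1}, {"id": 2}]})
csv_destination([{"json": doc}], "myfile.json")
```

With the header and delimiters blanked out, the resulting `myfile.json` contains only the raw JSON document, which is exactly what the property settings above achieve in SSIS.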