If you would like to read data stored in Azure Blob Storage Gen2 using SSIS, you can use the ZappySys SSIS Azure Storage Source for CSV, JSON, or XML format.
Useful SSIS Components to read / write data from Azure Data Lake Storage Gen1 or Gen2
- Azure Blob CSV File Source
- Azure Blob CSV File Destination
- Azure Blob XML File Source
- Azure Blob JSON File Source
These components can do some pretty nice things which you won't find in other components, such as:
- Read GZip compressed files
- Read multiple files using wildcard pattern
- Recursive Scan (Read many files from deep folder structure)
- Intelligent Metadata scan
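To make the first three features above concrete, here is a minimal local sketch in plain Python of the same ideas: a wildcard pattern, a recursive scan of a deep folder structure, and transparent GZip decompression. The folder layout and file names are made up for illustration; this is not the component's implementation, just the behavior it automates.

```python
import csv
import glob
import gzip
import os
import tempfile

# Build a small deep folder structure with one GZip-compressed CSV
# (hypothetical sample data, just so the sketch is self-contained).
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "2024", "01"))
with gzip.open(os.path.join(root, "2024", "01", "orders.csv.gz"), "wt", newline="") as f:
    csv.writer(f).writerows([["id", "amount"], ["1", "9.99"], ["2", "4.50"]])

rows = []
# recursive=True lets "**" match folders at any depth (recursive scan),
# while "*.csv.gz" is the wildcard pattern for multiple files.
for path in glob.glob(os.path.join(root, "**", "*.csv.gz"), recursive=True):
    with gzip.open(path, "rt", newline="") as f:  # GZip handled transparently
        reader = csv.reader(f)
        header = next(reader)  # first row supplies the column names
        rows.extend(dict(zip(header, r)) for r in reader)

print(len(rows))  # 2 data rows read from the compressed file
```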
Read the article below to see how to read files from Azure Storage using SSIS.
Verifying if your Azure Storage Account is Gen 2 or Gen 1
Open Azure Storage Explorer and check the Account Kind. If it says StorageV2, the account is Gen2; if it says Storage, it is Gen1.
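If you prefer scripting the check over the Storage Explorer UI, the Account Kind can also be read from the Azure CLI (`az storage account show --name <account> --query kind -o tsv`). The tiny helper below just maps the kind strings to the generations described above; the function name is my own.

```python
def storage_generation(account_kind: str) -> str:
    """Map an Azure Account Kind string to the storage generation.

    Kind strings are the values Azure reports (e.g. in Storage Explorer
    or from `az storage account show --query kind`).
    """
    kinds = {"StorageV2": "Gen2", "Storage": "Gen1"}
    return kinds.get(account_kind, "Unknown")

print(storage_generation("StorageV2"))  # Gen2
print(storage_generation("Storage"))   # Gen1
```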
Loading data from Azure Data Lake Storage Gen 2 using SSIS (CSV / XML / JSON Files)
Write data to Azure Data Storage using SSIS Azure Blob Destination (CSV)
Here is the component that can write data to Azure Blob Storage in CSV format:
Azure Blob CSV File Destination
Write data to Azure Data Storage using SSIS Azure Blob Destination (XML or JSON)
If you would like to write data in XML or JSON format, you can use a workaround like the one below.
- First Generate JSON or XML using JSON Generator Transform or XML Generator Transform.
- Then use Azure Blob CSV Destination
- Select Runtime Connection
- Select Input Column from JSON / XML Generator
- Set the following properties:
FilePath = some-path (E.g. /myfolder/myfile.json)
FirstRowHasColumnNames=false
QuotesAroundValue=false
RowDelimiter={blank}
ColumnDelimiter={blank}
- That's it! Now run the data flow and you will see a new JSON or XML file created.
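The blank-delimiter trick above works because, with QuotesAroundValue=false and empty row/column delimiters, the destination writes each incoming value back-to-back exactly as generated, so the output is the raw JSON (or XML) text rather than a CSV. A minimal local sketch of that idea, with plain Python file I/O standing in for the JSON Generator Transform and the Azure Blob CSV Destination (the sample rows and the array wrapping are made up for illustration):

```python
import json
import os
import tempfile

# Stand-in for the JSON Generator Transform: each input row becomes
# one JSON fragment (hypothetical sample data).
source_rows = [{"id": 1, "name": "Widget"}, {"id": 2, "name": "Gadget"}]
fragments = [json.dumps(r) for r in source_rows]
document = "[" + ",".join(fragments) + "]"

# Stand-in for the CSV destination with no quoting and blank row/column
# delimiters: the generated text is written out with nothing added.
path = os.path.join(tempfile.mkdtemp(), "myfile.json")
with open(path, "w") as f:
    f.write(document)

# Because no quotes or delimiters were inserted, the file parses as JSON.
with open(path) as f:
    print(json.load(f)[1]["name"])  # Gadget
```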