The “Test connection” button shows a success message only when the credential has full access to the storage account. If the credential can access only a certain subfolder, the button keeps reporting a connectivity error; however, the dataflow will still write data to that directory in the data lake correctly.
The next logical step is to configure a destination task:
There are four different formats available:
The last two require Java libraries to be installed on the SSIS server.
SSIS then writes data to the data lake the same way it writes to local disk storage.
Azure Storage Explorer shows that a parquet file was generated: