I want to create a pipeline to upload some CSV files to a lakehouse. After the first load, only new files in the folder should be uploaded. At the moment I am limited because my files are local, and I did not find an option to read from a folder with a pipeline. Is there any way to do this in Fabric?
Pipelines don't support connecting to on-premises data or data on your local machine. Currently the data must be available in the cloud, in one of the supported data stores (such as Azure Blob Storage or ADLS Gen2).

You can set up an azcopy process on your desktop to upload the incremental data to an Azure Blob container (see `azcopy sync`), and then load it into the lakehouse using a pipeline Copy activity.
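A minimal sketch of the local-to-blob step, assuming you have AzCopy installed and a SAS token with write/list permission on the container; the local path, storage account, container, and token below are placeholders you would replace with your own values:

```shell
# azcopy sync compares the local folder against the blob container and
# copies only files that are new or changed since the last run, which
# gives the incremental behavior described above.
# Placeholders: <storage-account>, <container>, <SAS-token>.
azcopy sync "C:\data\csv" \
  "https://<storage-account>.blob.core.windows.net/<container>?<SAS-token>" \
  --include-pattern "*.csv"
```

You can schedule this command (e.g. with Windows Task Scheduler or cron) so new local CSV files land in the container automatically, and then point the pipeline Copy activity at that container.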