Load local and incremental files
I want to create a pipeline that uploads some CSV files to a lakehouse. After the first load, only new files that appear in the folder should be uploaded. At the moment I am limited because my files are local, and I have not found an option to read from a local folder with a pipeline. Is there any way to do this in Fabric?
Solved!
Pipelines don't support connecting to on-premises data or data on your local machine. Currently the data must be available in the cloud, in one of the supported data stores (such as Azure Blob Storage or ADLS Gen2).

You can set up an AzCopy process on your desktop to upload the incremental data to an Azure Blob container (see `azcopy sync`), and then load it into the lakehouse using a pipeline Copy activity.
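A minimal sketch of the `azcopy sync` step described above. The local path, storage account, container, and SAS token are placeholders you would replace with your own values; `azcopy sync` only transfers files that are new or have changed since the last run, which gives you the incremental behavior.

```shell
# Incrementally upload local CSVs to an Azure Blob container.
# <storageaccount>, <container>, and <SAS-token> are placeholders.
azcopy sync "C:\data\csv" \
  "https://<storageaccount>.blob.core.windows.net/<container>/csv?<SAS-token>" \
  --include-pattern "*.csv"
```

You could schedule this command with Task Scheduler (Windows) or cron so new files land in Blob Storage automatically; the Fabric pipeline's Copy activity then picks them up from there.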

