lucianomagalhae
New Member

Load local and incremental files

I want to create a pipeline that uploads CSV files to a lakehouse. After the first load, only new files that appear in the folder should be uploaded. At the moment I am blocked because my files are local, and I did not find an option to read from a local folder with a pipeline. Is there any way to do this in Fabric?

1 ACCEPTED SOLUTION
ajarora
Microsoft Employee

Pipelines don't support connecting to on-premises data or data on your local machine. Currently the data must be available in the cloud, in one of the supported data stores (such as Azure Blob Storage or ADLS Gen2).

You can set up an AzCopy process on your desktop to upload the incremental data to an Azure Blob container (check `azcopy sync`), and then load it into the lakehouse using a pipeline Copy activity.
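As a sketch of that first step: `azcopy sync` compares the source folder against the destination container and uploads only new or changed files, which covers the incremental requirement. The local path, storage account, container, and SAS token below are placeholders you would replace with your own.

```shell
# Sync a local folder of CSVs to an Azure Blob container.
# Only files that are new or modified since the last run are uploaded.
# <storageaccount>, <container>, and <SAS-token> are placeholders.
azcopy sync "C:\data\csv" \
  "https://<storageaccount>.blob.core.windows.net/<container>/csv?<SAS-token>" \
  --include-pattern "*.csv"
```

You could schedule this with Task Scheduler (Windows) or cron so new files land in Blob Storage automatically, and then point the pipeline's Copy activity at that container.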


