
Nikhil234
Frequent Visitor

Copy files from blob to lakehouse Files

Hi,

 

I have a file that lands in Blob Storage every day with a timestamp in its name. I want to use a pipeline to dynamically get the latest file each day and load it to the Files section of the lakehouse. The aim is to get the latest data, overwrite the file in the lakehouse, and then use a notebook to transform the data and load it into a single table.

 

I have done all the other steps, but I can't figure out how to get the correct file in the pipeline.

 

I read the documentation on incrementally loading files based on modified date in the Copy Data tool, but this functionality doesn't seem to exist in Fabric pipelines.

 

How can I do this?

1 ACCEPTED SOLUTION
NandanHegde
Super User

The blog below explains how to get the details of the latest file:

https://datasharkx.wordpress.com/2023/01/05/get-latest-folder-file-detail-from-azure-data-lake-stora...

ADF pipelines are similar to Fabric data pipelines, so the same approach applies.
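For reference, the core idea behind that approach (list the files in the source folder, then keep the one with the newest modified timestamp) can be sketched in Python. This is an illustrative sketch only, not the blog's exact pipeline; the `blobs` list below is a hypothetical stand-in for the output of a listing step such as Get Metadata:

```python
from datetime import datetime, timezone

def latest_file(blobs):
    """Return the name of the blob with the newest last-modified timestamp.

    `blobs` is a list of dicts shaped like a listing result:
    {"name": str, "lastModified": datetime}.
    """
    if not blobs:
        raise ValueError("no files found in the source folder")
    # max() over the modified timestamp picks the most recent drop
    newest = max(blobs, key=lambda b: b["lastModified"])
    return newest["name"]

# Example: three daily drops; the Jan 7 file is the latest.
blobs = [
    {"name": "sales_20240105.csv", "lastModified": datetime(2024, 1, 5, 6, 0, tzinfo=timezone.utc)},
    {"name": "sales_20240106.csv", "lastModified": datetime(2024, 1, 6, 6, 0, tzinfo=timezone.utc)},
    {"name": "sales_20240107.csv", "lastModified": datetime(2024, 1, 7, 6, 0, tzinfo=timezone.utc)},
]
print(latest_file(blobs))  # sales_20240107.csv
```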




----------------------------------------------------------------------------------------------
Nandan Hegde (MSFT Data MVP)
LinkedIn Profile: www.linkedin.com/in/nandan-hegde-4a195a66
GitHub Profile: https://github.com/NandanHegde15
Twitter Profile: @nandan_hegde15
MSFT MVP Profile: https://mvp.microsoft.com/en-US/MVP/profile/8977819f-95fb-ed11-8f6d-000d3a560942
Topmate: https://topmate.io/nandan_hegde
Blog: https://datasharkx.wordpress.com


3 REPLIES
Nikhil234
Frequent Visitor

Hi @NandanHegde. The blog was definitely helpful and I used elements of it in my solution; I definitely recommend people read it. In the end I used the Get Metadata activity's child items on the blob, filtering on the modified date for files modified within 3 hours of the pipeline run time. I set the schedule cadence to align with the ingestion time of the daily file, which means I can pick up the file I need each day. From there I simply pass the output into the Copy Data activity, and it's been working great.
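To make that window filter concrete: the idea is to keep only files whose modified time falls within the 3 hours before the pipeline run time. A minimal Python sketch of the check (the names `run_time` and `items` are illustrative, standing in for the pipeline trigger time and the metadata listing output):

```python
from datetime import datetime, timedelta, timezone

def files_in_window(items, run_time, hours=3):
    """Keep files modified within `hours` before the pipeline run time."""
    start = run_time - timedelta(hours=hours)
    return [i["name"] for i in items if start <= i["lastModified"] <= run_time]

# Example: pipeline runs at 08:00 UTC; only today's 06:00 drop is in the window.
run_time = datetime(2024, 1, 7, 8, 0, tzinfo=timezone.utc)
items = [
    {"name": "sales_20240106.csv", "lastModified": datetime(2024, 1, 6, 6, 0, tzinfo=timezone.utc)},
    {"name": "sales_20240107.csv", "lastModified": datetime(2024, 1, 7, 6, 0, tzinfo=timezone.utc)},
]
print(files_in_window(items, run_time))  # ['sales_20240107.csv']
```

Inside a Fabric pipeline the same window can be expressed in the expression language, e.g. comparing a modified date against something like `addHours(pipeline().TriggerTime, -3)` (hedged: check the exact function and system-variable names against the pipeline expression reference).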

Anonymous
Not applicable

Hi @Nikhil234 ,

 

Did @NandanHegde's reply solve your problem? If so, please mark it as the accepted solution; if the problem persists, let us know.

 

Best Regards,
Adamk Kong

 

