Morris98
Regular Visitor

More Elegant Way to Trigger Processing of Files in a Notebook When a New File Is Added

Dear Forum,

I have written some transformation and processing logic for CSV files in a notebook. As a next step, I would like the notebook to automatically process files from lakehouse storage whenever a new CSV file is added to a folder in the lakehouse.

I saw that there is an option for triggers in combination with notebooks, with the ability to pass the file name to the notebook as a variable, but unfortunately this seems to be supported only for Azure Blob Storage and not for lakehouses.
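From what I read, the trigger would hand the file name over through a notebook parameter cell, roughly like this (file_name is just my assumed variable name; the cell has to be marked as a parameter cell so the pipeline can override it):

# Parameter cell (toggle "parameter cell" on this cell in the notebook).
# The name file_name is my assumption; a pipeline Notebook activity
# would override this default with the actual value at run time.
file_name = "sample.csv"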

The only workaround I can see to automatically process new files is to schedule the notebook and implement this behavior in Python itself, by checking whether any new files have been added since the last run (a rough sketch of what I have in mind is below).
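This is roughly what I mean, assuming the default lakehouse is attached to the notebook (so it is mounted at /lakehouse/default), new CSVs land under Files/incoming, and a small watermark file remembers the last run; process() stands in for my existing transformation code:

import json
import os
from datetime import datetime, timezone

import pandas as pd

# Assumed layout: default lakehouse mounted at /lakehouse/default,
# new CSVs under Files/incoming, watermark file under Files/_state.
INCOMING = "/lakehouse/default/Files/incoming"
WATERMARK = "/lakehouse/default/Files/_state/last_run.json"

def load_watermark() -> float:
    # Epoch timestamp of the previous run; 0.0 on the first run.
    try:
        with open(WATERMARK) as f:
            return json.load(f)["last_run"]
    except FileNotFoundError:
        return 0.0

def save_watermark(ts: float) -> None:
    os.makedirs(os.path.dirname(WATERMARK), exist_ok=True)
    with open(WATERMARK, "w") as f:
        json.dump({"last_run": ts}, f)

last_run = load_watermark()
run_started = datetime.now(timezone.utc).timestamp()

for entry in os.scandir(INCOMING):
    # Pick up only CSVs modified since the previous scheduled run.
    if entry.is_file() and entry.name.endswith(".csv") and entry.stat().st_mtime > last_run:
        df = pd.read_csv(entry.path)
        process(df)  # hypothetical: my existing transformation logic

save_watermark(run_started)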

Since I am new to Fabric, I wanted to ask whether there is a more elegant way to implement this. Is there perhaps an option in Pipelines or Dataflows that I have missed?

Thank you very much in advance.

Best wishes,
Morris

1 ACCEPTED SOLUTION
NandanHegde
Super User

Based on my understanding, as of now the event triggers within data pipelines work with Blob storage only, not with the lakehouse.

As of now there is no way to have an event-driven framework with a lakehouse as the source.




----------------------------------------------------------------------------------------------
Nandan Hegde (MSFT Data MVP)
LinkedIn Profile: www.linkedin.com/in/nandan-hegde-4a195a66
GitHub Profile: https://github.com/NandanHegde15
Twitter Profile: @nandan_hegde15
MSFT MVP Profile: https://mvp.microsoft.com/en-US/MVP/profile/8977819f-95fb-ed11-8f6d-000d3a560942
Topmate: https://topmate.io/nandan_hegde
Blog: https://datasharkx.wordpress.com


2 REPLIES

Morris98
Regular Visitor

Thank you for your answer. Okay, then it seems I will have to use the scheduling workaround.
