Anonymous
Not applicable

Can we trigger data pipelines via events when a file is updated in a Fabric Lakehouse?

Hello people,

I have a requirement: my Fabric Lakehouse receives a data dump every day from external sources, and from there I use data pipelines to run further transformations on that data. I want to trigger the data pipelines automatically whenever the Lakehouse is updated each day, using an event-driven architecture (just like how we use Blob Storage events in Azure). After a lot of research, I found that there is currently no way to do this. I'd like to know whether this feature is expected in a future release on the Fabric roadmap.

 

If anyone has encountered this issue, please let us know.

 

 

5 REPLIES
Anonymous
Not applicable

Hi @Anonymous,

Any update on this? Did these suggestions help with your scenario? If so, please consider giving Kudos or accepting the helpful suggestions to help others who face similar requirements.

Regards,

Xiaoxin Sheng

Anonymous
Not applicable

Hi @Anonymous 

 

Thanks for your reply, I'm glad. We can create alerts on workspace items (pipeline, notebook, lakehouse, report, etc.): whenever we create or delete one, it raises an event that is sent to an eventstream, which can then trigger a data pipeline. I tried this and it works at the workspace-item level, but that is not my requirement.

 

My requirement is that we receive data daily into the Files section of the Lakehouse as CSV files. Whenever new CSV files arrive, a data pipeline should be triggered automatically, perhaps by listening to Lakehouse events. I am not sure whether this feature exists in Fabric. Have you tried this approach? I am looking for this solution.

Anonymous
Not applicable

Hi @Anonymous ,

After double-checking this, I found that the available alert conditions do not currently include one corresponding to this scenario for invoking a pipeline.
For this scenario, I'd suggest submitting an idea to add these types of conditions to alerts, or to add REST API support to run/cancel a data pipeline.

Microsoft Fabric Ideas
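As a rough sketch of the REST API route mentioned above: Fabric exposes a Job Scheduler endpoint for running an item (such as a data pipeline) on demand, so an external process that detects new files could kick off the pipeline itself. The workspace ID, pipeline item ID, and token acquisition below are placeholders you must supply, and the exact endpoint shape should be verified against the current Fabric REST API documentation.

```python
# Hedged sketch: trigger a Fabric data pipeline run via the Job Scheduler
# "run on demand item job" REST endpoint. IDs and the bearer token are
# placeholders; verify the endpoint against the official Fabric API docs.
import json
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def build_run_pipeline_request(workspace_id: str, pipeline_id: str, token: str):
    """Build the (url, headers) pair for an on-demand pipeline job request."""
    url = (f"{FABRIC_API}/workspaces/{workspace_id}"
           f"/items/{pipeline_id}/jobs/instances?jobType=Pipeline")
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    return url, headers

def run_pipeline(workspace_id: str, pipeline_id: str, token: str) -> int:
    """POST the job-instance request; a 202 status means the run was accepted."""
    url, headers = build_run_pipeline_request(workspace_id, pipeline_id, token)
    req = urllib.request.Request(url, data=json.dumps({}).encode(),
                                 headers=headers, method="POST")
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

The token would come from Microsoft Entra ID (e.g. via MSAL or a service principal); how you obtain it depends on your tenant setup.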

Regards,

Xiaoxin Sheng

FabianSchut
Solution Sage

Hi, it is indeed not possible to create an event based on a Lakehouse update. Here's an idea on Microsoft Fabric Ideas that you can vote for: https://ideas.fabric.microsoft.com/ideas/idea/?ideaid=3a8a68e8-2c8a-ef11-9443-6045bdafcead.

Here is another topic with the same request:
https://community.fabric.microsoft.com/t5/Data-Engineering/Event-Trigger-to-execute-pipeline-when-a-...

Someone posted a workaround that may work for you: https://community.fabric.microsoft.com/t5/Data-Warehouse/Onelake-trigger-to-Ingest-CSV/m-p/4120366
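In the same spirit as that workaround, a scheduled notebook can poll the Lakehouse Files folder, compare it with the files already processed, and trigger the pipeline only when new CSVs have arrived. The sketch below shows only the detection step with placeholder paths; in a real Fabric notebook you would list files with something like `mssparkutils.fs.ls` and persist the `seen` set between runs (for example in a Delta table), both of which are assumptions outside this snippet.

```python
# Hedged sketch of the polling workaround: detect CSV files that are present
# in the Lakehouse Files listing but have not been processed yet. The file
# listing and "seen" state here are plain lists for illustration; a real
# notebook would read the listing from the Lakehouse and persist the state.

def find_new_csvs(current_listing, seen):
    """Return CSV files present now but not yet processed, sorted."""
    new = {f for f in current_listing if f.lower().endswith(".csv")} - set(seen)
    return sorted(new)

# Example: two dumps already handled, one new dump arrived today.
listing = ["Files/dumps/2024-06-01.csv", "Files/dumps/2024-06-02.csv",
           "Files/dumps/2024-06-03.csv", "Files/dumps/_SUCCESS"]
seen = ["Files/dumps/2024-06-01.csv", "Files/dumps/2024-06-02.csv"]

new_files = find_new_csvs(listing, seen)
if new_files:
    # placeholder: call the pipeline-run REST API or a notebook activity here
    print(f"trigger pipeline for: {new_files}")
```

Scheduling this notebook shortly after the daily dump window gives near-event behavior without true Lakehouse file events.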

Anonymous
Not applicable

Hi @Anonymous,

You can try creating 'workspace item events' with 'set alert' to trigger and run the data pipeline:

Set alerts based on Fabric events in Real-Time hub - Microsoft Fabric | Microsoft Learn

BTW, since your requirement is more related to Real-Time Intelligence, you can also post to that subforum for further support:

Eventstream - Microsoft Fabric Community

Regards,

Xiaoxin Sheng
