Hello everyone,
I have a requirement: my Fabric lakehouse receives a data dump every day from external sources, and from there I use data pipelines for further transformations. I want to trigger the data pipelines automatically whenever the lakehouse is updated each day, using events (an event-driven architecture, similar to how we use Blob Storage events in Azure). I did a lot of research and found there is currently no way to do this. Is this feature expected in a future release on the Fabric roadmap?
If anyone has encountered this issue, please let us know.
Hi @pavannarani,
Any update on this? Did these suggestions help with your scenario? If so, please consider giving Kudos to or accepting the helpful suggestions to help others with similar requirements.
Regards,
Xiaoxin Sheng
Hi @v-shex-msft
Thanks for your reply. We can create alerts on workspace items (pipeline, notebook, lakehouse, report, etc.): whenever we create or delete one, an event is raised, sent to an eventstream, and can trigger a data pipeline. I tried this and it works at the workspace-item level, but that is not my requirement.
My requirement is that we receive data daily into the lakehouse Files section as CSV files. When new CSV files arrive each day, a data pipeline should be triggered automatically, perhaps by listening to lakehouse events. I am not sure whether this feature exists in Fabric. Have you tried this approach? I am looking for this solution.
Hi @pavannarani ,
After double-checking, I found that there is currently no corresponding condition type to invoke a pipeline.
For this scenario, I'd suggest submitting an idea to add these condition types to alerts, or to add REST API support to run/cancel data pipelines.
Regards,
Xiaoxin Sheng
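On the REST side, Fabric does expose an on-demand job endpoint that can start a pipeline run, which a scheduled notebook or external script could call. The sketch below is a minimal illustration, assuming the Job Scheduler endpoint shape (`.../items/{itemId}/jobs/instances?jobType=Pipeline`) and a pre-acquired Microsoft Entra bearer token; the workspace and pipeline IDs are placeholders:

```python
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def build_run_url(workspace_id: str, pipeline_id: str) -> str:
    """Build the on-demand job URL for a pipeline item (Job Scheduler API)."""
    return (f"{FABRIC_API}/workspaces/{workspace_id}"
            f"/items/{pipeline_id}/jobs/instances?jobType=Pipeline")

def run_pipeline(workspace_id: str, pipeline_id: str, token: str) -> int:
    """POST an on-demand run request; 202 Accepted means the run was queued."""
    req = urllib.request.Request(
        build_run_url(workspace_id, pipeline_id),
        method="POST",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

The token must belong to an identity with permission to run the pipeline; how you acquire it (service principal, user delegation) is outside this sketch.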
Hi, it is indeed not possible to create an event based on a lakehouse update. Here's an idea on Microsoft Fabric Ideas that you can vote for: https://ideas.fabric.microsoft.com/ideas/idea/?ideaid=3a8a68e8-2c8a-ef11-9443-6045bdafcead.
This is another topic with the same request:
https://community.fabric.microsoft.com/t5/Data-Engineering/Event-Trigger-to-execute-pipeline-when-a-...
Someone posted a workaround that may work for you: https://community.fabric.microsoft.com/t5/Data-Warehouse/Onelake-trigger-to-Ingest-CSV/m-p/4120366
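Workarounds of this kind generally boil down to polling: run a scheduled notebook, list the Lakehouse Files section (inside Fabric this could be done with `mssparkutils.fs.ls("Files/landing")`, an assumption about the landing folder name), and compare against the set of files already processed. A hypothetical sketch of the comparison step, not the linked solution verbatim:

```python
# Polling sketch (not an official event trigger): on each scheduled run,
# diff the current file listing against the paths already processed and
# hand only the new CSVs to the pipeline.

def new_csv_files(current_listing, seen):
    """Return CSV paths present in the current listing but not yet processed.

    current_listing: iterable of file paths from the Files section.
    seen: set of paths already handled (persist between runs, e.g. in a
          small Delta table or a tracking file)."""
    return sorted(p for p in current_listing
                  if p.lower().endswith(".csv") and p not in seen)

# Example: two new CSVs arrived since the last poll.
listing = ["Files/landing/a.csv", "Files/landing/b.csv", "Files/landing/old.csv"]
already = {"Files/landing/old.csv"}
print(new_csv_files(listing, already))  # ['Files/landing/a.csv', 'Files/landing/b.csv']
```

This trades true event-driven latency for a polling interval, but it works with today's scheduling features.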
Hi @pavannarani,
You can try creating a 'workspace item events' alert ('Set alert') to trigger and run the data pipeline:
Set alerts based on Fabric events in Real-Time hub - Microsoft Fabric | Microsoft Learn
BTW, since your requirement is more related to real-time intelligence, you can also post to that subforum to get further support:
Eventstream - Microsoft Fabric Community
Regards,
Xiaoxin Sheng