Is it possible to trigger a DataPipeline whenever a specific table in the Lakehouse has new rows or when the data within the table changes?
I tried setting up a DataPipeline with a trigger, selecting Microsoft.Fabric.JobEvents.ItemJobSucceeded, and configuring the table name as itemName. I then ran a Dataflow to update the Lakehouse table's state, but the DataPipeline didn't run as expected.
Has anyone successfully set up such a trigger, or is there a different approach I should consider?
Hi @mh20221111,
Thank you @Shahid12523, for your insights.
At present, Fabric does not allow you to trigger a DataPipeline directly based on changes to Lakehouse table data, since tables do not generate events. Triggers are currently supported only for file events, job events, and workspace events. This is why your pipeline did not run when the table was updated. For more information, please refer to the links below:
Data pipelines event triggers in Data Factory - Microsoft Fabric | Microsoft Learn
Solved: Re: Is it possible to run an event-based trigger b... - Microsoft Fabric Community
Thank you.
Hi @mh20221111,
Have you had a chance to review the solution we shared earlier? If the issue persists, feel free to reply so we can help further.
Thank you.
Hi @v-saisrao-msft ,
Are there any plans to add the completion of Gen2 dataflow refreshes (beyond CI/CD and semantic model updates) to workspace events in the future?
Hi @mh20221111,
Currently, Fabric does not allow pipelines to be triggered by the completion of Gen2 dataflows, except through CI/CD or semantic model updates. If this functionality is important for your workflow, you can suggest it on the Microsoft Ideas Forum. If your idea receives enough votes, Microsoft may consider adding it in a future release.
Fabric Ideas - Microsoft Fabric Community
For more information on the currently supported event triggers, you can also refer to the official documentation:
Data pipelines event triggers in Data Factory - Microsoft Fabric | Microsoft Learn
Thank you.
Currently, you can’t trigger a DataPipeline directly on Lakehouse table data changes because those events aren’t emitted.
Instead, you can:
- Trigger on file updates feeding the Lakehouse (using storage event triggers).
- Use scheduled pipelines that detect changes via timestamps or delta logic.
- Or trigger pipelines manually after updates via Power Automate or API calls.
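The scheduled/timestamp approach above can be sketched in plain Python. In a real Fabric pipeline you would run something like `SELECT MAX(modified_at) FROM my_table` via a Lookup activity and compare it to a watermark stored from the last run; here the table, the `modified_at` column, and the sample rows are all hypothetical, just to show the comparison logic:

```python
from datetime import datetime, timezone

def latest_modified(rows):
    """Max of the rows' modified timestamps, or None for an empty table."""
    return max((r["modified_at"] for r in rows), default=None)

def should_trigger(rows, watermark):
    """True when any row is newer than the watermark saved after the last run."""
    newest = latest_modified(rows)
    return newest is not None and newest > watermark

# Hypothetical data: the last pipeline run recorded a 12:00 UTC watermark,
# and one row has arrived since then.
watermark = datetime(2025, 1, 1, 12, 0, tzinfo=timezone.utc)
rows = [
    {"id": 1, "modified_at": datetime(2025, 1, 1, 11, 0, tzinfo=timezone.utc)},
    {"id": 2, "modified_at": datetime(2025, 1, 1, 13, 0, tzinfo=timezone.utc)},
]
print(should_trigger(rows, watermark))  # True: row 2 is newer than the watermark
```

After the pipeline runs, you would persist the new maximum timestamp as the next watermark, so each scheduled run only fires when something has actually changed.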
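For the "API calls" option, the Fabric Job Scheduler exposes a run-on-demand endpoint you can call after your load completes. This is a minimal sketch, assuming the endpoint shape of the "Run On Demand Item Job" API with `jobType=Pipeline`; the workspace/pipeline IDs and the token acquisition are left to the caller, and you should verify the URL against the current Fabric REST docs:

```python
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def run_pipeline_url(workspace_id: str, pipeline_id: str) -> str:
    """URL of the Job Scheduler run-on-demand endpoint for a pipeline item."""
    return (f"{FABRIC_API}/workspaces/{workspace_id}"
            f"/items/{pipeline_id}/jobs/instances?jobType=Pipeline")

def run_pipeline(workspace_id: str, pipeline_id: str, token: str):
    """POST the run request with a Microsoft Entra bearer token.

    Fabric responds 202 Accepted and reports the job instance location
    in the response headers.
    """
    req = urllib.request.Request(
        run_pipeline_url(workspace_id, pipeline_id),
        method="POST",
        headers={"Authorization": f"Bearer {token}"},
    )
    return urllib.request.urlopen(req)
```

The same POST can be issued from a Power Automate HTTP action, which lets you chain it after a Dataflow Gen2 refresh step.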