Hi,
I'm working on a simple setup in Microsoft Fabric:
I have a Copy Data pipeline that moves files from DataLake A to DataLake B.
I configured an event trigger that listens for file creation in a folder under DataLake A.
The expectation is: when a new file is created, the pipeline runs once to copy that file.
However, what's happening is:
When I create just one file, the pipeline gets triggered multiple times (4+ executions observed).
There’s no loop in my logic, and the destination is in a completely different DataLake/folder.
I’ve confirmed that the trigger is scoped only to the source folder in DataLake A.
Is this a known issue with event triggers in Fabric?
Or could there be something wrong with my configuration?
Would really appreciate your help in confirming whether this is expected behavior or an issue on my side.
Hi @MuhammedLabeeb,
Since we haven't heard back from you, we wanted to kindly follow up and check whether the issue has been resolved. Please let us know if you need any further assistance here.
Thanks,
Prashanth Are
MS Fabric community support
Hi @MuhammedLabeeb,
A single file creation triggering multiple pipeline runs is a known behavior in Microsoft Fabric and Azure Data Factory (ADF) when using storage event triggers. It comes down to how Azure Blob Storage and Event Grid handle file-creation events.
When a file is uploaded to Azure Data Lake Storage Gen2, especially using tools like Azure Storage Explorer, AzCopy, or programmatic methods, the upload is not a single operation: it typically involves a CreateFile request, one or more AppendData requests, and a final FlushWithClose, and Event Grid can raise a BlobCreated event for more than one of these operations. Please refer to this doc: https://learn.microsoft.com/en-us/azure/data-factory/how-to-create-event-trigger?tabs=data-factory
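To illustrate, here is a minimal sketch using the azure-storage-file-datalake Python SDK (the account URL, container name, and file path are placeholders, not your actual setup) showing that one logical upload is really several REST operations, more than one of which can surface as a BlobCreated event:

```python
# Minimal sketch: a single "file upload" to ADLS Gen2 is several REST calls.
# Assumes the azure-identity and azure-storage-file-datalake packages;
# account, container, and path names below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://<account>.dfs.core.windows.net",  # placeholder
    credential=DefaultAzureCredential(),
)
fs = service.get_file_system_client("datalake-a")            # placeholder container
file = fs.get_file_client("FolderA/sample.csv")              # placeholder path

data = b"id,value\n1,42\n"

file.create_file()                                  # step 1: CreateFile -> can raise BlobCreated
file.append_data(data, offset=0, length=len(data))  # step 2: AppendData -> no event
file.flush_data(len(data), close=True)              # step 3: FlushWithClose -> raises BlobCreated
```

Tools like AzCopy perform these same steps internally (plus chunking and retries for large files), which is why a single file can produce several events and, in turn, several trigger firings.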
Thanks,
Prashanth Are
MS Fabric community support
Thank you for the update.
Just to clarify, we have only one pipeline configured, but it appears to be getting triggered multiple times for a single file upload.
Additionally, the event-based trigger is scoped specifically to folder A inside the Lakehouse. However, we've observed that the pipeline is also triggered when files are uploaded to other folders such as B, C, or even the root folder.
We would appreciate it if you could kindly confirm whether this is expected behavior. If not, could you please advise if there are any additional settings required to ensure the trigger responds only to events within folder A?
Hi @MuhammedLabeeb,
Specifying a blob path filter with just a file extension near the container root can cause an unexpected spike in triggered pipeline runs. Make sure your trigger definition is precise and not too broad: include the source folder in the "blob path begins with" setting rather than filtering on the extension alone, otherwise events from other folders will also match.
Can you please refer to the doc below on event-driven architecture through storage event triggers and let me know whether it helps resolve your issue?
microsofteur.sharepoint.com/teams/MissionCriticalReadiness/Shared Documents/Forms/AllItems.aspx?id=%...
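To make the scoping point concrete, here is a small self-contained sketch (plain Python, no Azure dependencies; the event subjects and filter values are illustrative, not taken from your environment) of how a begins-with / ends-with path filter is evaluated:

```python
# Sketch of how a storage event trigger's "begins with" / "ends with"
# filter matches event subjects. Subjects follow Event Grid's blob event
# format; the container and folder names here are hypothetical.

def matches(subject: str, begins_with: str = "", ends_with: str = "") -> bool:
    """Return True if an event subject passes the trigger's path filter."""
    return subject.startswith(begins_with) and subject.endswith(ends_with)

events = [
    "/blobServices/default/containers/lakehouse/blobs/FolderA/sales.csv",
    "/blobServices/default/containers/lakehouse/blobs/FolderB/sales.csv",
    "/blobServices/default/containers/lakehouse/blobs/root.csv",
]

# Too broad: only an extension is specified, so files in every folder match.
print([matches(e, ends_with=".csv") for e in events])   # [True, True, True]

# Precise: pin the filter to the source folder as well.
prefix = "/blobServices/default/containers/lakehouse/blobs/FolderA/"
print([matches(e, begins_with=prefix, ends_with=".csv") for e in events])
# [True, False, False]
```

This matches the behavior you described: with only an extension-style filter, uploads to FolderB, FolderC, or the root all pass the filter and fire the trigger.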
Thanks,
Prashanth Are