Hamza_Amir
Frequent Visitor

Event-based trigger causing duplicate runs

I have created an event-based trigger that monitors a Fabric lakehouse folder for any file creation, so I can trigger a pipeline.

 

I have created a notebook that creates this file in that folder. It's a simple text file, and its name is a timestamp that I will use in the pipeline I am triggering.

 

The problem is that when I create the file, Data Activator detects the new file creation twice, which causes the pipeline to trigger twice, even though I am only creating one file in that lakehouse folder.

 

Below is an image of the live feed from the Activator view for reference: [image attachment: image.png]

 

Has anyone faced this issue and figured out how to fix it?

 

Thanks!

1 ACCEPTED SOLUTION

I figured out a workaround for this. Instead of using the file creation event, I used the file deletion event.

The deletion event doesn't cause duplicate runs, and I can still capture the information I need from the file name after deletion.

This also works as an automatic clean-up activity haha 😁
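For reference, a minimal sketch of what this create-then-delete pattern could look like in the notebook. Only fs.put appears earlier in this thread, so treat the fs.rm call, the folder, and the timestamp naming as assumptions for illustration:

# Hypothetical sketch: write the timestamp-named file, then remove it so an
# Activator rule watching file deletion fires once with the file name.
file_path = f"Files/trigger/{timestamp}.txt"  # assumed folder and naming
notebookutils.fs.put(file_path, file_content, overwrite=True)
notebookutils.fs.rm(file_path)  # deletion raises the event and cleans up the file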

 

[image attachment: image.png]

Thank you @svenchio, @tayloramy, & @v-kpoloju-msft for your valuable inputs! 🤝


10 REPLIES
v-kpoloju-msft
Community Support

Hi @Hamza_Amir,

Thank you for reaching out to the Microsoft Fabric Community Forum. Also, thanks to @svenchio and @tayloramy for their inputs on this thread.

Thank you for your patience and for taking the time to validate the notebook logic and share the exact code; that really helps narrow things down. Based on everything you have tested, your implementation looks correct: you are not doing anything wrong in the notebook or in the Activator setup.

What you are observing is related to how Microsoft Fabric currently raises OneLake events. Internally, a single logical file write can go through multiple commit steps, and during that process the Microsoft.Fabric.OneLake.FileCreated event can be emitted more than once with identical event details. When this happens, Data Activator detects both events and triggers the pipeline twice, even though only one file was created from the user's perspective.

Since this behaviour originates at the platform event level, the most reliable way to handle it is to make the downstream pipeline resilient to duplicate triggers. Common approaches include adding an idempotency check (for example, tracking processed file names/timestamps), validating file stability before processing, or introducing a short delay before execution. These patterns ensure that even if the event fires twice, the file is processed only once.
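For example, here is a minimal sketch of such an idempotency check inside the triggered notebook. It assumes the pipeline passes the triggering file name in as trigger_file_name, that notebookutils.fs.put with overwrite=False fails when the target already exists, and that notebookutils.notebook.exit is available to stop the run; treat all three as assumptions for illustration:

# Hypothetical guard: atomically claim a per-file marker before processing.
marker_path = f"Files/processed/{trigger_file_name}.done"  # assumed marker folder

try:
    # With overwrite=False, only the first run should succeed in creating the
    # marker; a duplicate run finds it already there and raises instead.
    notebookutils.fs.put(marker_path, "claimed", overwrite=False)
except Exception:
    notebookutils.notebook.exit("Duplicate trigger: file already claimed")

# ...safe to process the trigger file exactly once below...

Because the claim is a single create-if-absent write rather than a read-then-check on a log table, it should hold up even when the duplicate triggers fire at the same moment.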

We understand this can be confusing when you expect a one-to-one mapping between file creation and trigger execution, and we appreciate you raising this scenario. Please let us know which approach fits your use case best, and we’ll be happy to help further.

Hope that clarifies. Let us know if you have any doubts regarding this. We will be happy to help.

Thank you for using the Microsoft Fabric Community Forum.


Hi @v-kpoloju-msft, thank you for your reply! 😀
I did try implementing a log table that would check whether the pipeline is already running and skip it if so. It didn't work because the trigger fires the duplicate runs simultaneously, so a log table can't help in that scenario.

 

However, I have just figured out a workaround: using the file deleted event instead. That event doesn't cause duplicate runs! 👍

tayloramy
Community Champion

Hi @Hamza_Amir

 

This is fun. I have to agree with @svenchio that somehow the file must be getting created and then modified. I wonder if notebookutils first creates an empty file and then writes to it immediately after?

 

What I would do here is use an Eventstream to transform the file event data, then shove it into a KQL database, and have Activator sit on top of that.

 

You can use a Session Window to help pluck out duplicate events and ensure that only unique events end up in the KQL Database. 
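Conceptually, the session window keeps only the first event per subject when duplicates arrive within a short gap. A rough Python sketch of that dedup logic, purely for illustration (the real thing is configured in the Eventstream itself, and the event field names below are assumptions):

from datetime import timedelta

def dedupe(events, gap_seconds=30):
    # events: list of dicts like {"subject": str, "time": datetime} (assumed shape)
    last_seen = {}
    unique = []
    for ev in sorted(events, key=lambda e: e["time"]):
        prev = last_seen.get(ev["subject"])
        if prev is None or ev["time"] - prev > timedelta(seconds=gap_seconds):
            unique.append(ev)  # first event of its session window survives
        last_seen[ev["subject"]] = ev["time"]  # extend the session
    return unique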

 

If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.
svenchio
Solution Sage

Hi @Hamza_Amir, thinking about your description, I would like to offer a hypothesis of what may be happening. Since the Microsoft.Fabric.OneLake.FileCreated event is raised for both file creation and updates, I'm thinking that if your notebook writes the text file and then somehow modifies it (e.g., renames, appends, or sets metadata), you'll get two events in quick succession.

 

[image attachment: svenchio_0-1767093603489.png]

Ref. Explore OneLake events in Fabric Real-Time hub - Microsoft Fabric | Microsoft Learn 

 

Before exploring possible solutions, I think it would be worth proving or discarding this hypothesis. Can you check whether your notebook is only creating the file, or creating and then updating it? If it's the latter, then you know why two events are getting fired... let us know the outcome and we can move from there 😉 ... all the best!

 

If you find this info useful, a thumbs up would be nice, and if I'm right, mark this as the solution/explanation.

 


Hi, thank you for your reply! I tested your hypothesis and double-checked the notebook, but all I am doing is a single write command. Here is the actual code in the cell where I create the test file:

# Build the file body from values defined earlier in the notebook
file_content = (
    f"brand_id={brand_id}\n"
    f"runtime={runtime}\n"
    f"pipeline_run_id={pipeline_run_id}\n"
)

# Write the file once; overwrite=False means the call fails if the file exists
notebookutils.fs.put(file_path, file_content, overwrite=False)

 

Hi @Hamza_Amir, ok, please try overwrite=True. Even though you're creating a new file, forcing overwrite=True can suppress any immediate follow-up "update", so check whether the double event still gets triggered. Let us know the outcome.

Hey @svenchio! I made this change in the code:

file_content = (
    f"brand_id={brand_id}\n"
    f"runtime={runtime}\n"
    f"pipeline_run_id={pipeline_run_id}\n"
)

# Same write as before, but now forcing an overwrite
notebookutils.fs.put(file_path, file_content, overwrite=True)

 Still getting duplicate runs 😔

tayloramy
Community Champion

Hi @Hamza_Amir

 

Are both records entirely duplicates? Are they both the same event type and status? 

 

Can you show us the Activator setup for the file?

 

 

If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.

Hi, thank you for replying!

 

Yes, all the column values in the live feed area of the alert are exactly the same, even the subject string. Here is an image of my Activator setup; hope this gives you a better picture of the situation:

[image attachment: image.png]

 
