
adigkarth
Frequent Visitor

Fabric trigger invoking twice when there is an incoming file

Hi,

 

Our requirement is to invoke a pipeline whenever a blob is created in an ADLS folder. The JSON file is written from Databricks using the dbutils.fs.put API. We created a Reflex item, chose BlobCreated as the event, and in the filter we specified the folder where the JSON file arrives.

[Screenshot: Reflex event and filter configuration]

 

Now the problem we are facing is: when the Databricks notebook writes the file into ADLS, Reflex detects the same JSON file twice and triggers the target pipeline twice with the same file. We tried uploading the same file through the ADLS portal into the same folder, and the trigger was invoked only once. Could you please let us know how to resolve this?

 

 

1 ACCEPTED SOLUTION
mikeburek
Advocate II

I haven't set this up in Fabric pipelines yet, but I had a similar issue in a normal Azure Data Factory pipeline.

The issue was with a notebook writing a parquet file.

 

What happened is that the notebook (Scala, if I recall) would first create an empty file with one API call, then flush the rest of the data to the file with a different API call. To the Azure Data Factory pipeline trigger, these were technically two separate calls writing the file; the first one was just empty and the second one had the data.

 

In the Azure Data Factory pipeline trigger, there is an option for "Ignore empty file". I don't know whether the same option exists in the Fabric pipeline trigger, but that is what I'd look for first.
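If no such option is exposed, the same idea can be applied downstream: inspect the event payload and skip zero-byte blobs. A minimal sketch, assuming Storage Blob Created events shaped like the Azure Event Grid schema (the `contentLength` and `api` fields exist in that schema; the sample events here are made up for illustration):

```python
# Hypothetical helper illustrating the "ignore empty file" idea:
# skip Blob Created events whose payload reports zero bytes written.

def is_empty_blob_event(event: dict) -> bool:
    """Return True when the event describes a zero-byte blob."""
    return event.get("data", {}).get("contentLength", 0) == 0

# Illustrative events for the two calls described above.
events = [
    # First call: the file is created empty.
    {"eventType": "Microsoft.Storage.BlobCreated",
     "data": {"api": "CreateFile", "contentLength": 0}},
    # Second call: the data is flushed to the same file.
    {"eventType": "Microsoft.Storage.BlobCreated",
     "data": {"api": "FlushWithClose", "contentLength": 2048}},
]

non_empty = [e for e in events if not is_empty_blob_event(e)]
print(len(non_empty))  # → 1, only the flush event survives
```

With this check in place, the pipeline body would only run for the second event, which carries the actual data.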

 

There was also another situation where the file was supposed to stay empty, because it was just a signaling file. In that case, I had to look at the event body of the event that fired the trigger, look for the specific flush API call, and ignore the open-file API call.
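The event-body approach above can be sketched as a small filter keyed on the `api` field of the event data. `CreateFile` and `FlushWithClose` are ADLS Gen2 operation names reported in the Blob Created event schema; the exact set of calls worth triggering on is an assumption to verify against your own event payloads:

```python
# Hypothetical filter: trigger only on calls that complete a write,
# and ignore the initial open/create call.
TRIGGERING_APIS = {"FlushWithClose", "PutBlob", "PutBlockList"}

def should_trigger(event: dict) -> bool:
    """Return True if the event's api field indicates a completed write."""
    return event.get("data", {}).get("api") in TRIGGERING_APIS

# Illustrative pair of events for one dbutils.fs.put-style write.
events = [
    {"data": {"api": "CreateFile"}},      # open/create: ignore
    {"data": {"api": "FlushWithClose"}},  # flush: run the pipeline
]
print([should_trigger(e) for e in events])  # → [False, True]
```

This variant works even for intentionally empty signaling files, since it keys on which API call was made rather than on the blob's size.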


