Hello,
First, a little background on what I am trying to accomplish. We created a Workspace to land all of our SAP data (tables, CDS views, extractors), organized in the bronze/silver/gold medallion architecture. The SAP data is extracted using a 3rd-party tool and lands in the Files section of the bronze lakehouse under the subfolder "in", with each SAP table getting its own folder, e.g., Files/in/LFA1.
I have a pipeline that finds all child folders of "in" and passes each folder name as a parameter to a notebook, which handles the merge into our silver lakehouse as Delta tables.
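For context, here is a minimal sketch of the kind of merge the notebook performs. The key column, default table name, file format, and silver table naming below are placeholders, not our actual SAP schema:

```python
# Minimal sketch of the notebook the pipeline calls for each folder.
# Placeholders (not the real setup): the key column, the default table name,
# the parquet format, and the silver table naming. The pipeline overrides
# table_name via a notebook parameter; "spark" is the built-in Fabric session.
from delta.tables import DeltaTable

table_name = "LFA1"     # parameter cell value, overridden by the pipeline
key_col = "LIFNR"       # hypothetical primary-key column for the merge

bronze_path = f"Files/in/{table_name}"         # bronze lakehouse Files area
silver_table = f"silver.{table_name.lower()}"  # hypothetical silver table name

df = spark.read.parquet(bronze_path)           # assumes parquet landed by the extractor

if spark.catalog.tableExists(silver_table):
    target = DeltaTable.forName(spark, silver_table)
    (target.alias("t")
           .merge(df.alias("s"), f"t.{key_col} = s.{key_col}")
           .whenMatchedUpdateAll()
           .whenNotMatchedInsertAll()
           .execute())
else:
    df.write.format("delta").saveAsTable(silver_table)
```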
I would like to have this process triggered by an event rather than scheduling the pipeline. I created a Fabric Workspace Item Event alert as shown below, but when I click Create I get the error message "Request failed with status code 500."
Did I set up the alert incorrectly, or is there an issue with Data Activator? Any suggestions on the configuration of the alert, or on our design in general, would be greatly appreciated.
Hi @rjb2232 ,
Error code 500 generally indicates an internal error on the service side, not a mistake in your configuration.
I noticed that someone else has already reported this via an ICM, and our engineers have confirmed that it is a known issue. They are doing their best to resolve it; please be patient, and I will let you know as soon as there is any progress.
Best Regards,
Dino Tao
Hi @v-junyant-msft,
Do you have any news on this internal error? I seem to be getting the same error when trying to add Fabric Workspace Item Events as a source to my Reflex.
Thanks in advance,
BR,
-Christian
Thanks @v-junyant-msft, glad to hear that it's a known issue and is being worked on.
Once this is resolved, do you think my setup will accomplish what I am trying to do? Will event type Microsoft.Fabric.ItemCreateSucceeded with the filter data.itemKind StringContains Folder trigger the alert when a new folder is added to Files/in in my lakehouse? Also, is there a way to scope this alert to a specific lakehouse, or will it trigger whenever a folder is created anywhere in the workspace?
Thanks
Hi @rjb2232 ,
The "Item" in "Microsoft.Fabric.ItemCreateSucceeded" does not refer to a physical item such as a folder, but rather to a workspace item (artifact) in Fabric. The event fires when you create a new artifact in the workspace, such as a lakehouse, notebook, or pipeline; it is not triggered by adding a new folder to the data source (i.e., the lakehouse).
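For illustration only, such an event looks roughly like the following. Only eventType and data.itemKind come from your filter above; the other field names are assumptions, not the documented schema. It shows why a subfolder under Files/in can never match:

```python
# Hypothetical illustration of a workspace item event, NOT the documented schema.
# data.itemKind reflects the Fabric artifact type (Lakehouse, Notebook, ...),
# so it never contains "Folder" for subfolders created under Files/in.
example_event = {
    "eventType": "Microsoft.Fabric.ItemCreateSucceeded",
    "data": {
        "itemKind": "Lakehouse",           # artifact type, not a filesystem folder
        "itemName": "bronze_lakehouse",    # assumed field, for illustration
        "workspaceId": "<workspace-guid>"  # assumed field, for illustration
    },
}
```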
Best Regards,
Dino Tao
Thanks for the clarification. Do you have any recommendation on how I could trigger a pipeline based on a new folder being added to the Files section of a lakehouse?
Hi @rjb2232 ,
If you can get the total number of folders in your specified lakehouse via a measure (for example, if you have a dataset dedicated to dynamically logging all the folders), you can create a Reflex on that measure, which will be triggered whenever the measure changes. If you can't implement this approach, the following link may help; note that this method requires the use of Azure Data Factory:
Azure Data Factory: Storage event trigger only on new files - Stack Overflow
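As a rough sketch of the first approach (all names below are illustrative, not a prescribed implementation): a scheduled notebook logs the current folder count under Files/in to a small Delta table, a measure is built over that table, and a Reflex on the measure fires whenever the count changes.

```python
# Illustrative sketch only: log the folder count so a measure/Reflex can watch it.
# "notebookutils" is pre-loaded in Fabric notebooks (mssparkutils is the older
# alias); "spark" is the built-in session. The table name is a placeholder.
from datetime import datetime, timezone

folders = notebookutils.fs.ls("Files/in")          # one child folder per SAP table
folder_count = sum(1 for f in folders if f.isDir)  # count only directories

log_df = spark.createDataFrame(
    [(datetime.now(timezone.utc).isoformat(), folder_count)],
    ["logged_at_utc", "folder_count"],
)
log_df.write.format("delta").mode("append").saveAsTable("folder_watch_log")
```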
Best Regards,
Dino Tao
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
I'm having the same issue with Azure Blob Storage events. I also tried the Fabric workspace item events source and got the 500 error there too. Is this an issue affecting the creation of all event triggers/alerts?