We are trying to run a Fabric workflow using Tidal Scheduler, or possibly have Fabric run it itself once a file is sent to a specific directory. Basically, we need to automate using a file dependency.
@wwwricdep - Are you looking to kick off the newly released Fabric Workflow functionality, or when you say Fabric Workflow are you referring to Fabric Data Pipelines?
If it is the Data Pipeline side, the Blob Triggers functionality was released, so you can drop a "trigger" file into Blob Storage and let it kick off your pipeline. I'm using DAGs in the Notebooks called by my pipeline to run all my various notebooks to process the data; a sketch of what that looks like is below.
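For anyone new to the DAG approach, here is a minimal sketch of running dependent notebooks from a Fabric notebook with mssparkutils.notebook.runMultiple. The notebook names and the runDate parameter are hypothetical placeholders:

```python
# mssparkutils is available by default in Fabric notebooks (no import needed).
# NotebookA and NotebookB are hypothetical notebooks in the same workspace.
DAG = {
    "activities": [
        {
            "name": "NotebookA",                # activity name, unique within the DAG
            "path": "NotebookA",                # notebook to run
            "timeoutPerCellInSeconds": 120,     # max timeout for each cell
            "args": {"runDate": "2025-01-01"},  # hypothetical notebook parameter
        },
        {
            "name": "NotebookB",
            "path": "NotebookB",
            "dependencies": ["NotebookA"],      # runs only after NotebookA succeeds
        },
    ],
}

mssparkutils.notebook.runMultiple(DAG)
```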
We must admit we are fairly new to the Fabric world and a lot of these ideas are foreign to us. What we really want to do is have Fabric begin processing a pipeline upon the arrival of a text file that we get from our stock trading system. The file is created on-prem, and then we need to send it to Fabric for Dataflow Gen2 processing. Our problem is getting the file over and triggering the pipeline. As a workaround, we use the Copy Data activity to get the file from on-prem to the Fabric lakehouse, and then allow enough time for Fabric's scheduler to run the pipeline and ingest the text file from the lakehouse. We would have loved to be able to call the pipeline from our on-prem scheduler, or failing that, to use a trigger to start the pipeline once the trade text file was present.
Hi @wwwricdep ,
I understand your desire to automate the pipeline execution upon receiving the stock trading text file. Fabric Data Pipelines doesn't currently support direct triggering from on-premises schedulers; at present, only Azure Blob Storage event triggers are supported.
The only workaround is to rely on Fabric's scheduler to pick the file up later.
Docs to refer to:
Data pipelines storage event triggers in Data Factory (Preview) - Microsoft Fabric | Microsoft Learn
However, your suggestion is definitely valuable! We use customer feedback like yours to prioritize future features; the more users who request this capability, the higher it moves on our list.
We would appreciate it if you could share the feedback on Microsoft Fabric Ideas, where it would be open for the user community to upvote and comment on. This allows our product teams to effectively prioritize your request against our existing feature backlog and gives insight into the potential impact of implementing the suggested feature.
I hope this information helps. If you have any further queries, please do let us know.
Thanks for the honest reply. Given that we cannot trigger from an on-prem file, the other question would be triggering the pipeline from our on-prem scheduler. When we had SSIS on-prem, we were able to compile the code and call the VBS from Tidal, which facilitated triggering the process when a file came in. We were hoping that calling the pipeline would work the same way. Not sure if you have opened up the API call from on-prem schedulers yet?
@wwwricdep - Another thing you could do is use Tidal to push the on-prem file to Azure Blob Storage, as it looks like Tidal has connectors for that. Then set up the Fabric pipeline trigger based on the file getting created in Azure Blob Storage. Then it can dip into your on-prem environment to do the rest of the work.
It does look like the API call that MSFT referenced in this post would/does work for pipelines. I found another article with an example of how to use it. I'm guessing Tidal would make it easier to make that API call as well.
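If it helps, here is a minimal sketch of the upload step that Tidal (or any on-prem script) could run, using the azure-storage-blob package. The connection string, container name, and file path are placeholders you would replace with your own:

```python
# pip install azure-storage-blob
from azure.storage.blob import BlobServiceClient

# Placeholder values -- substitute your storage account, container, and export path.
CONNECTION_STRING = "<storage-account-connection-string>"
CONTAINER = "trade-files"
LOCAL_FILE = r"C:\exports\trades.txt"

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
blob = service.get_blob_client(container=CONTAINER, blob="trades.txt")

# Creating the blob is what fires the Fabric storage event trigger.
with open(LOCAL_FILE, "rb") as data:
    blob.upload_blob(data, overwrite=True)
```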
Hi @wwwricdep ,
Glad to know that the query got resolved. Please continue using Fabric Community for your further queries.
Hi @wwwricdep ,
Did you get a chance to look into this?
Job Scheduler - Run On Demand Item Job - REST API (Core) | Microsoft Learn
You could give this Job Scheduler API a try; a rough sketch of the call is below.
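As a rough sketch (not an official sample), the call can be made from an on-prem scheduler using a service principal. All of the IDs below are placeholders, and jobType=Pipeline is the value used for data pipelines per the linked doc:

```python
# pip install requests azure-identity
import requests
from azure.identity import ClientSecretCredential

# Placeholder IDs -- substitute your tenant, app registration, workspace, and pipeline.
TENANT_ID = "<tenant-id>"
CLIENT_ID = "<app-client-id>"
CLIENT_SECRET = "<app-client-secret>"
WORKSPACE_ID = "<workspace-id>"
PIPELINE_ID = "<pipeline-item-id>"

# Acquire a token for the Fabric REST API.
credential = ClientSecretCredential(TENANT_ID, CLIENT_ID, CLIENT_SECRET)
token = credential.get_token("https://api.fabric.microsoft.com/.default").token

# Run On Demand Item Job: POST .../items/{itemId}/jobs/instances?jobType=Pipeline
url = (
    f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}"
    f"/items/{PIPELINE_ID}/jobs/instances?jobType=Pipeline"
)
response = requests.post(url, headers={"Authorization": f"Bearer {token}"})
response.raise_for_status()  # expect 202 Accepted
print(response.headers.get("Location"))  # URL to poll for the job instance status
```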
Hi @wwwricdep ,
Thanks for using Fabric Community.
At this time, we are reaching out to the internal team to get some help on this.
We will update you once we hear back from them.
Hi @wwwricdep ,
At present, Fabric Data Workflows has no built-in event-based trigger option.
We would appreciate it if you could share the feedback on our feedback channel, where it would be open for the user community to upvote and comment on. This allows our product teams to effectively prioritize your request against our existing feature backlog and gives insight into the potential impact of implementing the suggested feature.
Docs to refer to:
Introducing Data workflows in Microsoft Fabric | Microsoft Fabric Blog | Microsoft Fabric
What are Data Workflows? - Microsoft Fabric | Microsoft Learn
If you are looking for a pipeline event-based trigger, you can refer here - Data pipelines storage event triggers in Data Factory (Preview) - Microsoft Fabric | Microsoft Learn
Hope this helps. Please let me know if you have any further queries.
Hi @wwwricdep ,
We haven't heard from you since the last response and were just checking back to see if your query was answered. Otherwise, we will respond with more details and try to help.
Thanks