Hi, hopefully somebody could advise me on the following. For a manual process we export a CSV file and place it in a folder in our OneLake directory in Explorer. We have created a pipeline in Fabric that appends the data from the CSV to a table in OneLake. My question is whether it's possible to trigger the pipeline when the CSV is placed in the folder?
Many thanks in advance,
Michiel Soede
As a solution I created a dataflow in Fabric to append the data to a table in OneLake, and I trigger it via Power Automate with the "when a file is created" trigger on a specific SharePoint folder.
Nice, interesting solution!
If I understand correctly:
The file is first uploaded to SharePoint, which triggers Power Automate; Power Automate then triggers the Dataflow Gen2, and the Dataflow Gen2 connects to the file in SharePoint and writes its content to a Lakehouse table in Fabric.
SharePoint -> DFg2 -> Lakehouse Table
And the process is orchestrated by Power Automate.
How do you trigger a Dataflow Gen2 from Power Automate? (Which connector?)
The "Refresh a dataflow" action.
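For reference, the same refresh can also be requested outside Power Automate through the Power BI REST API's dataflow refresh endpoint. Below is a minimal Python sketch; the workspace ID, dataflow ID, and token acquisition are placeholders, and whether this endpoint covers a given Dataflow Gen2 may depend on your setup.

```python
# Minimal sketch: request a dataflow refresh over REST instead of the
# Power Automate "Refresh a dataflow" action.
import requests

ACCESS_TOKEN = "<bearer token with Dataflow.ReadWrite.All>"  # placeholder: acquire via MSAL or similar
WORKSPACE_ID = "<workspace guid>"                            # placeholder
DATAFLOW_ID = "<dataflow guid>"                              # placeholder

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/dataflows/{DATAFLOW_ID}/refreshes"
)
resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"notifyOption": "NoNotification"},  # suppress refresh-failure e-mails
)
resp.raise_for_status()
print("Refresh requested:", resp.status_code)
```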
Hi, thanks, I bumped into those triggers for blob storage events but was not sure if they could be used in the scenario where we drop CSV files in a specific location.
Should you then first drop the csv file in a blob container before such a trigger can be used?
thanks
Well, the trigger will react to events in the storage account/OneLake, so you can set up the trigger first and then copy/move/create CSV files in the relevant location. The trigger should fire when the file arrives.
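In case it helps, here is a rough Python sketch of dropping a CSV into an ADLS Gen2 container that such a trigger is watching. The storage account name, container, and file paths are just placeholders, and DefaultAzureCredential assumes you are signed in locally or running with a managed identity.

```python
# Minimal sketch: land a CSV in an ADLS Gen2 container so a storage event
# trigger listening for blob-created events can fire.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://<storageaccount>.dfs.core.windows.net",  # placeholder account
    credential=DefaultAzureCredential(),
)
fs = service.get_file_system_client("incoming")            # container the trigger watches (placeholder)
file_client = fs.get_file_client("exports/sales_2024.csv")  # placeholder target path

with open("sales_2024.csv", "rb") as f:                     # placeholder local file
    file_client.upload_data(f, overwrite=True)               # arrival raises the blob-created event
```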
Hi @AndyDDC
My understanding was that this event trigger is only for Azure Blob Storage and not OneLake.
Am I missing anything here? Has support for event triggers on the OneLake Files section been enabled?
Ah no... OneLake isn't supported yet, that's annoying. So only Azure Data Lake Gen2 accounts are supported.
Okay thanks for the information
Hi @Msoede
As the feature is not currently available, you could suggest a new idea in the Fabric Ideas forum to inform the product team of your requirement. As of now I don't see any similar ideas there.
Best Regards,
Jing
Hi @Msoede, you should be able to use event triggers in your pipeline to do this. Please see this video, which hopefully fits what you need: Microsoft Fabric: Blob Storage Event Triggers in Data Factory Pipelines (youtube.com)