Hi Team,
Could you please confirm if it is possible to trigger a Fabric Pipeline using Azure Logic Apps? If so, could you kindly provide detailed documentation or outline the relevant prerequisites required to fulfil this requirement?
Your assistance would be greatly appreciated.
Hi @Thyagarajulu99 ,
I think you can do the steps below:
1. Start by creating a new Logic App in the Azure portal. You can choose a trigger that suits your scenario, such as an HTTP request, a schedule, or an event.
2. In the Logic App, add an action to call the Fabric Pipeline. This can be done using the HTTP action to make a REST API call to the Fabric Pipeline endpoint.
3. Ensure your Fabric Pipeline is set up to accept triggers from external sources. You might need to configure authentication and permissions to allow the Logic App to trigger the pipeline.
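The steps above can be sketched roughly as follows. This is illustrative only: the workspace ID, item ID, and token are placeholders, and the endpoint shown is the Fabric "Run On Demand Item Job" REST API (POST .../jobs/instances?jobType=Pipeline), which is what the HTTP action in the Logic App would call.

```python
# Illustrative sketch: IDs and token below are placeholders, not values
# from this thread. Uses only the standard library.
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def build_run_pipeline_request(workspace_id: str, item_id: str,
                               token: str) -> urllib.request.Request:
    # Fabric "Run On Demand Item Job" endpoint:
    # POST /workspaces/{workspaceId}/items/{itemId}/jobs/instances?jobType=Pipeline
    url = (f"{FABRIC_API}/workspaces/{workspace_id}"
           f"/items/{item_id}/jobs/instances?jobType=Pipeline")
    return urllib.request.Request(
        url,
        method="POST",
        headers={"Authorization": f"Bearer {token}"},
        data=b"",  # the request needs no body
    )
```

In a Logic App you would configure the same URL, method, and Authorization header on the built-in HTTP action rather than writing code.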
I didn't find an official document that specifically talks about this process, but I think you can check out this topic: Accessing Fabric Lakehouse via Logic App - Microsoft Q&A
Best Regards
Yilong Zhou
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
@Thyagarajulu99 - It is possible to trigger a Fabric Pipeline via Azure Logic Apps or Power Automate, but there are a few caveats. We went through this ourselves just last week, with lukewarm results.
The current "Run On Demand" API endpoint does not support a Service Principal. So, to call it from either of the two solutions, you have to use a username/password together with a client ID to generate the bearer token, and then use that token to call the API. You can set up a Client Credential Grant in Logic Apps/Power Automate, but you have to use the username/password variant, which means either storing the username/password in the app's configuration or setting up a connection to Key Vault to retrieve it securely.
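To make the username/password flow concrete, here is a minimal sketch of the token request it implies, a resource owner password (ROPC) grant against the Entra ID v2.0 token endpoint. The tenant ID, client ID, and credentials are placeholders; as noted above, the real secret should come from Key Vault rather than configuration.

```python
# Illustrative only: all identifiers and credentials are placeholders.
import urllib.parse

def build_token_request(tenant_id: str, client_id: str,
                        username: str, password: str):
    # Entra ID v2.0 token endpoint for the tenant.
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    # Form-encoded body for the password grant, scoped to the Fabric API.
    body = urllib.parse.urlencode({
        "grant_type": "password",
        "client_id": client_id,
        "username": username,
        "password": password,
        "scope": "https://api.fabric.microsoft.com/.default",
    })
    return url, body
```

The `access_token` in the JSON response is what goes into the `Authorization: Bearer ...` header of the pipeline call.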
In Power Automate there is also an "HTTP Request using Azure Entra AD (preauthorized)" connector, where you provide login credentials via the normal OAuth2 pop-up. You can then call the "On-Demand Job" API, BUT that endpoint does not return a payload, just a 202 status. The Entra connector cannot interpret the empty response, so the flow fails even though the pipeline runs.
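The quirk described above boils down to this: success is signaled by HTTP 202 with an empty body, so any client that insists on parsing a JSON payload will report failure even though the run was queued. A minimal sketch of the correct interpretation:

```python
# Minimal sketch: a client must treat 202-with-empty-body as success,
# not try to parse a payload that isn't there.
def interpret_on_demand_response(status_code: int, body: bytes) -> str:
    if status_code == 202 and not body:
        return "accepted"  # pipeline run was queued; nothing to parse
    if 200 <= status_code < 300:
        return "ok"
    return "error"
```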
We decided to leverage Data Activator and Blob Storage triggers to execute our pipelines, as that also gives us better logging/traceability from start to finish.
Whilst you don't yet seem to be able to run a Fabric pipeline from a Logic App, you can use a Logic App to run a Fabric Dataflow using a standard connector: Power Query Dataflows - Connectors | Microsoft Learn
In my situation, the Fabric capacity is Azure-based and I 'resume' it to load new data into an Azure-based SQL Server database (this part is done using dataflows). After that, stored procedures need to run, and this could be done with a pipeline (the pipeline would run the dataflow and then the stored procedure).
Since:
it is a possible solution to build a logic app to:
It seems a bit messy to move so much of the ETL orchestration into Logic Apps, but it appears to be the only solution at the moment. Personally, I want to ditch our Power Query-based ETL, which uses the Integration Runtime within Azure Data Factory, and switch to Fabric.
Keep an eye on the blog in case Microsoft fixes the missing Service Principal capability in the Fabric API:
Microsoft Fabric Blog
I don't have Logic Apps to test with, but it might be possible to run a pipeline 'on demand' via an API call from Logic Apps. You'd need to figure out token generation/authentication.
API documentation;
https://learn.microsoft.com/en-us/rest/api/fabric/core/job-scheduler/run-on-demand-item-job?tabs=HTT...