rohit_motwani
Frequent Visitor

How to execute multiple dataflows(Dataflow Gen2 CI/CD Enabled) dynamically in Fabric?

Hi Community,

I have more than 40 dataflows that I need to refresh/execute. Right now, I have to set them up one by one, which is time-consuming.

Is there a way to dynamically execute all my dataflows together (for example, through a script, API, pipeline, or any other approach), instead of triggering them individually?

Any guidance or best practices on how to manage and schedule multiple dataflows efficiently would be very helpful.


6 REPLIES
vivien57
Power Participant

Hello @rohit_motwani 

 

You can create a Data Pipeline in which you organise your Dataflows however you need (sequentially or in parallel), and then schedule only that Data Pipeline; it will trigger all the child Dataflows.

 

https://learn.microsoft.com/en-us/fabric/data-factory/tutorial-dataflows-gen2-pipeline-activity#crea...

 

Feel free to give me a kudo and accept my answer as the solution if it suits you. 

 

Vivien

Hi @vivien57 ,
Thank you for your response. Yes, I’m currently following the approach of organizing Dataflows within a Data Pipeline and scheduling it. However, I was exploring whether there might be an alternative method for execution—ideally one that’s more optimized, since I’ve noticed that sometimes the pipeline gets overloaded and the overall run takes longer.

Do you know if there’s any supported way to execute Dataflows directly (for example, via API or Python) that could help reduce the execution time?

Best regards,
Rohit

Hello @rohit_motwani ,

Yes, you can trigger dataflows with the REST API without any problems.

https://learn.microsoft.com/en-us/rest/api/power-bi/dataflows/refresh-dataflow

That said, I don't think the Data Pipeline orchestration itself is what consumes the time (it does little more than trigger the dataflows); dataflows themselves are simply not the fastest components.

Moving the data directly via a Data Pipeline (Copy data activity), a Python script, or another method will usually be much more efficient.

However, it depends on the context. Dataflows are very practical, but when there is a lot of data, they consume a lot of resources.
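For anyone who wants a concrete starting point, here is a minimal sketch of that REST approach in Python, using only the standard library. It assumes you already have an Azure AD access token with dataflow-refresh permissions; the workspace (group) and dataflow IDs are placeholders you would replace with your own.

```python
import json
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def refresh_url(group_id: str, dataflow_id: str) -> str:
    """Build the Refresh Dataflow endpoint for a workspace/dataflow pair."""
    return f"{API_BASE}/groups/{group_id}/dataflows/{dataflow_id}/refreshes"

def refresh_all(token: str, group_id: str, dataflow_ids: list) -> None:
    """Trigger a refresh for every dataflow id in the list."""
    for df_id in dataflow_ids:
        req = urllib.request.Request(
            refresh_url(group_id, df_id),
            data=json.dumps({"notifyOption": "NoFailure"}).encode(),
            headers={
                "Authorization": f"Bearer {token}",
                "Content-Type": "application/json",
            },
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:  # raises on HTTP errors
            print(f"Triggered dataflow {df_id}: HTTP {resp.status}")
```

Each POST returns as soon as the refresh is queued (the refresh itself runs asynchronously), so looping over all 40-plus dataflow IDs like this only triggers them; monitoring completion would be a separate call.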

Feel free to give me a kudo and accept my answer as the solution if it suits you.


Have a nice day,

Vivien

tayloramy
Community Champion

Hi @rohit_motwani

You can execute a pipeline from a notebook via the REST API with the following code:

from time import sleep
from sempy.fabric import FabricRestClient

client = FabricRestClient()

# workspace_id and item_id are the GUIDs of the workspace and the pipeline item.
start_resp = client.post(f"v1/workspaces/{workspace_id}/items/{item_id}/jobs/instances?jobType=Pipeline")

# The id of the new job instance is returned in the Location header.
run_id = start_resp.headers["Location"].split("/")[-1]

# Poll until the new job instance becomes visible to the GET endpoint.
sleep(15)
while client.get(f"v1/workspaces/{workspace_id}/items/{item_id}/jobs/instances/{run_id}").status_code != 200:
    print(f"Waiting for job {run_id} to become visible...")
    sleep(15)


If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.

Hi @tayloramy ,

Thanks a lot for sharing this! Yes, the code you provided works well for executing a pipeline from a notebook. My query was more around whether a similar approach is possible for executing Dataflows using Python. Could you confirm if Dataflow execution is supported via the API, or if it’s currently limited to Pipelines?

Hi @rohit_motwani

In my environment, I just add the dataflow to the pipeline and run everything through the pipeline API. This gives me a more stable system that requires fewer changes, since there is only one API I am interacting with instead of separate notebook and dataflow APIs.

That being said, there is an API available for dataflows: Dataflows - Refresh Dataflow - REST API (Power BI REST APIs) | Microsoft Learn

You should be able to implement it much the same way as the pipeline API. 
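As a sketch of that idea: the same Fabric job-instances endpoint used for the pipeline above can, in principle, start an on-demand job on a Dataflow Gen2 item from a notebook. The jobType value "Refresh" for dataflows is my assumption here; verify it against the Fabric job scheduler documentation before relying on it.

```python
def job_instances_path(workspace_id: str, item_id: str, job_type: str) -> str:
    """Relative API path for starting an on-demand job on a Fabric item."""
    return f"v1/workspaces/{workspace_id}/items/{item_id}/jobs/instances?jobType={job_type}"

def refresh_dataflow(workspace_id: str, dataflow_id: str) -> str:
    """Start a Dataflow Gen2 refresh and return the job run id."""
    # Imported lazily: sempy is only available inside Fabric notebooks.
    from sempy.fabric import FabricRestClient

    client = FabricRestClient()
    resp = client.post(job_instances_path(workspace_id, dataflow_id, "Refresh"))
    # As with pipelines, the job run id comes back in the Location header.
    return resp.headers["Location"].split("/")[-1]
```

This mirrors the pipeline snippet earlier in the thread, so the same polling loop could be reused to wait for the run to become visible.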

If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.
