Join us at FabCon Atlanta from March 16 - 20, 2026, for the ultimate Fabric, Power BI, AI and SQL community-led event. Save $200 with code FABCOMM.
Register now!
The Power BI Data Visualization World Championships is back! Get ahead of the game and start preparing now! Learn more
Hi,
We are starting to use deployment pipelines to move items between our dev and prod workspaces, and we want to understand how the data carries over.
For example, say we have a Dataflow Gen2 that loads data into a lakehouse, the lakehouse then has a number of views created on it, and a Power BI report is built off those views.
Can we deploy all of those items from dev to prod together?
Or do we need to deploy them in order: the Dataflow Gen2 first, then the lakehouse, then the Power BI report?
Thanks
Carl
Hi @CarlBlunck,
Items that are supported by the deployment pipelines can all be moved at the same time. See the documentation for supported items: https://learn.microsoft.com/en-us/fabric/cicd/deployment-pipelines/intro-to-deployment-pipelines?tab...
Also see this great blog post: https://discoveringallthingsanalytics.com/fabric-deployment-pipelines-guide-dynamic-warehouse-connec...
If you found this helpful, consider giving Kudos. If I solved your problem or answered your question, mark this post as a solution.
Yes. If you have a Dataflow Gen2 in, say, the dev stage that reads from and writes to a lakehouse in the same stage, and you deploy it to the test stage, the dataflow in the test stage will still read from and write to the lakehouse in the dev stage. You will have to manually modify the dataflow in the test stage to read/write to the lakehouse in the test stage.
One thing to note: in the target stage, you will need to rebind the dataflow to the lakehouse in that stage, as it will still be pointing to the lakehouse in the source stage. It is kind of a pain, and one of the things still lacking in the CI/CD process for dataflows.
Hi @CarlBlunck ,
Thank you for reaching out to the Microsoft Community Forum.
Hi @tayloramy , Thank you for your prompt response.
Hi @CarlBlunck , Could you please try the proposed solution shared by @tayloramy ? Let us know if you're still facing the same issue; we'll be happy to assist you further.
Regards,
Dinesh