I have a Lakehouse, semantic model, endpoint, etc. in a dev workspace, and I am trying to use the new Deployment Pipelines interface to move a copy to a "gold" production workspace. I have gone through the motions of setting up and triggering a deployment several times and it seems to succeed, but when I open the new prod version of my lakehouse, none of the shortcut tables are present and the prod semantic model is empty. Basically, the structure of the workspace is being copied over correctly, but none of the content.
Is this behavior intentional and I am misunderstanding the purpose of the deployment tool? Or am I doing something wrong that's leading to these results? There aren't many options available in the simple deployment pipeline interface so I don't really see what I could be missing.
Thanks
Solved! Go to Solution.
Hi @dpiech10
As of now, the deployment pipeline does not fully support copying the content of Lakehouses, such as tables and files, to the destination workspace. This means that while the structure of the lakehouses and warehouses is copied, the actual data and content are not automatically transferred.
Here is a similar thread for your reference: Deployment pipeline and lakehouse content - Microsoft Fabric Community
My understanding is that the data tends to differ between stages. For example, the development stage uses a small amount of sample data, the testing stage uses test data that simulates production data, and the production stage uses real data. Therefore, the current deployment is mainly focused on metadata, such as the object's schema, structure, and definition, rather than moving the actual data and files contained within it.
According to the Lakehouse in deployment pipelines document, a new, empty Lakehouse object with the same name is created in the target workspace after the first deployment.
Then to handle the data and files, you can use data pipelines, notebooks, or manual processes to ensure that the target environment has the necessary content.
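For example, one common notebook approach is to copy the Delta tables across workspaces through their OneLake paths. Below is a minimal sketch: the workspace, lakehouse, and table names are hypothetical placeholders, and the Spark copy itself is shown in comments because it needs to run inside a Fabric notebook with a Spark session.

```python
# Sketch: copy Delta tables from a dev Lakehouse into the (empty) prod
# Lakehouse created by the deployment pipeline. Workspace/lakehouse/table
# names below are placeholders - replace them with your own.

def abfss_table_path(workspace: str, lakehouse: str, table: str) -> str:
    """Build the OneLake ABFS path for a Lakehouse table."""
    return (
        f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
        f"{lakehouse}.Lakehouse/Tables/{table}"
    )

# Inside a Fabric notebook (where `spark` is available), each table could be
# copied like this:
#
# for table in ["sales", "customers"]:
#     src = abfss_table_path("DevWorkspace", "MyLakehouse", table)
#     dst = abfss_table_path("ProdWorkspace", "MyLakehouse", table)
#     spark.read.format("delta").load(src) \
#          .write.format("delta").mode("overwrite").save(dst)
```

In practice you would usually run this (or an equivalent copy activity in a data pipeline) once after the first deployment, since later deployments only update item definitions, not data.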
Additionally, engineers are working on improving the deployment pipeline experience in Fabric. Below is a new feature on the roadmap:
Lakehouse Shortcuts metadata on git and deployment pipelines
You can also vote up or raise new ideas on Fabric's Ideas forum. Below are some existing ideas related to the features you might want:
Microsoft Idea: Add Tables or shortcuts to Lakehouse Pipeline deployments in Fabric
Microsoft Idea: lakehouse views not synced between worspaces via deployment pipelines
Microsoft Idea: Reflect Metadata of Warehouse/Lakehouse in Deployment Pipeline
Best Regards,
Jing
If this post helps, please Accept it as Solution to help other members find it. Appreciate your Kudos!