dpiech10
New Member

Deployment Pipeline moves Lakehouse but nothing inside the Lakehouse

I have a Lakehouse, semantic model, endpoint, etc. in a dev workspace, and I am trying to use the new Deployment Pipelines interface to move a copy to a "gold" production workspace. I have gone through the motions of setting up and triggering a deployment several times and it seems to succeed, but when I open the new prod version of my Lakehouse, none of the shortcut tables are present and the prod semantic model is empty. Basically, the structure of the workspace is being copied over correctly, but none of the content is.

 

Is this behavior intentional and I am misunderstanding the purpose of the deployment tool, or am I doing something wrong that's leading to these results? There aren't many options available in the simple deployment pipeline interface, so I don't really see what I could be missing.

 

Thanks

1 ACCEPTED SOLUTION
Anonymous
Not applicable

Hi @dpiech10 

 

As of now, the deployment pipeline does not fully support copying the content of Lakehouses, such as tables and files, to the destination workspace. This means that while the structure of the lakehouses and warehouses is copied, the actual data and content are not automatically transferred. 

 

Here is a similar thread for your reference: Deployment pipeline and lakehouse content - Microsoft Fabric Community

 

My understanding is that the data used tends to differ between stages: the development stage uses a small amount of sample data, the testing stage uses test data that simulates production data, and the production stage uses real data. Therefore, the current deployment is mainly focused on metadata, such as each object's schema, structure, and definition, rather than moving the actual data and files contained within it.

 

According to the Lakehouse in deployment pipelines document, a new, empty Lakehouse object with the same name is created in the target workspace after the first deployment.

 

Then to handle the data and files, you can use data pipelines, notebooks, or manual processes to ensure that the target environment has the necessary content. 
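
For illustration, here is a minimal PySpark sketch of how a Fabric notebook could copy Delta tables from the dev Lakehouse into the empty prod Lakehouse over their OneLake paths. The workspace names, Lakehouse names, and table list are hypothetical placeholders, so adjust them to your environment before running.

# Minimal sketch: copy Delta tables from a dev Lakehouse to the prod Lakehouse
# that the deployment pipeline created. Run in a Fabric notebook, where a
# `spark` session is already available. All workspace, Lakehouse, and table
# names below are hypothetical placeholders.
dev_tables  = "abfss://DevWorkspace@onelake.dfs.fabric.microsoft.com/DevLakehouse.Lakehouse/Tables"
prod_tables = "abfss://ProdWorkspace@onelake.dfs.fabric.microsoft.com/ProdLakehouse.Lakehouse/Tables"

tables_to_copy = ["dim_customer", "fact_sales"]  # replace with your own table names

for table_name in tables_to_copy:
    # Read the Delta table from the dev Lakehouse via its OneLake path ...
    df = spark.read.format("delta").load(f"{dev_tables}/{table_name}")
    # ... and write it to the matching path in the prod Lakehouse.
    df.write.format("delta").mode("overwrite").save(f"{prod_tables}/{table_name}")

A Copy activity in a data pipeline can achieve the same result if you prefer a low-code approach, and shortcuts can be recreated manually in the prod Lakehouse so they point at the same sources as in dev.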

 

Additionally, engineers are working on improving the deployment pipeline experience in Fabric. Below is a new feature on the Roadmap.

Lakehouse Shortcuts metadata on git and deployment pipelines 

 

You can also vote up or raise new ideas on Fabric's Ideas forum. Below are some existing ideas related to the features you might want:

Microsoft Idea: Add Tables or shortcuts to Lakehouse Pipeline deployments in Fabric

Microsoft Idea: lakehouse views not synced between worspaces via deployment pipelines

Microsoft Idea: Reflect Metadata of Warehouse/Lakehouse in Deployment Pipeline

 

Best Regards,
Jing
If this post helps, please Accept it as Solution to help other members find it. Appreciate your Kudos!
