Don't miss out! 2025 Microsoft Fabric Community Conference, March 31 - April 2, Las Vegas, Nevada. Use code MSCUST for a $150 discount. Prices go up February 11th. Register now.

dpiech10
New Member

Deployment Pipeline moves Lakehouse but nothing inside the Lakehouse

I have a Lakehouse, semantic model, endpoint, etc. in a dev workspace, and I am trying to use the new Deployment Pipelines interface to move a copy to a "gold" production workspace. I have gone through the motions of setting up and triggering a deployment several times, and it appears to succeed, but when I open the new prod version of my lakehouse, none of the shortcut tables are present and the prod semantic model is empty. In short, the structure of the workspace is being copied over properly, but none of the content.

 

Is this behavior intentional, and am I misunderstanding the purpose of the deployment tool? Or am I doing something wrong that's leading to these results? There aren't many options in the simple deployment pipeline interface, so I don't see what I could be missing.

 

Thanks

1 ACCEPTED SOLUTION
v-jingzhan-msft
Community Support

Hi @dpiech10 

 

As of now, the deployment pipeline does not fully support copying the content of Lakehouses, such as tables and files, to the destination workspace. This means that while the structure of the lakehouses and warehouses is copied, the actual data and content are not automatically transferred. 

 

Here is a similar thread for your reference: Deployment pipeline and lakehouse content - Microsoft Fabric Community

 

My understanding is that the data tends to differ between stages: development uses a small amount of sample data, testing uses test data that simulates production data, and production uses the real data. The current deployment experience is therefore focused mainly on metadata, such as each object's schema, structure, and definition, rather than on moving the actual data and files contained within it.

 

According to the Lakehouse in deployment pipelines documentation, a new, empty Lakehouse object with the same name is created in the target workspace after the first deployment.

 

To populate the data and files, you can then use data pipelines, notebooks, or manual processes to ensure that the target environment has the necessary content.
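For the missing shortcuts specifically, one option is to re-create them in the prod lakehouse programmatically via the OneLake shortcuts REST API. Below is a minimal sketch, not a definitive implementation: the workspace/lakehouse GUIDs and the "Sales" table are placeholders, and the endpoint and request-body shape here reflect my understanding of the Create Shortcut API, so please verify them against the official documentation before relying on this.

```python
import json
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def build_shortcut_payload(name, path, src_workspace_id, src_item_id, src_path):
    """Build the request body for the OneLake Create Shortcut API,
    pointing a shortcut in the target lakehouse at a table that
    lives in the source (dev) lakehouse."""
    return {
        "name": name,                      # shortcut name, e.g. "Sales"
        "path": path,                      # parent folder in the lakehouse, e.g. "Tables"
        "target": {
            "oneLake": {
                "workspaceId": src_workspace_id,  # source workspace GUID
                "itemId": src_item_id,            # source lakehouse item GUID
                "path": src_path,                 # e.g. "Tables/Sales"
            }
        },
    }

def create_shortcut(token, tgt_workspace_id, tgt_lakehouse_id, payload):
    """POST the shortcut definition to the target (prod) lakehouse.
    Requires a bearer token with the appropriate Fabric API scopes."""
    url = f"{FABRIC_API}/workspaces/{tgt_workspace_id}/items/{tgt_lakehouse_id}/shortcuts"
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example: define one shortcut to rebuild in prod (all IDs are placeholders).
payload = build_shortcut_payload(
    name="Sales",
    path="Tables",
    src_workspace_id="<dev-workspace-guid>",
    src_item_id="<dev-lakehouse-guid>",
    src_path="Tables/Sales",
)
```

Running this from a Fabric notebook (or any scheduled job) after each deployment would let you rebuild all shortcuts from a small list of definitions, instead of re-creating them by hand in the UI.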

 

Additionally, engineers are working on improving the deployment pipeline experience in Fabric. Below is a new feature on the roadmap:

Lakehouse Shortcuts metadata on git and deployment pipelines 

 

You can also vote up existing ideas or raise new ones on Fabric's Ideas forum. Below are some ideas for features you might want:

Microsoft Idea: Add Tables or shortcuts to Lakehouse Pipeline deployments in Fabric

Microsoft Idea: lakehouse views not synced between worspaces via deployment pipelines

Microsoft Idea: Reflect Metadata of Warehouse/Lakehouse in Deployment Pipeline

 

Best Regards,
Jing
If this post helps, please Accept it as Solution to help other members find it. Appreciate your Kudos!

