Ymatole
Frequent Visitor

Lakehouse movement using deployment pipelines

I created deployment pipelines to move objects from DEV to PROD. I was able to move lakehouses, but the schemas are not getting moved. Any idea what the issue could be? I didn't get any error; the deployment was shown as successful.

2 ACCEPTED SOLUTIONS
Ritaf1983
Super User

Hi @Ymatole 

The behavior you’re seeing is expected. Deployment pipelines in Fabric move the Lakehouse item (metadata and settings) but do not migrate the schema or underlying data from the SQL endpoint or OneLake. That’s why the deployment shows as successful, but your tables/schemas don’t appear in PROD.

To move schemas and data you’ll need a separate process:

  • Schema: Export DDL (CREATE TABLE/VIEW, etc.) from DEV and run it in PROD.

  • Data: Copy the Delta/Files from OneLake using a Data Pipeline, Notebook (COPY INTO or Spark CREATE TABLE ... USING DELTA LOCATION ...), or another copy mechanism.

  • Shortcuts: Must be recreated in each environment.

  • Rules: Use Deployment Rules only for connection/config settings, not schema/data.

In short, deployment pipelines handle the item, but schema and data migration require an additional scripted process.
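For the data step, a minimal sketch (the schema, table, workspace, and lakehouse names below are all hypothetical) of re-registering a copied Delta table in a notebook attached to the PROD lakehouse:

```sql
-- Hypothetical sketch: run in a Fabric notebook attached to the PROD lakehouse,
-- after the Delta files have been copied from DEV into PROD's OneLake.
CREATE SCHEMA IF NOT EXISTS sales;

-- Register a table over the copied Delta files (path is illustrative only)
CREATE TABLE IF NOT EXISTS sales.orders
USING DELTA
LOCATION 'abfss://ProdWorkspace@onelake.dfs.fabric.microsoft.com/ProdLakehouse.Lakehouse/Tables/sales/orders';
```

Repeating this per table (scripted from a list of DEV tables) gives you a reproducible migration step alongside the pipeline deployment.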

If this post helps, then please consider Accepting it as the solution to help the other members find it more quickly.

Regards,
Rita Fainshtein | Microsoft MVP
https://www.linkedin.com/in/rita-fainshtein/
Blog : https://www.madeiradata.com/profile/ritaf/profile


Hi @Ymatole,
Thank you for posting your query in the Microsoft Fabric Community Forum. Also, thanks to @Ritaf1983 for the inputs on this thread.

What you are seeing is expected. A deployment pipeline mainly moves the Lakehouse or Warehouse item, but it doesn't always bring all schemas or table structures. Sometimes a few may come through, but it is not reliable for full schema movement.

If you want the complete schema and tables in PROD, you will need to move them separately, either by running the DDL (CREATE TABLE scripts) or by copying the data using a Data Pipeline or Notebook. Shortcuts also need to be recreated in each environment.
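For the DDL route, one hedged sketch (the table name is hypothetical): Spark SQL in a DEV-attached notebook can emit the CREATE statement for a table, which you can then run against PROD:

```sql
-- Hypothetical sketch: run in a notebook attached to the DEV lakehouse.
SHOW CREATE TABLE dbo.orders;
-- Copy the CREATE TABLE statement this emits, then run it in a notebook
-- attached to the PROD lakehouse to recreate the table structure there.
```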

Kindly refer to the documentation links below for a better understanding:
https://learn.microsoft.com/en-us/fabric/data-engineering/lakehouse-git-deployment-pipelines
https://learn.microsoft.com/en-us/fabric/cicd/deployment-pipelines/intro-to-deployment-pipelines?tab...
https://learn.microsoft.com/en-us/fabric/data-engineering/lakehouse-schemas
Hope this helps. If you have any queries, we are happy to assist you further.
Regards,
Harshitha.


6 REPLIES
Ymatole
Frequent Visitor

I have created the deployment pipelines using a guest user and am unable to give access to others. Any workaround for the same?

Another issue I am facing is a binding issue. I created one of the pipelines manually in the target workspace, and I want to bind this item to the source item. I am not able to do this in the Compare tab, as the object is not visible there. How do I do the binding?

Hi @Ymatole,

For your first point on guest users:
Deployment pipelines require full Admin permissions at both the pipeline level and the workspace level. Guest (B2B) users are usually limited to view-only access unless your tenant admin has explicitly enabled the setting "Allow external guest users to edit and manage content in the organization." Because of this restriction, pipelines created by a guest typically cannot be shared or managed by others. The recommended workaround is to have an internal licensed account own the pipeline, then grant your team members access through security groups or direct assignment. This ensures full manageability and avoids guest account limitations.

For the binding issue:
Pipelines pair items across stages based on type + name (and path). If you manually created an item in the target workspace and it doesn't exactly match, it won't appear in the Compare tab and cannot be bound. To resolve this you can:

  • Rename the target item to match the source item's name and type, then unassign and reassign the workspace to force a re-evaluation.

  • If that still doesn't work, delete the manually created item in the target stage and deploy it from the source through the pipeline so it's properly linked.
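To illustrate the pairing rule above, here is a small sketch (plain Python, not a Fabric API; the item dict shape is an assumption for illustration) of how matching on type + name behaves:

```python
# Sketch of the pairing rule: deployment pipelines auto-pair items across
# stages when both the item type and the display name match exactly.
# The dict shape below is an assumption for illustration, not a Fabric API.

def unpaired_items(source_items, target_items):
    """Return target items that will not auto-pair with any source item."""
    source_keys = {(i["type"], i["displayName"]) for i in source_items}
    return [i for i in target_items
            if (i["type"], i["displayName"]) not in source_keys]

dev = [{"type": "DataPipeline", "displayName": "LoadSales"}]
prod = [{"type": "DataPipeline", "displayName": "Load_Sales"}]  # name differs

# Load_Sales will not auto-pair with LoadSales, so it stays unbound in Compare
print(unpaired_items(dev, prod))
```

Renaming the target item to the exact source name (same item type) is what makes the pair appear in the Compare tab.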

I hope this information is helpful to you. If you need any further clarification or assistance, please feel free to let me know.
Regards,
Community Support Team.

Ymatole
Frequent Visitor

In my case, it copied the schema and table structures too, but only for a few schemas. Similarly, for the warehouse migration, it copied a few schemas and table structures; not all got copied.


Dus
Frequent Visitor

Simply said: they don't work at all
