Akshatraj_Adig
Frequent Visitor

Moving Semantic Model

I would like to move a semantic model together with its data (the data is 50 GB, so a PBIX isn't an option) from a dev workspace to a prod workspace, but I couldn't find a way to do it.

4 REPLIES
V-yubandi-msft
Community Support

Hi @Akshatraj_Adig,

If you've had a chance to review the response I shared earlier and it resolved your issue, please consider marking it as the accepted solution; it helps others find the answer more quickly.

 

Thank You.

V-yubandi-msft
Community Support

Hi @Akshatraj_Adig,

Thank you @Poojara_D12 and @andrewsommer for the helpful responses. I completely agree with the points shared. Since the semantic model is 50 GB, exporting via PBIX isn't feasible. The two supported approaches are Deployment Pipelines, which is the recommended and most seamless option if both workspaces are on Premium or Fabric capacity, and XMLA-based backup and restore using SSMS for more advanced control. A pipeline deployment promotes the model definition (the target needs a refresh to load data the first time), while XMLA backup and restore moves the model together with its data.

 

I've also linked the official Microsoft documentation below for reference:
Overview of Fabric deployment pipelines - Microsoft Fabric | Microsoft Learn

 

If any of the responses above helped resolve your issue, please consider marking it as the accepted solution to make it easier for others facing a similar problem in the community.

 

Regards,

Yugandhar.

Poojara_D12
Super User

Hi @Akshatraj_Adig,

When you're working with a large semantic model in Power BI, like your 50 GB dataset, you can't move it using a .pbix file because of size limitations and because large models are typically created and managed directly in the Power BI Service. To promote a semantic model from a development workspace to a production workspace, the best practice is to use Power BI Deployment Pipelines, which are designed specifically for promoting content (including semantic models) across dev, test, and prod stages without republishing a PBIX.

 

However, for this to work, both your dev and prod workspaces must be assigned to Power BI Premium capacity (or Fabric capacity), and the semantic model must live in a workspace connected to a deployment pipeline. The first deployment to an empty stage copies the model definition, after which you refresh it in the target to load data; subsequent deployments reuse the existing model and partitions in the target, avoiding a full data reload. If you need more control (e.g., for custom CI/CD scenarios), you can also automate the process with the Power BI REST APIs, specifically the Deployment Pipelines APIs such as Deploy All, which let you script the promotion from one stage to the next. (The PBIX Export/Import APIs are not an option at this size, since PBIX export is limited to small models.)
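For illustration, here is a minimal Python sketch of that automation using the Deployment Pipelines "Deploy All" REST endpoint. Token acquisition (e.g., via MSAL with the Pipeline.Deploy scope) is out of scope here, and the token and pipeline ID are placeholders; treat this as a starting point rather than production code.

```python
import requests

# Hypothetical sketch: deploy everything from the Development stage of a
# deployment pipeline to the next stage via the Power BI REST API.
# The access token and pipeline ID below are placeholders.
ACCESS_TOKEN = "<aad-access-token>"
PIPELINE_ID = "<pipeline-guid>"

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/pipelines/{PIPELINE_ID}/deployAll",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={
        "sourceStageOrder": 0,  # 0 = Development; deploys to the next stage
        "options": {
            "allowCreateArtifact": True,     # create items missing in the target
            "allowOverwriteArtifact": True,  # overwrite items already in the target
        },
    },
    timeout=60,
)
resp.raise_for_status()
# The call is asynchronous; the response describes the long-running
# operation, which you can poll via the pipeline's operations endpoint.
print("Deployment operation started:", resp.json().get("id"))
```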

 

If you're only using the PBIP structure or don't yet have deployment pipelines set up, you'll likely hit a roadblock, since PBIP (Git-based) deployments only move metadata and a .pbix is ruled out at this size. So the solution is either to enable deployment pipelines with Premium/Fabric capacity or to use REST-API-based automation to promote the model from dev to prod.

 

Did I answer your question? Mark my post as a solution; this will help others!
If my response(s) assisted you in any way, don't forget to drop me a "Kudos"

Kind Regards,
Poojara - Proud to be a Super User
Data Analyst | MSBI Developer | Power BI Consultant
Consider Subscribing my YouTube for Beginners/Advance Concepts: https://youtube.com/@biconcepts?si=04iw9SYI2HN80HKS
andrewsommer
Memorable Member

Moving a large semantic model (50 GB) from a Dev to a Prod workspace in Power BI has to be done without a PBIX file here. For models this large (typically Direct Lake or Import mode on Fabric), PBIX import/export isn't viable due to file-size limits and platform restrictions.

 

Option 1: Deployment Pipelines

  • You must have Power BI Premium or Fabric Capacity.
  • Both workspaces must be assigned to a capacity.
  • The content must be a native Power BI semantic model, not an externally hosted Analysis Services model or content published by third-party tools.

 Steps:

  1. In Power BI Service, create a Deployment Pipeline (Dev > Test > Prod).
  2. Add your Dev workspace to the pipeline as the Development stage.
  3. Assign the Prod workspace to the Production stage.
  4. Run a deployment from Dev to Prod.
    • This copies the semantic model definition; after a first deployment to an empty stage, refresh the model in Prod to load its data (subsequent deployments keep the target's existing data).
    • You can configure rules to parameterize things like data sources.
  5. Confirm the deployed semantic model appears and is functional in the Prod workspace (a scripted check is sketched below).
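If you want to script step 5 rather than eyeball it, a small sketch like the following could list the semantic models in the Prod workspace through the REST API (the token, workspace ID, and model name are placeholders):

```python
import requests

# Hypothetical verification sketch for step 5: list the semantic models in
# the Prod workspace and check that the expected one arrived.
ACCESS_TOKEN = "<aad-access-token>"
PROD_WORKSPACE_ID = "<prod-workspace-guid>"
EXPECTED_MODEL = "SalesModel"  # hypothetical model name

resp = requests.get(
    f"https://api.powerbi.com/v1.0/myorg/groups/{PROD_WORKSPACE_ID}/datasets",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=60,
)
resp.raise_for_status()
names = [d["name"] for d in resp.json().get("value", [])]
print(f"{EXPECTED_MODEL} deployed: {EXPECTED_MODEL in names}")
```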

 

Option 2: XMLA Endpoint – Backup and Restore via SSMS

Only valid for Fabric/Premium workspaces with XMLA Read/Write enabled.

  • Premium/Fabric capacity workspace
  • XMLA endpoints must be read/write enabled
  • An Azure Data Lake Storage Gen2 account attached to the workspace, since backup files (.abf) are written to and restored from that storage
  • SQL Server Management Studio (SSMS) installed

Steps:

  1. Connect to powerbi://api.powerbi.com/v1.0/myorg/{workspaceName} in SSMS.
  2. Locate the model in the Dev workspace, right-click > Back Up to a .abf file.
  3. Connect to the Prod workspace via SSMS.
  4. Right-click > Restore > provide the .abf file and specify a new or existing database name (the equivalent TMSL commands are sketched below).
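Under the hood, steps 2 and 4 issue TMSL Backup and Restore commands, which you can also run yourself in an XMLA query window in SSMS (New Query > XMLA). The Python sketch below just prints the TMSL to paste there; the model and file names are placeholders, and it assumes a storage account is attached as noted above.

```python
import json

# Sketch of the TMSL commands corresponding to steps 2 and 4; model and
# file names are placeholders. Run the backup against the Dev workspace
# endpoint and the restore against the Prod workspace endpoint. The .abf
# file is written to and read from the storage account attached to the
# workspace.
backup_cmd = {
    "backup": {
        "database": "SalesModel",   # model to back up in Dev
        "file": "SalesModel.abf",   # lands in the attached storage account
        "allowOverwrite": True,
        "applyCompression": True,
    }
}

restore_cmd = {
    "restore": {
        "database": "SalesModel",   # name to restore as in Prod
        "file": "SalesModel.abf",
        "allowOverwrite": True,
    }
}

print(json.dumps(backup_cmd, indent=2))
print(json.dumps(restore_cmd, indent=2))
```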

 

Please mark this post as a solution if it helps you. Appreciate Kudos.
