I would like to move a semantic model with its data (the data size is 50 GB, so a PBIX isn't an option) from a dev workspace to a prod workspace, but I couldn't find a way to do it.
Hi @Akshatraj_Adig,
If you had a chance to review the response I shared earlier and it resolved your issue, please consider marking it as the accepted solution; it helps others find the answer more quickly.
Thank you.
Hi @Akshatraj_Adig,
Thank you @Poojara_D12 and @andrewsommer for the helpful responses. I completely agree with the points shared. Since the semantic model is 50 GB, exporting via PBIX isn't feasible. The two supported approaches are Deployment Pipelines, which is the recommended and most seamless option if both workspaces are on Premium or Fabric capacity, and XMLA-based backup and restore using SSMS for more advanced control. Both methods work for moving a model of this size between workspaces.
I've also linked the official Microsoft documentation below for reference:
Deployment Pipelines Overview
Overview of Fabric deployment pipelines - Microsoft Fabric | Microsoft Learn
If any of the responses above helped resolve your issue, please consider marking it as the accepted solution to make it easier for others facing a similar problem in the community.
Regards,
Yugandhar.
When you're working with a large semantic model, like your 50 GB dataset, you can't move it using a .pbix file because of size limitations; models this large are typically created and managed directly in the Power BI Service using deployment pipelines or APIs. To move a semantic model with its data from a development workspace to a production workspace, the best practice is to use Power BI Deployment Pipelines, which are designed specifically for promoting content (including semantic models) across dev, test, and prod stages without republishing from Desktop.
However, for this to work, both your dev and prod workspaces must be assigned to Power BI Premium (or Fabric) capacity, and the semantic model must live in a workspace connected to a deployment pipeline. The first deployment creates the model in the target stage and needs one refresh there to populate data; after that, deployments reuse the existing model and partitions in the target, avoiding a full data reload. If you're not using deployment pipelines or need more control (for example, in custom CI/CD scenarios), you can automate the same promotion with the Power BI REST APIs for deployment pipelines, which deploy the semantic model from one stage's workspace to the next (see the sketch at the end of this reply).
If you're only using the PBIP structure or don't yet have deployment pipelines set up, you'll likely hit a roadblock: PBIP deployments carry only metadata, and a .pbix file can't hold 50 GB of data. So the solution is to either enable deployment pipelines on Premium/Fabric capacity or use REST-API-driven automation to move the model from dev to prod efficiently.
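Not part of the original answer, but here's a rough sketch of what the API route can look like, using the documented deployment pipelines "Deploy All" endpoint. The access token and pipeline ID are placeholders, and token acquisition (for example via MSAL) is omitted:

```python
import requests

# Placeholders (assumptions, not from the original thread):
# ACCESS_TOKEN - an Azure AD token with the Pipeline.Deploy scope
# PIPELINE_ID  - the ID of a deployment pipeline whose Development stage
#                contains the 50 GB semantic model
ACCESS_TOKEN = "<aad-access-token>"
PIPELINE_ID = "<deployment-pipeline-id>"

# "Deploy All" promotes every supported item from the source stage to the
# next stage. Stage orders are zero-based: 0 = Development, 1 = Test.
url = f"https://api.powerbi.com/v1.0/myorg/pipelines/{PIPELINE_ID}/deployAll"
body = {
    "sourceStageOrder": 0,  # Development -> Test; run again with 1 for Test -> Production
    "options": {
        "allowCreateArtifact": True,     # create items missing in the target stage
        "allowOverwriteArtifact": True,  # overwrite items that already exist there
    },
}

resp = requests.post(url, json=body, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
resp.raise_for_status()

# The deployment runs asynchronously: expect 202 Accepted plus a Location
# header pointing at an operation you can poll until it completes.
print(resp.status_code, resp.headers.get("Location"))
```

The call returns 202 Accepted and runs asynchronously, so in a real CI/CD script you would poll the operation until the deployment finishes before kicking off any downstream refresh.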
Given the dataset size, moving your 50 GB semantic model from the Dev to the Prod workspace has to happen without a PBIX file. For models this large (typically Import mode or Direct Lake on Fabric), PBIX import/export is not viable due to file size limits and platform restrictions.
Option 1: Deployment Pipelines
Requires both workspaces to be on Premium or Fabric capacity.
Steps:
1. Create a deployment pipeline and assign the Dev workspace to the Development stage and the Prod workspace to the Production stage (Fabric pipelines let you adjust the number of stages if you don't need Test).
2. Select the semantic model in the source stage and click Deploy to promote it to the next stage.
3. After the first deployment, run a refresh on the model in Production to populate its data; later deployments update the model definition while keeping the existing data and partitions in place.
4. To automate this, trigger the same deployment through the REST API (see the Python sketch in the previous reply).
Option 2: XMLA Endpoint – Backup and Restore via SSMS
Only valid for Fabric/Premium workspaces with XMLA Read/Write enabled, and the capacity must have an Azure (ADLS Gen2) storage account attached for backup files.
Steps:
1. In the admin portal, set the XMLA endpoint to Read Write for the capacity and configure the Azure storage account used for backups.
2. In SSMS, connect to the Dev workspace's XMLA endpoint (powerbi://api.powerbi.com/v1.0/myorg/Dev Workspace Name) using the Analysis Services connection type.
3. Back up the semantic model to an .abf file in the attached storage (right-click the database > Back Up, or run a TMSL Backup command; see the sketch below).
4. Connect to the Prod workspace's XMLA endpoint and restore the .abf file; the model comes back together with its data, so no reload is needed.
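Not from the original post, but a minimal sketch of the TMSL scripts involved, assuming a model named "SalesModel" (a placeholder). Run the backup in an XMLA query window in SSMS while connected to the Dev workspace; the .abf file is written to the Azure storage account attached to the capacity:

```json
{
  "backup": {
    "database": "SalesModel",
    "file": "SalesModel.abf",
    "allowOverwrite": true,
    "applyCompression": true
  }
}
```

Then connect to the Prod workspace's XMLA endpoint and restore the same file:

```json
{
  "restore": {
    "database": "SalesModel",
    "file": "SalesModel.abf",
    "allowOverwrite": true
  }
}
```

Because the backup file lives in the storage account attached to the capacity, both workspaces need to sit on a capacity configured with that same storage.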
Please mark this post as a solution if it helps you. Kudos are appreciated.