rvp_ordix
Helper I

Different Size on Disk for instances of same Import Semantic Model

We have two instances of an Import Semantic Model, one for our Production environment and one for our Dev environment. I recently noticed that our Prod model takes up more than twice the space of the Dev model, despite both being connected to the same data source. Nor is it a case of one model having more tables than the other; the only things we create/edit are Measures, due to the peculiarities of our architecture.

 

As a test, I have published a bare version of that model to a new Fabric Workspace and refreshed it there. To our surprise, this new model took less than 10% of the Prod model's space. 

 

To be clear, I am not talking about space in memory, but about the value you see when opening the "Manage group storage" option under Settings.

 

Where would the disparity come from? Some kind of history might have seemed plausible, but to my understanding that shouldn't happen for a model in Import Mode.

1 ACCEPTED SOLUTION
GilbertQ
Super User

Hi @rvp_ordix 

 

If your semantic model is in a Premium workspace, what I would recommend is connecting to it through the XMLA endpoint with SQL Server Management Studio: right-click the database and open Properties to see its size. Alternatively, you can use DAX Studio to connect to the semantic model and run View Metrics to see the size of the model. I'd compare that against what the service reports to make sure it matches what you're expecting; it could just be a bug in the Power BI service. If the model really is that large, I would also recommend doing a defrag on the entire model to see if that reduces the size.
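
As a minimal sketch of that check (assuming a connection to the workspace's XMLA endpoint from DAX Studio or an SSMS MDX query window), the standard Analysis Services DMVs expose the per-column storage figures that View Metrics summarizes; run each query on its own:

-- Compressed data size per column segment (summing per table approximates the table's footprint)
SELECT DIMENSION_NAME, TABLE_ID, COLUMN_ID, USED_SIZE
FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS

-- Dictionary size per column (oversized dictionaries are a common source of model bloat)
SELECT DIMENSION_NAME, COLUMN_ID, DICTIONARY_SIZE
FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMNS

DMV queries only support a restricted SELECT (no GROUP BY or joins), so the per-segment numbers have to be summed on the client side; DAX Studio's View Metrics / VertiPaq Analyzer does that aggregation for you.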

7 REPLIES
v-nmadadi-msft
Community Support

Hi @rvp_ordix ,

As we haven't heard back from you, we wanted to kindly follow up and check whether the solution provided by the community members worked for your issue. If our response addressed it, please mark it as the accepted solution and click "Yes" if you found it helpful.

 

Thanks and regards

v-nmadadi-msft
Community Support

Hi @rvp_ordix ,

I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.

Thank you

v-nmadadi-msft
Community Support

Hi @rvp_ordix,
Thanks for reaching out to the Microsoft Fabric community forum.

In addition to the valuable points mentioned by @GilbertQ, since you have stated that both semantic models contain the same tables and the only things you create/edit are measures, you may want to consider the following point.
Calculated columns consume additional memory since their values are stored directly in the table. When multiple calculated columns with complex formulas are added, they can significantly impact performance and increase the overall size of the data model.
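
As a quick way to check for stray calculated columns, a sketch of a metadata DMV query (again assuming an XMLA-endpoint connection from DAX Studio or SSMS; in the TMSCHEMA DMVs a Type value of 2 denotes a calculated column):

-- List calculated columns (Type = 2 means calculated column)
SELECT [TableID], [ExplicitName]
FROM $SYSTEM.TMSCHEMA_COLUMNS
WHERE [Type] = 2

TableID is a numeric key; since DMV queries cannot join, a second SELECT against $SYSTEM.TMSCHEMA_TABLES is needed to map it back to table names.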

If you find this post helpful, please mark it as "Accept as Solution" and consider giving it a kudos.
Thanks and Regards

I don't think we use much in the way of calculated columns, though it's at least possible that someone added something by accident. I'll check that when I have time.

Hi @rvp_ordix,

May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems solve them faster.

Thank you.

The DAX metrics confirm the sizes shown in my browser, so that's not it.

 

How would I go about defragmenting a Semantic Model? Is there an API to do that?
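
On the defrag question, a minimal sketch, assuming a Premium/Fabric workspace with the XMLA endpoint enabled: a defragment is issued as a TMSL refresh command of type "defragment" (for example from an SSMS XMLA query window), and the enhanced refresh REST API accepts a comparable defragment type. The database name below is a placeholder:

{
  "refresh": {
    "type": "defragment",
    "objects": [
      {
        "database": "YourSemanticModelName"
      }
    ]
  }
}

A defragment rebuilds the column dictionaries without re-reading the source, which is why it can shrink a model whose dictionaries have accumulated values that no longer exist in the data.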
