Cyferki
Frequent Visitor

Dataset size issue

Hello,
I have some issues with dataset sizes on a Premium Capacity P2.
I'm using Import mode and I got a notification that I reached my 100 GB capacity.
But my workspace settings show a different object size than the Fabric Capacity Metrics (FCM) app.

The difference is quite big and doesn't make sense to me.
For example, the workspace dataset settings show 5700 MB for the Sign Off dataset, but FCM shows 19.5 GB.
The values in FCM seem to be correct, because in total they add up to 100 GB, but it's also odd since the dataset max should be 10 GB.
Does anyone know what I'm missing here?
Thank you!

[Screenshot attached: Cyferki_0-1727187295753.png]

4 REPLIES
Cyferki
Frequent Visitor

@Tutu_in_YYC 
I got an answer from Microsoft: according to them, FCM shows the uncompressed dataset size. That makes sense, but I'm also confused that we would have to plan our capacity based on the uncompressed size instead of the compressed one.
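
To make the compressed-vs-uncompressed gap concrete, here is a minimal sketch. The ~3.4x compression ratio is purely an assumed number for illustration (real VertiPaq ratios depend heavily on cardinality and data types), but it shows how a 5.7 GB at-rest figure and a ~19.5 GB FCM figure could describe the same model:

    # Illustrative only: the compression ratio is an assumption, not a measured value.
    compressed_gb = 5.7     # size reported in workspace settings
    assumed_ratio = 3.4     # hypothetical VertiPaq compression ratio

    estimated_uncompressed_gb = compressed_gb * assumed_ratio
    print(f"Estimated uncompressed size: {estimated_uncompressed_gb:.1f} GB")
    # ~19.4 GB, in the same ballpark as the 19.5 GB shown by FCM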

That's good information. Based on the Microsoft doc on Import mode, my guess is that we have to plan based on the compressed dataset size. The capacity determines the available memory, and a Power BI model loaded into memory is compressed (according to the doc). I will try to confirm this with my teammates.

Furthermore, on capacity planning: memory is needed to host the model and also for other operations. So if your model is, e.g., 4 GB, you probably need a capacity with about 10 GB (rather than 5 GB), because the model size might double (for Import mode) during a dataset refresh, on top of other operations like querying and running subscriptions (see the sketch below).
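
A minimal sketch of that rule of thumb; both factors (the 2x refresh copy and the operations headroom) are assumptions taken from this thread, not official sizing guidance:

    def estimate_required_memory_gb(model_size_gb: float,
                                    refresh_copy_factor: float = 2.0,
                                    operations_overhead_gb: float = 2.0) -> float:
        """Rough capacity-planning estimate (assumed factors, not official guidance).

        refresh_copy_factor: an Import-mode refresh can briefly hold ~2x the model.
        operations_overhead_gb: headroom for queries, subscriptions, exports, etc.
        """
        return model_size_gb * refresh_copy_factor + operations_overhead_gb

    print(estimate_required_memory_gb(4.0))  # -> 10.0, matching the 4 GB -> ~10 GB example above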

Re: Capacity Planning

 

After a few discussions with SMEs on this topic, we concluded that the size shown in FCM would be the best value to use for capacity planning, but it should not be the only thing considered; we also need to account for the operations (background and interactive operations such as queries, refreshes, export to PDF, running subscriptions, etc.) that also use the memory provided by the chosen capacity.

 

Also note that when it comes to the limits of a capacity, there are two to be aware of (see the sketch after this list):

1. the memory limit per semantic model

2. the total capacity usage limit across all Fabric items
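
A minimal sketch of checking both constraints; the limit values below are placeholders for illustration (they vary by SKU, so look up the real figures in the Microsoft capacity documentation):

    # Placeholder limits for illustration only; use the real values for your SKU.
    PER_MODEL_MEMORY_LIMIT_GB = 50.0   # e.g. the per-semantic-model limit mentioned for P2
    TOTAL_CAPACITY_LIMIT_GB = 100.0    # e.g. the overall usage figure hit in this thread

    def within_limits(model_sizes_gb: list[float]) -> bool:
        """True only if every model fits its own limit and the total fits the capacity."""
        per_model_ok = all(size <= PER_MODEL_MEMORY_LIMIT_GB for size in model_sizes_gb)
        total_ok = sum(model_sizes_gb) <= TOTAL_CAPACITY_LIMIT_GB
        return per_model_ok and total_ok

    print(within_limits([19.5, 30.0, 45.0]))  # True: each model fits, total 94.5 GB fits
    print(within_limits([19.5, 60.0]))        # False: 60 GB exceeds the per-model limit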

Tutu_in_YYC
Super User

The max for P2 is actually 50 GB, but I would not be surprised if you were able to use up to 100 GB because of autoscale.
(https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-what-is#semantic-model-sku-lim...).

I can't 100% confirm why FCM shows 19 GB whereas the workspace shows 5700 MB, but my educated guess is that it has to do with the dataset being offline vs. in use. When it is offline, certain parts of the memory are evicted, so the size is smaller; when it is online, i.e. being queried, it is loaded into memory and the size increases. Check this blog out; it explains more about the different components that make up the size (a rough sketch of the idea follows).
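
Purely as an illustration of that eviction idea; the split between a resident part and an evictable part is invented to line up with the numbers in this thread, not how the engine actually reports memory:

    # Invented split purely to illustrate offline vs online footprints.
    resident_gb = 5.7    # roughly what is reported while the dataset is idle
    evictable_gb = 13.8  # extra structures held only while the model is being queried

    offline_footprint = resident_gb
    online_footprint = resident_gb + evictable_gb  # ~19.5 GB, roughly the FCM figure

    print(f"Offline: {offline_footprint:.1f} GB, online: {online_footprint:.1f} GB")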


 
