In Microsoft Fabric, capacities provide the compute power for a comprehensive suite of analytics tools. A single capacity supports all Fabric experiences, including Power BI. Microsoft Fabric offers a robust set of analytics experiences designed to work together seamlessly, and the platform is integrated, secured, and governed. There is also flexibility with pay-as-you-go or reservation purchase options, and you can get started with a 60-day free trial (Getting Started | Microsoft Fabric).
Capacities include features that provide flexibility for performance and cost management, such as bursting and smoothing.
Even with these features providing scalability and optimized performance, you may still need to scale a Fabric capacity to avoid throttling.
When planning your Fabric capacity strategy, it is important to understand how consumption is calculated.
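As a rough illustration (the exact accounting is covered in the capacity documentation): consumption is measured in capacity unit (CU) seconds. An F8 provides 8 CUs, i.e. 8 CU-seconds of compute per second, so an operation that consumes 240 CU-seconds amounts to 240 / 8 = 30 seconds of the full F8 capacity, and smoothing then spreads that usage over a longer window.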
There are out-of-the-box monitoring tools available, such as the Fabric Capacity Metrics App, Surge Protection (Preview), and Workspace Monitoring (Preview).
One great feature of Fabric SKUs is the ability to seamlessly scale and pause/resume within the Azure portal; these operations can also be scripted, as sketched below. However, it is important to note that there are specific SKU limits and guardrails for Power BI workloads that need to be considered when planning a capacity strategy.
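For automation scenarios, pause/resume and scaling can be driven through the Azure Resource Manager REST API for the Microsoft.Fabric/capacities resource type. The sketch below is a minimal illustration, assuming api-version 2023-11-01 and placeholder resource names; verify both against the current Azure documentation before relying on it.

```python
import requests
from azure.identity import DefaultAzureCredential

SUBSCRIPTION = "<subscription-id>"   # placeholders: fill in your own values
RESOURCE_GROUP = "<resource-group>"
CAPACITY = "<capacity-name>"
API = "2023-11-01"                   # assumed api-version; check the docs
BASE = (
    f"https://management.azure.com/subscriptions/{SUBSCRIPTION}"
    f"/resourceGroups/{RESOURCE_GROUP}/providers/Microsoft.Fabric"
    f"/capacities/{CAPACITY}"
)

token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token
HEADERS = {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}

def suspend() -> None:
    # Pausing a capacity stops compute billing until it is resumed.
    requests.post(f"{BASE}/suspend?api-version={API}", headers=HEADERS).raise_for_status()

def resume() -> None:
    requests.post(f"{BASE}/resume?api-version={API}", headers=HEADERS).raise_for_status()

def scale(sku_name: str) -> None:
    # e.g. scale("F16") before a heavy refresh window, scale("F8") afterwards.
    body = {"sku": {"name": sku_name, "tier": "Fabric"}}
    requests.patch(f"{BASE}?api-version={API}", headers=HEADERS, json=body).raise_for_status()
```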
My colleague Pat Mahoney has an excellent YouTube playlist about all things Fabric Capacities, including topics on Capacity Metrics App, automating pause/resume, dynamic scaling, and multi-capacity strategies to protect your workloads. I highly recommend checking these out!
Testing for import models was performed on F16 and F8 SKUs.
The model's Total Size in memory was checked with VertiPaq Analyzer in DAX Studio and confirmed to be 3.04 GB.
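If you prefer to check this from a Fabric notebook instead of DAX Studio, a rough approximation can be pulled with semantic-link (sempy) and the DAX INFO functions, assuming a recent engine version that supports them; the dataset name below is hypothetical, and VertiPaq Analyzer remains the authoritative measurement.

```python
import sempy.fabric as fabric  # available in Fabric notebooks

# INFO.STORAGETABLECOLUMNSEGMENTS() mirrors the DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS DMV.
df = fabric.evaluate_dax(
    dataset="Sales Import Model",  # hypothetical name; use your semantic model
    dax_string="EVALUATE INFO.STORAGETABLECOLUMNSEGMENTS()",
)

# Column naming can vary between engine versions, so locate USED_SIZE defensively.
# Note: summing segment sizes approximates column data only; VertiPaq Analyzer
# gives the full breakdown including dictionaries and hierarchies.
used_size_col = next(c for c in df.columns if "USED_SIZE" in c.upper())
print(f"Approximate size in memory: {df[used_size_col].sum() / 1024**3:.2f} GB")
```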
F16 SKU Guardrails:
The Max Memory limit for the F16 SKU is 5 GB, which comfortably accommodates the 3.04 GB model.
The report works.
F8 SKU Guardrails:
The Max Memory limit for the F8 SKU is 3 GB, which is below the model's 3.04 GB in-memory size.
Initially, the report still works.
However, after a few minutes the report fails to load with the “exceeds the maximum size limit on disk” error.
Attempting to refresh the model fails with the same error.
After scaling back up to F16, the report works again with no model refresh required.
Testing for Direct Lake models was performed on F64 and F32 SKUs.
A query against the Lakehouse SQL analytics endpoint confirmed that the largest table in the model has a row count of 301,337,872.
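For reference, such a check can be run from Python with pyodbc against the SQL analytics endpoint; the server, database, and table names below are placeholders, not the ones used in this test.

```python
import pyodbc

# Connection to the Lakehouse SQL analytics endpoint (placeholders throughout);
# requires ODBC Driver 18 for SQL Server and an Entra ID sign-in.
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-endpoint>.datawarehouse.fabric.microsoft.com;"
    "Database=<your-lakehouse>;"
    "Authentication=ActiveDirectoryInteractive;"
    "Encrypt=yes;"
)

# COUNT_BIG avoids int overflow on very large tables; dbo.fact_sales is a placeholder.
row_count = conn.execute("SELECT COUNT_BIG(*) FROM dbo.fact_sales").fetchval()
print(f"{row_count:,} rows")  # the tested model's largest table returned 301,337,872
```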
Fallback is disabled, so the model will not fall back to DirectQuery when guardrails are exceeded.
Automatic reframing is also disabled.
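With automatic reframing off, a reframe must be triggered manually. For a Direct Lake model, a dataset refresh does not re-import data; it reframes the model to the latest Delta table versions. A minimal sketch using the Power BI REST API, with placeholder workspace and dataset IDs and simplified authentication:

```python
import requests
from azure.identity import InteractiveBrowserCredential

# Placeholder IDs; in production, use a service principal instead of interactive sign-in.
WORKSPACE_ID = "<workspace-guid>"
DATASET_ID = "<dataset-guid>"

token = InteractiveBrowserCredential().get_token(
    "https://analysis.windows.net/powerbi/api/.default"
).token

# For a Direct Lake model, this refresh call reframes the model rather than
# reloading any data into memory.
url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/datasets/{DATASET_ID}/refreshes"
)
resp = requests.post(url, headers={"Authorization": f"Bearer {token}"}, json={})
resp.raise_for_status()  # 202 Accepted means the reframe was queued
```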
F64 SKU Guardrails:
The Direct Lake rows per table limit is 1.5 billion.
The report works.
F32 SKU Guardrails:
The Direct Lake rows per table limit is 300 million, which the model's 301,337,872-row table exceeds.
The report fails to load. If the model's Direct Lake behavior setting had been set to Automatic, the report would have loaded but fallen back to DirectQuery.
Note some licensing limitations: on capacities smaller than F64, report viewers need a Power BI Pro or PPU license, whereas F64 and above allow consumption with a free license.
After scaling back up to F64, the report works again with no reframing required.
For Fabric and Power BI Premium, each SKU has specific limits and other constraints for Power BI workloads. When scaling a Fabric capacity, it is important to keep these guardrails in mind and ensure they are not exceeded so that Power BI reports remain accessible to business users. More details can be found here.
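To make guardrail checks explicit in planning scripts, the values discussed in this post can be encoded in a small lookup. The F8/F16 Max Memory and F32/F64 Direct Lake row limits were exercised in the tests above; the remaining values are taken from the published guardrail tables, so confirm them against the current documentation.

```python
# Guardrails for the SKUs discussed in this post (verify against the docs).
GUARDRAILS = {
    "F8":  {"max_memory_gb": 3,  "direct_lake_max_rows": 300_000_000},
    "F16": {"max_memory_gb": 5,  "direct_lake_max_rows": 300_000_000},
    "F32": {"max_memory_gb": 10, "direct_lake_max_rows": 300_000_000},
    "F64": {"max_memory_gb": 25, "direct_lake_max_rows": 1_500_000_000},
}

def fits(sku: str, model_gb: float = 0, largest_table_rows: int = 0) -> bool:
    """Return True if a model stays within the SKU's guardrails."""
    g = GUARDRAILS[sku]
    return model_gb <= g["max_memory_gb"] and largest_table_rows <= g["direct_lake_max_rows"]

print(fits("F8", model_gb=3.04))                    # False: 3.04 GB > 3 GB
print(fits("F32", largest_table_rows=301_337_872))  # False: exceeds 300M rows
print(fits("F64", largest_table_rows=301_337_872))  # True
```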
The behavior of Import and Direct Lake semantic models and reports during capacity scaling was tested. In both cases, exceeding a SKU guardrail (Max Memory for Import, rows per table for Direct Lake) caused reports to fail, and scaling the capacity back up restored access immediately, with no model refresh or reframing required.
Fabric Capacities YouTube playlist:
Capacities:
Surge Protection (Preview):
Workspace Monitoring (Preview):
Direct Lake:
Power BI SKU Guardrails:
DAX Studio: