Hi,
We were using Power BI Embedded with an A1 capacity, and the application was showing this error:
consumed memory 1031 MB, memory limit 1024 MB
Since the application was throwing this error, we upgraded the capacity to A2. Now it has started showing:
consumed memory 2122 MB, memory limit 2048 MB
Why is Microsoft not showing the correct capacity limit and usage?
Also, we have 40 million records coming to a size of about 13 GB. What capacity should we be looking at to render the report without a memory error popping up in the application rendering the report?
Hi @lijuthn ,
The limit in the error is the maximum memory a single semantic model can use on that SKU; the consumed figure is slightly above it because it reflects how much memory the operation was trying to use at the point it ran out, not an incorrect limit. The default (ABF) storage format is suitable for small semantic models; for large semantic models, I recommend upgrading to a Premium capacity, at least F64. On Premium capacities, semantic models beyond the default limit (10 GB) can be enabled with the Large semantic model storage format setting, which can also be applied per model (see the sketch after the links below).
Links:
How to configure workloads in Power BI Premium - Power BI | Microsoft Learn
Large semantic models in Power BI Premium - Power BI | Microsoft Learn
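If you go the large-model route, the storage format can be switched per semantic model through the Power BI REST API (Datasets - Update Dataset In Group). Below is a minimal Python sketch; the workspace ID, dataset ID, and access token are placeholders, and the model must already be hosted on a Premium/Fabric capacity for the setting to take effect.
```python
# Minimal sketch (not an official sample): switch a semantic model to the
# Large semantic model storage format via the Power BI REST API.
# GROUP_ID, DATASET_ID, and ACCESS_TOKEN are placeholders you must supply;
# the token needs Dataset.ReadWrite.All permission.
import requests

GROUP_ID = "<workspace-id>"          # placeholder
DATASET_ID = "<dataset-id>"          # placeholder
ACCESS_TOKEN = "<aad-access-token>"  # placeholder

url = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/datasets/{DATASET_ID}"
headers = {
    "Authorization": f"Bearer {ACCESS_TOKEN}",
    "Content-Type": "application/json",
}

# "PremiumFiles" = large semantic model storage format; "Abf" = default (small) format.
resp = requests.patch(url, headers=headers, json={"targetStorageMode": "PremiumFiles"})
resp.raise_for_status()
print("Storage mode update accepted:", resp.status_code)
```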
The specific capacity requirements depend on the complexity of the report, the number of concurrent users, and other workloads. Here are some documents for your reference:
Microsoft Fabric concepts - Microsoft Fabric | Microsoft Learn
How to configure workloads in Power BI Premium - Power BI | Microsoft Learn
If your reports are very complex or have a large number of concurrent users, you may need a higher SKU. You can use Power BI's capacity planning tool to estimate your specific needs.
Best regards,
Mengmeng Li
Hi @lijuthn ,
A P1 capacity can accommodate 13 GB; P1 allows up to 25 GB of memory per semantic model. Please note, however, that while refreshing the dataset its memory consumption roughly doubles, so I would recommend incremental refresh.
If you have applied all the data-modelling optimization techniques and the model still doesn't go below 13 GB, you might hit memory errors often and may need to switch to a P2 capacity (see the rough sizing sketch after the link below).
https://learn.microsoft.com/en-us/power-bi/guidance/import-modeling-data-reduction
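To make the sizing concrete, here is a rough back-of-the-envelope sketch in Python. The 2x refresh multiplier and the query-overhead figure are assumptions for illustration only, and the per-model memory ceilings should be checked against the current SKU documentation for your capacity type.
```python
# Rough sizing sketch for the ~13 GB model discussed in this thread.
# The multiplier and overhead below are assumptions, not measured values.
MODEL_SIZE_GB = 13.0
FULL_REFRESH_MULTIPLIER = 2.0   # old copy + new copy held in memory during a full refresh
QUERY_OVERHEAD_GB = 2.0         # assumed headroom for report queries and connections

peak_full_refresh = MODEL_SIZE_GB * FULL_REFRESH_MULTIPLIER + QUERY_OVERHEAD_GB
peak_incremental = MODEL_SIZE_GB + QUERY_OVERHEAD_GB  # only changed partitions reprocessed

# Approximate max memory per semantic model, in GB (verify against current docs).
capacities = {"A4 / P1": 25, "A5 / P2": 50}
for name, limit in capacities.items():
    print(f"{name}: full refresh "
          f"{'fits' if peak_full_refresh <= limit else 'exceeds limit'}, "
          f"incremental refresh "
          f"{'fits' if peak_incremental <= limit else 'exceeds limit'}")
```
Under these assumptions a full refresh of the 13 GB model (~28 GB peak) would not fit in a P1/A4 per-model limit, while incremental refresh (~15 GB peak) would, which is why incremental refresh is recommended before moving up to P2.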
Thanks,
Pallavi