Hi,
I currently have a Fabric workspace using an F4 capacity bought through Azure Portal.
When refreshing my Power BI semantic model in the same workspace, I get this error:
"Data source error:Resource Governing: This operation was canceled because there wasn't enough memory to finish running it. Either reduce the memory footprint of your dataset by doing things such as limiting the amount of imported data, or if using Power BI Premium, increase the memory of the Premium capacity where this dataset is hosted. More details: consumed memory 226 MB, memory limit 221 MB, database size before command execution 2852 MB."
Two questions:
- What F capacity do I need to upgrade to through Azure in order to make my model refresh work?
- What can I do easily in my Power BI model/file to reduce the capacity I need?
Thanks!
Hello @useruserhi91
To answer your question about which F SKU you need to upgrade to, we need to know the size of the model you are trying to upload; based on the error, an upgrade is definitely needed.
For F4:
1. F4 capacity dataset limit: the maximum offline semantic model (dataset) size for an F4 capacity is 2 GB. This is the hard ceiling for datasets stored in the small storage format.
2. F4 capacity query memory limit: the automatic query memory limit for an F4 capacity is 1 GB per query, which restricts how much memory a single query can consume during execution.
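If it helps to see the numbers side by side, here is a minimal Python sketch that compares the model size from your error message (2,852 MB) against per-SKU memory limits. The limit values in the dictionary are assumptions I'm using for illustration only, and the 2x refresh headroom is a rule of thumb; please verify both against the official Fabric capacity SKU documentation before deciding.

```python
# Rough sketch: find the smallest F SKU whose per-model memory limit gives
# enough headroom for the model size reported in the error message.
# NOTE: the limits below are assumed, illustrative values, not official
# numbers - check the Fabric capacity SKU documentation before relying on them.

ASSUMED_MAX_MEMORY_MB = {   # per-semantic-model memory limit (assumed)
    "F2": 3 * 1024,
    "F4": 3 * 1024,
    "F8": 3 * 1024,
    "F16": 5 * 1024,
    "F32": 10 * 1024,
    "F64": 25 * 1024,
}

def smallest_fitting_sku(model_size_mb, headroom=2.0):
    """Return the first SKU whose limit is at least `headroom` x the model size.

    A full refresh can temporarily need roughly twice the model size in memory
    (the old copy plus the copy being rebuilt), hence the default factor of 2.
    """
    for sku, limit_mb in ASSUMED_MAX_MEMORY_MB.items():  # smallest SKU first
        if limit_mb >= model_size_mb * headroom:
            return sku
    return None

# Figure taken from the error message in the original post.
print(smallest_fitting_sku(2852))  # -> "F32" with these assumed limits and 2x headroom
```

Changing the headroom factor or the assumed limits will change the answer, so treat this purely as a way to reason about the numbers, not as a recommendation for a specific SKU.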
Here are some pointers that can help you reduce the model size:
To reduce your Power BI model's capacity needs, remove unnecessary data, filter and aggregate at the source, and simplify high-cardinality columns. Use a star schema, disable auto date/time tables, replace calculated columns with measures, and optimize DAX formulas. Implement incremental refresh so that only new data is processed.
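To make the "filter and aggregate at the source" pointer concrete, here is a small pandas sketch of pre-aggregating data before it ever reaches the model. The file and column names (sales.csv, order_ts, customer_email, and so on) are hypothetical; the same idea can be applied in Power Query or in a view on the source database.

```python
# Sketch: reduce row count and column cardinality *before* import, instead of
# loading raw transaction-level data into the semantic model.
# File and column names are hypothetical examples.
import pandas as pd

raw = pd.read_csv("sales.csv", parse_dates=["order_ts"])

# 1. Drop columns the report never uses (free text, emails, GUIDs, ...).
raw = raw.drop(columns=["customer_email", "order_notes"])

# 2. Reduce cardinality: keep the date, not the full timestamp.
raw["order_date"] = raw["order_ts"].dt.date

# 3. Aggregate to the grain the report actually needs (daily per product here).
daily = (
    raw.groupby(["order_date", "product_id"], as_index=False)
       .agg(units=("quantity", "sum"), revenue=("amount", "sum"))
)

daily.to_csv("sales_daily.csv", index=False)  # import this smaller table instead
```

Fewer rows and lower-cardinality columns compress much better in the in-memory engine, which is where most of the size reduction comes from.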
Here is a great article; please have a read:
https://data-mozart.com/how-to-reduce-your-power-bi-model-size-by-90
If this is helpful, please accept the solution and give kudos.
Hi @useruserhi91 ,
Thank you for sharing your query! We appreciate the detailed response provided by @nilendraFabric. In addition, here are a few suggestions which may work for you.
Based on the error message, you should consider upgrading to a higher F capacity. For example, F8 or F16 capacities offer higher memory limits and can better accommodate larger models.
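If you do decide to scale up, the SKU of an Azure-purchased Fabric capacity can be changed from the Azure portal (the capacity's Scale blade) or programmatically. Below is a rough Python sketch of the programmatic route using the Azure Resource Manager REST API; the api-version value and the request body shape are assumptions on my part, so please confirm them against the current Microsoft.Fabric/capacities REST reference before using this.

```python
# Sketch: scale an Azure-purchased Fabric capacity to a larger SKU via the
# Azure Resource Manager REST API.
# ASSUMPTIONS: the api-version and request body below may need adjusting to
# match the current Microsoft.Fabric/capacities REST API reference.
import requests
from azure.identity import DefaultAzureCredential

subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
capacity_name = "<capacity-name>"
target_sku = "F8"              # e.g. F8 or F16, as suggested above
api_version = "2023-11-01"     # assumption - verify in the REST docs

# Acquire an ARM token with whatever credential is available locally.
token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

url = (
    "https://management.azure.com"
    f"/subscriptions/{subscription_id}/resourceGroups/{resource_group}"
    f"/providers/Microsoft.Fabric/capacities/{capacity_name}"
    f"?api-version={api_version}"
)

resp = requests.patch(
    url,
    headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
    json={"sku": {"name": target_sku, "tier": "Fabric"}},
)
resp.raise_for_status()
print(resp.status_code)  # ARM may answer 200 or 202 (accepted, still provisioning)
```

Remember that a larger SKU also costs more per hour, so pair any upgrade with the data reduction techniques below where you can.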
Here is the documentation on data reduction techniques for Import modeling, which may help you resolve the issue:
Data reduction techniques for Import modeling - Power BI | Microsoft Learn
If this post was helpful, please give us Kudos and consider marking Accept as solution to assist other members in finding it more easily.
Regards,
Menaka.