Hello,
We are transferring from Power BI Premium Per User to Fabric F64.
But my main dataset, a tabular model deployed from Tabular Editor, is not refreshing; it gives an out-of-memory error.
I know the model size limit for PPU is 100 GB and that of F64 is 25 GB.
But the out-of-memory error is thrown at around 9 GB.
I enabled the large semantic model storage format option on the workspace and on the dataset.
Why am I not getting the error at a 25 GB database size but at a 9 GB database size?
When I deploy the model fresh from Tabular Editor, the initial refresh works fine.
Fabric capacity size:
Refresh history (the initial load is working fine):
The error I'm getting:
Can anyone please advise on why the memory limit is not 25 GB and how to adjust it?
Regards,
Bart Poelert
Hi, @Bart_Poelert
Thank you for sharing your solution. Here's an explanation of why your model exceeds the 25 GB memory limit:
25 GB = 25,600 MB. Your semantic model was already consuming 16 GB (16,571 MB) when the command started, and the command operations themselves consumed 9,031 MB. Total consumption: 16,571 + 9,031 = 25,602 MB, which exceeds the limit, so an error is reported.
The effective memory limit for a command is calculated from the memory limit the capacity allows per semantic model (25 GB, 50 GB, or 100 GB) and the amount of memory the semantic model is already consuming when the command starts executing. For example, a 12 GB semantic model on a P1 capacity leaves an effective memory limit of 13 GB for a new command.
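As a quick sanity check on the arithmetic above, here is a small Python sketch (the capacity limit and the two memory figures are the ones quoted in this thread) that computes how much memory is left for a new command:

# Sketch: effective memory limit for a new command, per the explanation above.
# All values in MB; the figures below are the ones reported in this thread.

CAPACITY_MODEL_LIMIT_MB = 25 * 1024      # F64/P1: 25 GB per semantic model = 25,600 MB

def effective_command_limit(model_memory_mb: float) -> float:
    """Memory left for a new command once the model's current footprint is subtracted."""
    return CAPACITY_MODEL_LIMIT_MB - model_memory_mb

model_mb = 16_571      # memory already consumed by the semantic model
command_mb = 9_031     # memory the refresh command itself needed

limit_mb = effective_command_limit(model_mb)
print(f"Effective limit for the command: {limit_mb:,.0f} MB")
print(f"Command needs {command_mb:,} MB -> {'OK' if command_mb <= limit_mb else 'out of memory'}")

# 25,600 - 16,571 = 9,029 MB is left, but the command needs 9,031 MB,
# so the refresh fails even though the model itself is well under 25 GB.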
You can set the following XMLA property in the request's PropertyList to adjust the effective memory limit:
<PropertyList>
...
<DbpropMsmdRequestMemoryLimit>...</DbpropMsmdRequestMemoryLimit>
...
</PropertyList>
You can refer to the links below to learn more:
Troubleshoot XMLA endpoint connectivity in Power BI - Power BI | Microsoft Learn
DbpropMsmdRequestMemoryLimit Element (XMLA) | Microsoft Learn
Best Regards
Jianpeng Li
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Thank you, Jianpeng Li,
That actually makes sense. For now I will clear the dataset before refreshing, and I am going to tune the dataset to reduce its size; there is a lot of clutter in the tables.
Regards,
Bart
A small update on how I resolved this issue for now, though I would still like to know why an exception is thrown at memory usage below 25 GB.
For now I'm using an ADF pipeline that first clears the dataset with the Power BI REST API (https://learn.microsoft.com/en-us/rest/api/power-bi/datasets/refresh-dataset)
with the body: {"type":"clearValues"}
This clears the dataset, and after that I refresh the dataset with the same call and an empty body.
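To make that workaround concrete, here is a minimal Python sketch of the same two REST calls. It assumes you already have an Azure AD access token with the Dataset.ReadWrite.All scope (token acquisition, e.g. via a service principal, is outside this snippet), and the workspace and dataset IDs are placeholders; in ADF the equivalent would typically be Web activities against the same endpoint.

# Sketch of the clearValues-then-refresh workaround, assuming a valid token.
import time
import requests

ACCESS_TOKEN = "<bearer token>"      # placeholder
GROUP_ID = "<workspace id>"          # placeholder
DATASET_ID = "<dataset id>"          # placeholder

BASE = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/datasets/{DATASET_ID}"
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}", "Content-Type": "application/json"}

def trigger_refresh(body: dict | None) -> None:
    """POST to the Refresh Dataset In Group endpoint (returns 202 Accepted)."""
    resp = requests.post(f"{BASE}/refreshes", headers=HEADERS, json=body)
    resp.raise_for_status()

def wait_until_done(poll_seconds: int = 60) -> None:
    """Poll the refresh history until the latest refresh is no longer in progress."""
    while True:
        history = requests.get(f"{BASE}/refreshes?$top=1", headers=HEADERS)
        history.raise_for_status()
        status = history.json()["value"][0]["status"]
        if status != "Unknown":      # "Unknown" = refresh still in progress
            print(f"Refresh finished with status: {status}")
            return
        time.sleep(poll_seconds)

# 1. Clear the in-memory data so the model's footprint no longer eats into
#    the effective memory limit of the next command.
trigger_refresh({"type": "clearValues"})
wait_until_done()

# 2. Reload the data by calling the same endpoint with an empty body.
trigger_refresh(None)
wait_until_done()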