Hello, community,
I need some help understanding this error. I purchased a Fabric F2 capacity to test a data cube I built, and when trying to refresh the data cube, I encountered the following message:
Currently, the cube occupies 234 MB of memory. To plan the processing and calculations, I gathered the following information (screenshots attached):
- F2 capacity and its limitations
- The cube
- Details of the developed model
Incremental refresh has been applied, and the data selection fetches only the last year.
I would like to understand both the error and how to perform the correct calculations, as I am quite confused by the capacity information provided in Microsoft’s documentation. Practical examples on how to perform the calculations and how to size properly would be very helpful.
Best regards,
Emerson 😀
F2 capacity has specific limits on how much memory datasets and processing operations can use at any given time. Microsoft documents a memory allocation limit for each capacity tier; for F2, the maximum memory per dataset is 3 GB. Even though your data cube is 234 MB on disk, once the dataset is loaded into memory and calculations or refreshes run, memory usage can increase significantly, often by a factor of 2-3 or more.
To put that in numbers: if your dataset is 234 MB and you run a single refresh, the memory needed could be around
234 MB × 2 = 468 MB
In other words:
Memory Required = Dataset Size × 2 × Number of Concurrent Refreshes
If multiple refresh operations happen concurrently, they compete for the same memory, which can cause this error.
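A quick sketch of that sizing rule as code. It assumes the 2× refresh multiplier and the 3 GB per-dataset F2 limit discussed in this thread; both are rules of thumb, not exact service behavior.

```python
# Hedged sketch: estimate whether a refresh fits in an F2 capacity.
# The 2x multiplier and the 3 GB F2 per-dataset limit come from the
# discussion above; both are rough guides, not exact service behavior.

F2_MEMORY_LIMIT_MB = 3 * 1024  # 3 GB per-dataset limit on F2

def refresh_memory_mb(dataset_size_mb: float,
                      concurrent_refreshes: int = 1,
                      multiplier: float = 2.0) -> float:
    """Rough peak-memory estimate: size x multiplier x concurrent refreshes."""
    return dataset_size_mb * multiplier * concurrent_refreshes

def fits_on_f2(dataset_size_mb: float, concurrent_refreshes: int = 1) -> bool:
    """Check the estimate against the F2 per-dataset limit."""
    return refresh_memory_mb(dataset_size_mb, concurrent_refreshes) <= F2_MEMORY_LIMIT_MB

print(refresh_memory_mb(234))   # -> 468.0 MB, well under 3 GB
print(fits_on_f2(234))          # -> True for a single refresh
print(fits_on_f2(234, 7))       # -> False: 234 * 2 * 7 = 3276 MB > 3072 MB
```

So a single refresh of a 234 MB model should fit comfortably; the limit only becomes a problem when several refreshes (or other memory-hungry operations) overlap.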
Hi AmiraBedh, thanks for your concern about this issue. Your answer is excellent! I would like to share some additional points below.
Hello @Xxx_Userpwbi, I am glad to help you. Thank you very much, AmiraBedh, for the advice; she gave great examples. Here are some additions I have for you.
You mentioned that the limit for F2 SKUs is 3 GB of RAM, which matches the documentation. But that is not quite how the limit plays out in practice. The 3 GB memory limit is the total amount of memory the Power BI service allocates to your dataset.
This capacity is consumed by:
The actual file size: e.g. your data cube file (you mentioned the size of the files being processed).
Memory required by the service: the service itself takes up memory when performing operations such as refreshes and calculations.
In addition, the system services bundled with the capacity (for example, refresh operations and editing the semantic model online in the service) also take up part of the capacity's memory.
Meanwhile, a full refresh (refreshing the whole dataset at once) generally needs at least twice the size of the dataset in memory.
This is why it is generally recommended to configure incremental refresh, where the report supports it, to reduce capacity usage.
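To illustrate why incremental refresh helps, here is a sketch comparing the peak memory of a full refresh against an incremental refresh. It assumes the ~2×-of-refreshed-data rule of thumb from above and a hypothetical dataset split into 12 equal monthly partitions; real partition sizes and overheads will differ.

```python
# Sketch: compare peak memory of a full refresh vs an incremental refresh.
# Assumptions (not exact service behavior): refreshing data takes ~2x its
# size in memory, and the dataset splits into 12 equal monthly partitions.

DATASET_MB = 234
PARTITIONS = 12  # hypothetical one-year rolling window, monthly partitions

def full_refresh_peak_mb(dataset_mb: float) -> float:
    # A full refresh reprocesses everything: ~2x the whole dataset.
    return 2 * dataset_mb

def incremental_refresh_peak_mb(dataset_mb: float, partitions: int) -> float:
    # Incremental refresh reprocesses only the newest partition,
    # so ~2x one partition on top of the data already resident.
    return dataset_mb + 2 * (dataset_mb / partitions)

print(full_refresh_peak_mb(DATASET_MB))                     # -> 468.0 MB
print(incremental_refresh_peak_mb(DATASET_MB, PARTITIONS))  # -> 273.0 MB
```

The incremental path keeps the peak much closer to the resident model size, which is exactly what you want on a small SKU like F2.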
In summary, you need to consider the following scenarios:
Memory Limit: Although the total memory limit for F2 SKUs is 3GB, this includes memory usage for all concurrent operations. If there are other background processes or services running, they will also take up some memory.
Memory spikes: During a refresh, memory usage may peak momentarily and exceed the available memory, even though the total memory limit is 3GB.
You should optimize your reports, refresh schedules, and related settings according to your actual situation to reduce refresh failures caused by insufficient capacity.
I hope the following information is helpful to you.
URL:
Plan your capacity size - Microsoft Fabric | Microsoft Learn
Optimizing Microsoft Fabric: Identifying and Managing Capacity SKUs - Intellify Solutions
I hope my suggestions give you good ideas, if you have any more questions, please clarify in a follow-up reply.
Best Regards,
Carson Jian,
If this post helps, please consider accepting it as the solution to help other members find it more quickly.
Hello @AmiraBedh ,
Thank you for your response. However, I still have some doubts. This capacity has been exclusively dedicated to this data cube, and it attempts to refresh once a day. Wouldn't the limitation of the F2 SKU be 3GB of memory? Perhaps I did not interpret Microsoft's documentation correctly.