I have a workspace assigned to a Premium capacity (A1), but dataset refresh is not working (it was working before the capacity was assigned).
Data source error: Resource Governing: This operation was canceled because there wasn’t enough memory to finish running it. Either increase the memory of the Premium capacity where this dataset is hosted or reduce the memory footprint of your dataset by doing things like limiting the amount of imported data. More details: consumed memory 2386 MB, memory limit 2360 MB, database size before command execution 711 MB. Learn more, see https://go.microsoft.com/fwlink/?linkid=2159753.
Cluster URI: WABI-US-WEST2-redirect.analysis.windows.net
Activity ID: 522a4161-7c51-433a-bd39-dab4fc01fad8
Request ID: e2b680d6-cb17-4ee1-f029-8a1be7c4c5de
Time: 2022-09-06 16:46:41Z
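As a quick, hedged sanity check on the numbers in that error (nothing here is an official formula), the refresh peaked at roughly 3.4x the reported model size and went past the capacity's effective limit by about 26 MB:

```python
# Quick sanity check on the figures in the error message above
# (all values in MB, copied from the "Resource Governing" error).
dataset_size = 711   # database size before command execution
consumed = 2386      # memory consumed when the refresh was cancelled
limit = 2360         # memory limit reported for the A1 capacity

print(f"Over the limit by {consumed - limit} MB")                              # -> 26 MB
print(f"Peak memory was about {consumed / dataset_size:.1f}x the model size")  # -> ~3.4x

# Rule of thumb (an assumption, not taken from the error itself): plan for at
# least 2x-2.5x the model size for a full refresh, plus headroom for queries
# and any other datasets on the same capacity; this refresh peaked near 3.4x.
```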
Hi @Flowy ,
Please note the A1 SKU limit on dataset memory:
Capacity and SKUs in Power BI embedded analytics - Power BI | Microsoft Docs
Refresh is a background operation that can only take place when two conditions are met: there is enough memory and there is enough CPU available on the capacity.
When those conditions are not met, the refresh is queued until they are.
For a full refresh, recall that at least double the current dataset memory size is required. If sufficient memory is not available, the refresh cannot start until model eviction frees up memory - that means waiting until one or more other datasets become inactive and can be evicted.
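As a minimal illustration of that sizing rule (not an official calculation), the sketch below compares a dataset's size against the documented per-SKU memory limits, assuming the "at least double the dataset size" rule of thumb; the SKU_MEMORY_GB table, the function name, and the 2x-2.5x factor are illustrative assumptions.

```python
# Illustrative helper (an assumption, not an official API): estimates whether a
# full refresh of a dataset is likely to fit under an embedded SKU's memory limit,
# using the per-SKU limits from the "Capacity and SKUs" doc linked above and the
# "at least double the dataset size" rule of thumb.
SKU_MEMORY_GB = {"A1": 3, "A2": 5, "A3": 10, "A4": 25, "A5": 50, "A6": 100}

def full_refresh_fits(dataset_size_mb: float, sku: str, factor: float = 2.5) -> bool:
    """Optimistic check: ignores memory used by queries and other datasets."""
    limit_mb = SKU_MEMORY_GB[sku] * 1024
    return dataset_size_mb * factor <= limit_mb

# Example with the 711 MB dataset from the error above. Note that the error
# reported an effective limit of 2360 MB on A1 (below the nominal 3072 MB),
# so leave extra headroom beyond what this optimistic check suggests.
for sku in ("A1", "A2"):
    print(sku, "likely fits" if full_refresh_fits(711, sku) else "too tight")
```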
For more information, please refer to: Optimize Microsoft Power BI Premium capacities - Power BI | Microsoft Docs
Please check whether there is any memory remaining on the capacity that hosts the workspace.
Best regards,
Yadong Fang
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
An A1 SKU only allows for very small datasets - 3 GB max. You generally need 2-2.5x the memory of your dataset so it can refresh: for a brief period the dataset effectively doubles in size, because the old data isn't discarded until the newly refreshed data can be completely swapped in. That way you lose nothing during a failed refresh. How much extra memory is needed also depends on how much of the model is covered by incremental refresh: the more data you keep in historical partitions, the less RAM the refresh takes, because those partitions aren't reprocessed during a refresh.
Capacity and SKUs in Power BI embedded analytics - Power BI | Microsoft Docs
I am honestly not sure why it was working before, if the workspace was previously on Pro - that has a 1 GB per-dataset limit. Is it possible it was in a Premium Per User (PPU) workspace? That has a 100 GB dataset size limit.
Regardless, a dataset whose refresh needs nearly 3 GB of memory will not work well in an A1 capacity.
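To illustrate the incremental refresh point above, here is a minimal sketch that refreshes only a single recent partition through the Power BI enhanced refresh REST endpoint, so historical partitions are not reprocessed and peak refresh memory stays lower. The workspace/dataset GUIDs, the table and partition names, and the token are placeholders, and whether partition-level refresh helps depends on how the model's refresh policy is set up.

```python
# Sketch (assumptions flagged): refresh only one recent partition via the
# Power BI enhanced refresh endpoint instead of running a full-model refresh.
# GROUP_ID, DATASET_ID, TOKEN, and the table/partition names are placeholders.
import requests

GROUP_ID = "<workspace-guid>"     # placeholder
DATASET_ID = "<dataset-guid>"     # placeholder
TOKEN = "<aad-access-token>"      # placeholder: obtain via MSAL / service principal

url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
       f"/datasets/{DATASET_ID}/refreshes")

body = {
    "type": "full",
    "commitMode": "transactional",
    # Only the current-period partition is reprocessed; the historical partitions
    # created by the incremental refresh policy are left untouched.
    "objects": [{"table": "Sales", "partition": "Sales-2024"}],
}

resp = requests.post(url, json=body, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()
print("Refresh request accepted, status:", resp.status_code)  # expect 202 Accepted
```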
DAX is for Analysis. Power Query is for Data Modeling
Proud to be a Super User!
MCSA: BI Reporting
We actually have the same experience with Pro having no memory limit during dataset refreshes. We are currently thinking about switching back from Premium/Fabric to Pro, because Pro doesn't have a memory limit and doesn't need to be micro-managed to provide enough memory before a refresh (scaling up before a refresh and scaling down afterwards). It's also way cheaper for us in our setup (included in our partner subscription).
I read that Microsoft uses shared resources for Pro workspaces, so there is no limit on how much memory is consumed by all the datasets combined, as long as each one is smaller than 1 GB in storage size. So you can refresh as many as you want at the same time.
PPU might be a better alternative, as it is equivalent to a P3 capacity for memory limits (100 GB datasets, for example) and is also shared, but you get the advantages of Premium features like deployment pipelines, XMLA endpoint access, etc.
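For the "scale up before a refresh, scale down afterwards" pattern mentioned above, a hedged sketch using the Azure Resource Manager REST API for Power BI Embedded (Microsoft.PowerBIDedicated) capacities might look like the following; the resource IDs and token are placeholders, and the api-version is an assumption to verify against the ARM reference.

```python
# Sketch (assumptions flagged) of scaling a Power BI Embedded capacity up before a
# heavy refresh and back down afterwards, via the Azure Resource Manager REST API
# for Microsoft.PowerBIDedicated capacities. The resource path, token, and
# api-version are placeholders/assumptions to check against the ARM docs.
import requests

ARM_TOKEN = "<azure-management-token>"   # placeholder: ARM-scoped AAD token
CAPACITY_ID = ("/subscriptions/<sub-id>/resourceGroups/<rg>"
               "/providers/Microsoft.PowerBIDedicated/capacities/<capacity-name>")
API_VERSION = "2021-01-01"               # assumed; check the current ARM api-version

def set_sku(sku_name: str) -> None:
    """PATCH the capacity to the requested A SKU (e.g. 'A1', 'A3')."""
    url = f"https://management.azure.com{CAPACITY_ID}?api-version={API_VERSION}"
    body = {"sku": {"name": sku_name, "tier": "PBIE_Azure"}}
    resp = requests.patch(url, json=body,
                          headers={"Authorization": f"Bearer {ARM_TOKEN}"})
    resp.raise_for_status()

set_sku("A3")   # scale up so the full refresh has enough memory
# ... trigger the dataset refresh and wait for it to finish here ...
set_sku("A1")   # scale back down to control cost
```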