Join us at FabCon Atlanta from March 16 - 20, 2026, for the ultimate Fabric, Power BI, AI and SQL community-led event. Save $200 with code FABCOMM.
Register now! The Power BI Data Visualization World Championships is back! Get ahead of the game and start preparing now! Learn more
I am receiving the following error during my semantic model refresh.
Error:
Data source error: Resource Governance: This operation was canceled because there wasn't enough memory to finish running it. Either reduce the memory footprint of your dataset by doing things such as limiting the amount of imported data, or if using Power BI Premium, increase the memory of the Premium capacity where this dataset is hosted. More details: consumed memory 24523 MB, memory limit 24523 MB, database size before command execution 1076 MB. See linkid=2159753 to learn more.
VertiPaq model size: 923 MB.
I have other semantic models with significantly larger VertiPaq row counts, on Power BI Premium capacity, that consume far less memory and refresh successfully.
For this model, I have already tried increasing the semantic model's memory limit and reducing parallel table refresh operations to 6, but the refresh still fails due to memory limits.
What could be causing this refresh to fail despite having a smaller VertiPaq footprint than my other models?
Request ID execution details:
{
"timeStart": "2025-12-20T22:23:11.8380000Z",
"timeEnd": "2025-12-20T22:28:54.1830000Z",
"durationMs": 342345,
"externalQueryExecutionTimeMs": 204084,
"vertipaqJobCpuTimeMs": 118188,
"mEngineCpuTimeMs": 567016,
"totalCpuTimeMs": 1521250,
"executionDelayMs": 259,
"approximatePeakMemConsumptionKB": 25111612, (≈24 GB)
"mEnginePeakMemoryKB": 1214748, (≈1 GB)
"tabularConnectionTimeoutMs": 43200000,
"commandType": "Batch",
"refreshParallelism": 6,
"vertipaqTotalRows": 137030866, (≈137 million)
"intendedUsage": 2,
"errorCount": 5
}
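As a quick sanity check on the numbers above (a small sketch, using only the figures quoted in this thread): the stats report peak memory in KB while the error message reports the limit in MB, and converting shows the refresh hit the capacity ceiling almost exactly, at a large multiple of the stored model size.

```python
# Figures quoted in this thread
peak_kb = 25_111_612   # approximatePeakMemConsumptionKB from the refresh stats
limit_mb = 24_523      # memory limit from the error message
model_mb = 923         # stored VertiPaq model size

# Convert KB -> MB and compare against the reported limit and model size
peak_mb = peak_kb / 1024
print(f"peak refresh memory: {peak_mb:,.0f} MB (limit {limit_mb:,} MB)")
print(f"peak-to-model ratio: {peak_mb / model_mb:.1f}x the stored model size")
```

So the refresh needed roughly 26x the stored model size, which is why the final VertiPaq footprint alone doesn't predict whether a refresh fits in memory.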
Hi @HaidersBSKY

This usually isn't about the final VertiPaq size but the peak memory needed during refresh. While refreshing, Power BI temporarily holds both the old and new copies of tables, expands dictionaries, rebuilds relationships, and processes partitions in parallel, which can push memory far beyond the stored model size. Models with high-cardinality columns, large text fields, many calculated columns, or complex relationships can spike memory even when the VertiPaq footprint looks small.

To mitigate:
- Reduce refresh parallelism further.
- Remove unnecessary columns, especially free-text and ID columns.
- Lower column cardinality where possible.
- Avoid calculated columns; push that logic upstream where you can.
- Consider incremental refresh so only recent partitions are reprocessed.

Also make sure the capacity has enough headroom at refresh time, since other concurrent workloads can cause a refresh to fail even for smaller models.
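If reducing parallelism in the model settings isn't enough, the Power BI enhanced refresh REST API lets you cap it per refresh request. A minimal sketch of building such a request (the workspace ID, dataset ID, and bearer token are placeholders, and `maxParallelism=2` is just an illustrative value, not a recommendation from this thread):

```python
import json

def build_refresh_request(max_parallelism: int,
                          commit_mode: str = "transactional") -> dict:
    """Build an enhanced-refresh request body with capped parallelism."""
    return {
        "type": "full",                     # full reprocess of data and structures
        "commitMode": commit_mode,          # all-or-nothing commit
        "maxParallelism": max_parallelism,  # cap concurrent refresh jobs
        "retryCount": 1,
    }

body = build_refresh_request(max_parallelism=2)
url = ("https://api.powerbi.com/v1.0/myorg/groups/<workspace-id>"
       "/datasets/<dataset-id>/refreshes")
print(url)
print(json.dumps(body, indent=2))
# Send with e.g. requests.post(url, json=body,
#                              headers={"Authorization": "Bearer <token>"})
```

Triggering refresh this way also returns per-request execution details like the ones posted above, so you can compare peak memory before and after lowering parallelism.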
Alright, thanks. I will try removing unnecessary columns and reducing refresh parallelism further.