I've got a dataset in my Power BI workspace that, according to "Bravo for Power BI", is 952 MB.
The workspace is assigned to a Power BI Embedded A2 capacity (memory limit 5 GB). The dataset often fails to refresh with this error:
"More details: consumed memory 6147 MB, memory limit 4059 MB, database size before command execution 1060 MB."
My report uses incremental refresh.
What are possible root causes for the error?
We've also tried with a smaller dataset, and there it's even worse. How can that happen?
"More details: consumed memory 5270 MB, memory limit 5098 MB, database size before command execution 21 MB."
@FilipK, it could be caused by the transformations applied in Power Query or by your incremental refresh configuration.
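If the incremental refresh setup is the issue, one thing worth trying is the enhanced refresh option of the Power BI REST API, which lets you limit how many partitions are processed in parallel; refreshing fewer partitions at once lowers the peak memory during refresh, at the cost of a longer refresh. A minimal sketch below, assuming placeholder workspace/dataset IDs and access token (you would supply real values from your tenant):

```python
# Sketch: trigger an enhanced refresh via the Power BI REST API with reduced
# parallelism to lower peak memory. GROUP_ID, DATASET_ID, and the bearer
# token are placeholders, not real values.
import json
import urllib.request

GROUP_ID = "<workspace-id>"    # placeholder
DATASET_ID = "<dataset-id>"    # placeholder

url = (
    "https://api.powerbi.com/v1.0/myorg/"
    f"groups/{GROUP_ID}/datasets/{DATASET_ID}/refreshes"
)

# Enhanced-refresh options: process one partition at a time and honor the
# incremental refresh policy, trading refresh speed for a smaller memory peak.
payload = {
    "type": "full",
    "commitMode": "transactional",
    "maxParallelism": 1,
    "applyRefreshPolicy": True,
}

req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer <access-token>",  # placeholder
    },
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment with real IDs and a valid token
print(json.dumps(payload))
```

You can also watch the refresh status afterwards by calling GET on the same `refreshes` endpoint.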
@SivaMani, thanks for the quick response. I checked all of that, but everything seems to be fine.
I don't understand why increasing SQL performance would help overcome the issue. What is the relation between the memory error and SQL performance? I don't get it.