I've got a dataset in my Power BI workspace that, according to "Bravo for Power BI", is 952 MB. The workspace is assigned to a Power BI Embedded A2 capacity (memory limit 5 GB). The dataset frequently fails to refresh with this error:
"More details: consumed memory 6147 MB, memory limit 4059 MB, database size before command execution 1060 MB."
My report uses incremental refresh.
What are possible root causes for the error?
We've tried with a smaller dataset, and there it is even worse. How can that happen?
"More details: consumed memory 5270 MB, memory limit 5098 MB, database size before command execution 21 MB"
@FilipK, it could be due to the transformations applied in Power Query or to your incremental refresh configuration.
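One way to see how a 952 MB (or even a 21 MB) dataset can exceed a 4-5 GB limit: during a full refresh the existing copy of the data stays queryable while a new copy is built, and the raw source rows are buffered in memory while Power Query transformations run. Peak memory is therefore driven largely by the uncompressed source volume flowing through the refresh, not by the final compressed dataset size. A minimal sketch of that mental model (the function and the uncompressed-volume figures are hypothetical, for illustration only):

```python
def peak_refresh_memory_mb(compressed_mb: float, uncompressed_source_mb: float) -> float:
    """Very rough peak-memory estimate for a full refresh.

    Assumption (illustrative, not from Power BI documentation):
    - the old compressed copy stays in memory for queries,
    - a new compressed copy is built alongside it,
    - raw source rows are buffered while transformations run.
    """
    return compressed_mb * 2 + uncompressed_source_mb


limit_mb = 4059  # effective limit reported in the first error message

# Hypothetical scenarios matching the two errors in this thread:
scenarios = [
    ("952 MB dataset, heavy source volume", 952, 4000),
    ("21 MB dataset, non-folding transformations", 21, 5000),
]
for label, compressed, raw in scenarios:
    peak = peak_refresh_memory_mb(compressed, raw)
    status = "OVER" if peak > limit_mb else "within"
    print(f"{label}: est. peak {peak:.0f} MB ({status} the {limit_mb} MB limit)")
```

Under these assumptions the second error is no longer surprising: a dataset that compresses down to 21 MB can still pull gigabytes of raw rows through a transformation that does not fold to the source, so the "smaller" dataset can consume more memory during refresh than the larger one.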
@SivaMani, thanks for the quick response. I checked all of that, but everything seems fine.
I don't understand why improving SQL performance would help overcome the issue. What is the relation between the memory error and SQL performance? I don't get it.