I want to identify datasets on our Premium capacity (P2) that are at risk of failing, because they are getting too large for the capacity.
In the Power BI Premium "Capacity Metrics" app it is possible to view the size of a Power BI "artifact" (dataset) relative to the "Dataset Refresh Limit" (see the yellow line in the screenshot below). Based on this Guy in a Cube video, I am assuming that artifact size vs. dataset refresh limit is a good ratio for assessing the risk of a Power BI refresh failing (or for pre-emptively identifying datasets that need optimizing).
Limitations of the Capacity Metrics app include:
Therefore, my question is: is there an alternative way to run a query to obtain the memory/RAM consumption of a published Power BI report?
I have read this post, which suggests that running a Dynamic Management View (DMV) query in DAX Studio against a Desktop PBIX file can return the size of columns and tables, for example:
SELECT dimension_name AS tablename,
       attribute_name AS columnname,
       datatype,
       (dictionary_size / 1024) AS size_kb
FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMNS
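One caveat I have since noticed: dictionary_size only covers the column dictionaries, not the compressed segment data, so it understates the total size. As a sketch (assuming the same DAX Studio / DMV setup; the column names are from the documented DMV schema), the segment-level DMV can be queried alongside it:

SELECT table_id,
       column_id,
       segment_number,
       (used_size / 1024) AS used_kb
FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS

Summing used_size together with the dictionary sizes should give a closer (though still approximate) picture of the in-memory footprint.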
I can also run this same DMV against the XMLA endpoint (Tabular server), but I am unclear whether this would provide the "true" memory footprint / artifact size of the Power BI dataset in the Service. For example, both the memory/RAM usage linked to the
Can anyone advise if there is a way to run a query to obtain the memory/RAM consumption of a published Power BI report?
Thanks in Advance
Screenshot from the Premium Capacity Metrics app showing artifact size
The DMV is the best you can do at the moment. But keep in mind that compression plays a role too, and the memory usage in the capacity can and will be different from the dataset file size.
Check the eviction stats. Also, there is a newfangled grey UI bar (black font on grey background, yay!!!) that indicates if a dataset is becoming too large for the capacity.
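If you want a whole-model view rather than per-column detail, one option worth trying (assuming the XMLA endpoint exposes it, which I have not verified for every capacity type) is the memory-usage DMV, which reports shrinkable and non-shrinkable memory per object:

SELECT object_parent_path,
       object_id,
       object_memory_shrinkable,
       object_memory_nonshrinkable
FROM $SYSTEM.DISCOVER_OBJECT_MEMORY_USAGE

The shrinkable portion is roughly what the engine can evict under memory pressure, which ties in with the eviction stats.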
Thanks @lbendlin. Yes, it appears the DMV only provides a partial picture of the total memory usage. I was looking to replicate the equivalent view that the Capacity Metrics app produces. It seems the app provides the same view of RAM and CPU usage in the Power BI Service that Task Manager provides on the desktop. When you say "eviction stats", do you mean the "Evidence" tab in the Capacity Metrics app (see below)?
No, I mean evictions. When a dataset gets kicked out of memory to make space for another one.