Hi All,
I was just wondering if you could offer some advice. I'm currently creating reports that link to a massive Power BI dataset (a nearly 1 GB .pbix file) as a data source. When I create tables using this dataset, which often hold a significant amount of data, I get the error "Visual has exceeded available resources" (especially if I add a measure to that table).
Guy in a Cube semi-recently did a video on two ways to speed up how long it takes to refresh a dataset. Would a similar approach of slimming down the dataset I'm linking to in my report make these "Visual has exceeded available resources" errors less likely to happen? If not, is there another solution (apart from filtering the report more or getting a Power BI Premium licence) that can regularly be used to solve this problem?
Thanks for your help,
Mark
Is this the message that you get? Does it include the part highlighted in red?
"Visual has exceeded available resources" from Power BI dataset source
If it doesn't, then try this article.
https://docs.microsoft.com/en-us/power-bi/visuals/power-bi-data-points
Hi Mariusz,
Thanks for getting back to me. This is the full error message we are getting if that helps:
This visual has exceeded the available resources. Try filtering to decrease the amount of data displayed.
Please try again later or contact support. If you contact support, please provide these details.
More details: Resource Governing: The query exceeded the maximum memory allowed for queries executed in the current workload group (Requested 1048577KB, Limit 1048576KB)
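For what it's worth, the numbers in that message line up with a 1 GiB per-query memory cap, exceeded by a single kilobyte (a quick sanity check of the figures, assuming the "KB" values in the message are KiB):

```python
# Sanity check of the numbers from the Resource Governing error message.
# Assumption: "KB" in the message means KiB (1024-byte units).
requested_kb = 1048577  # what the query asked for, per the error
limit_kb = 1048576      # workload group limit, per the error

limit_gib = limit_kb / (1024 * 1024)   # convert KiB to GiB
over_by_kb = requested_kb - limit_kb   # how far the query went over

print(f"Limit: {limit_gib} GiB; query exceeded it by {over_by_kb} KiB")
# → Limit: 1.0 GiB; query exceeded it by 1 KiB
```

So the query is hitting a hard 1 GiB ceiling rather than being slightly misconfigured, which is why filtering (or reducing what the visual asks for) is the usual workaround.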
I had a look at the link you sent through. It is useful, but I couldn't really see steps I could follow for improving the performance. Do you have any advice?
Thanks again,
Mark
I think the problem is that you are trying to display too many data points at once, as per the error message.
You could look at solutions like drillthrough. This technique limits the number of records/data points you are looking at at any one moment and should remove the error.
https://www.youtube.com/watch?v=2x9lLHDbtDk
Hi Mariusz,
Thanks for that. We are already utilising drillthrough and still generate a table with 5000 rows, purely because of what our data source looks like. Do you think the changes I first suggested won't help solve the issue?
Does anyone have any suggestions for avoiding this error without Premium or additional filtering? Is performance made worse by using a Power BI dataset as a data source?