Hi,

Here is the error I am getting:
Data source error: Memory error: Memory Allocation failure . Try simplifying or reducing the number of queries.
Cluster URI: WABI-US-NORTH-CENTRAL-redirect.analysis.windows.net
Activity ID: 9df6579a-****-4ed0-a66b-08c77*******
Request ID: 5e604617-****-a47b-89f5-1a673*******
Time: 2021-09-08 21:26:29Z
The refresh in Power BI Desktop completes without error, but the scheduled refresh in the Service does not run at all and fails with the error above.
I have done the same setup for 15 other reports and they work fine (.csv data files from a SharePoint folder into the Power BI Service); the queries all follow the general pattern sketched below. Only this particular set gives the error, and I have not been able to find a solution. My workspace is also on Premium capacity.
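For reference, the pattern is: list the files on the SharePoint site, keep the .csv files, parse each one, and append them into one table. The site URL below is a placeholder for our real site:

```
let
    // List every file on the SharePoint site (placeholder URL).
    Source = SharePoint.Files("https://contoso.sharepoint.com/sites/Reporting", [ApiVersion = 15]),
    // Keep only the .csv files.
    CsvFiles = Table.SelectRows(Source, each [Extension] = ".csv"),
    // Parse each file and promote its header row.
    Parsed = Table.AddColumn(
        CsvFiles,
        "Data",
        each Table.PromoteHeaders(Csv.Document([Content]), [PromoteAllScalars = true])
    ),
    // Append all files into one table.
    Combined = Table.Combine(Parsed[Data])
in
    Combined
```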
Can anyone let me know what I should try in this case?
Thanks in advance.
Your issue sounds similar to one I saw in another post. I am pasting the response from @GilbertQ.
It could well be that creating this table (if the underlying data is large) consumes a lot of memory. This is because when you create a calculated table inside the Power BI dataset, it is stored uncompressed and can take up a lot of memory. The reason it can work on your desktop is that there you are limited only by the amount of RAM on your machine, which could be 8, 16, or 32 GB, whereas in the Power BI Service the memory available to a dataset is capped.
If the refresh succeeds after you remove the table, try re-creating it in Power Query instead; this allows the table to be compressed before it is loaded into the Power BI dataset. A sketch of that approach follows below.
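As a minimal illustration, here is a summary table built in Power Query rather than as a DAX calculated table, so it is compressed on load. The query name Sales and the columns Region, Product, and Amount are placeholders for whatever your calculated table actually summarizes:

```
let
    // Reference the existing query that already feeds your model.
    Source = Sales,
    // Group in Power Query instead of using SUMMARIZE/ADDCOLUMNS in DAX,
    // so the result is compressed when it is loaded into the dataset.
    Summary = Table.Group(
        Source,
        {"Region", "Product"},
        {{"TotalAmount", each List.Sum([Amount]), type number}}
    )
in
    Summary
```

If you use intermediate queries, disable their load so only the final table lands in the model.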
Things to check: (1) the size of the dataset you are loading; (2) your Premium capacity memory limit; (3) whether you can avoid calculated tables at the DAX level.
If this post helps, then please consider accepting it as the solution and giving Kudos!