sanchar123
Helper I

Refresh failure in services for SharePoint data

Hi 

Following is the error:

Data source error: Memory error: Memory Allocation failure . Try simplifying or reducing the number of queries.
Cluster URI: WABI-US-NORTH-CENTRAL-redirect.analysis.windows.net
Activity ID: 9df6579a-****-4ed0-a66b-08c77*******
Request ID: 5e604617-****-a47b-89f5-1a673*******
Time: 2021-09-08 21:26:29Z

 

The refresh in Power BI Desktop completes without error, but the scheduled refresh in the Service does not run at all and returns the error above.

 

I have done the same setup for 15 other reports and they work fine (.csv data files from a SharePoint folder to the Power BI Service). This particular set is giving me errors and I am not able to find a solution. My workspace is also on Premium capacity.

 

Can any of you let me know what I need to try in this case?

 

Thanks in advance.

1 ACCEPTED SOLUTION
ponnusamy
Solution Supplier

@sanchar123:

Your issue sounds similar to one I saw in another post.

I am pasting the response from @GilbertQ.

 

It could well be that creating this table (if the underlying data is large) causes it to consume a lot of memory.

This is because a calculated table created inside the Power BI dataset is uncompressed and can consume a lot of memory.

The reason it works on your desktop is that there you are only limited by the amount of RAM on your machine, which could be 8, 16, or 32 GB, whereas in the Power BI Service the available memory is limited.

If you remove the table and the refresh then succeeds, try re-creating the table in Power Query; this allows the table to be compressed before it goes into the Power BI dataset.
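
As a rough illustration of that suggestion, a grouped summary table could be rebuilt in Power Query (M) along the lines below. The Sales, CustomerID, and Amount names are hypothetical placeholders, not anything from this report.

// Hypothetical sketch: build the summary in Power Query instead of as a DAX calculated table,
// so the result is compressed like any other imported table when it is loaded into the dataset.
let
    // Reference an existing query (the name "Sales" is an assumption)
    Source = Sales,
    // Group by customer and total the amount, the kind of aggregation a DAX
    // SUMMARIZE/ADDCOLUMNS calculated table is often used for
    Grouped = Table.Group(
        Source,
        {"CustomerID"},
        {{"TotalAmount", each List.Sum([Amount]), type number}}
    )
in
    Grouped

Pasting something like this into a new blank query (via the Advanced Editor) produces the same shape of table, but it is evaluated and compressed during refresh like the rest of the imported data rather than being built as a calculated table afterwards.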

 

https://community.powerbi.com/t5/Service/Memory-Allocation-failure-Try-simplifying-or-reducing-the-n...

 

Check the following:
(1) The size of the dataset you are loading.
(2) Your Premium capacity memory limit.
(3) Avoid calculated tables at the DAX level.

 

If this post helps, then please consider accepting it as the solution and giving Kudos!

 
