Helper IV

There's Not Enough Memory To Complete This Operation

Hi All, 


I am loading data from SQL Server and trying to refresh the data for my tables. The refresh fails and the following message is reported:


"The operation could not be completed as you do not have enough memory. Please try again after you have enough memory."

Can somebody please help? FYI, I have 8 GB of RAM.

New Member

We are facing the same issue, and restarting the gateway machine resolves it. I think there is a memory leak on the Power BI (gateway) side.




Resolver IV

Kumar11109, try refreshing step by step: one refresh per click. I know that's not best practice, but if the problem persists it can help as a provisional workaround.


Individual refreshing works fine, but who will do that every month with 50 data sources = 50 clicks 🙂?

Maybe you're better off consolidating the data into a single data source first using SSIS.


Or if that doesn't work, try SSAS Tabular; you can then use SSIS to build each dataset and point Power BI at that.


If you haven't got any servers you can run SSIS/SSAS on, another option would be to extract the data to files. SSIS or R could do this for you, running from a PC. Then mash the files together in Power BI.



Thank you all for the replies. I am pretty sure they will be helpful if I encounter another such problem. I am now moving to a better PC; hopefully that will help.

What worked for me was freeing up memory and creating a view on SQL Server to download only the required data.
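The same idea — pulling only the required columns and rows — can also be expressed in Power Query (M), where both steps fold back to SQL Server. A minimal sketch, with hypothetical server, database, table, and column names:

```
let
    // Hypothetical server, database, and table names
    Source = Sql.Database("MyServer", "MyDatabase"),
    Sales  = Source{[Schema = "dbo", Item = "Sales"]}[Data],
    // Keep only the columns the report needs
    Slim   = Table.SelectColumns(Sales, {"OrderDate", "CustomerID", "Amount"}),
    // Keep only the rows needed; both steps fold into a single
    // SQL query, so the full table never leaves the server
    Recent = Table.SelectRows(Slim, each [OrderDate] >= #date(2023, 1, 1))
in
    Recent
```

A dedicated SQL view achieves the same thing on the server side and keeps the filtering logic in one place.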

Advocate I

I ran into the same problem, and I have 8 GB of RAM as well.

Once I figured out which of the tables was causing the problem, I loaded the model without that specific table. Then I opened the Query Editor and loaded just that table there.


Once you load a table in the Query Editor, you can edit the data before it is loaded into the data model. I filtered to only the data I needed (in my case it was a log table, so I kept only the past three years) and only then clicked "Apply".
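That filter step can be sketched in M — the table name "Logs" and the column [Timestamp] are hypothetical stand-ins for the log table described above:

```
let
    // "Logs" is the hypothetical table loaded in the Query Editor;
    // [Timestamp] is its datetime column
    Cutoff   = Date.AddYears(Date.From(DateTime.LocalNow()), -3),
    Filtered = Table.SelectRows(Logs, each Date.From([Timestamp]) >= Cutoff)
in
    Filtered
```

Because the filter is applied before clicking Apply, the excluded rows are never loaded into the data model.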


In my case it worked great.

Thanks to this tip, I could solve the problem I had been dealing with since last weekend. Many thanks for taking the time to leave it here almost 5 years ago!

Here are some of my memory tips for Power BI.


Power BI is definitely a memory hog. I use it with a 16 GB machine; 8 GB works for most of our users. Restart if you get problems, and make sure your memory has been released, which can take a while. Close other memory hogs like browsers.


Obviously, remove unneeded columns.


Aggregate in SQL or M to a lower grain before you load if you don't need all the detail. 
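As a sketch of the M version of that aggregation (all table and column names here are hypothetical), collapsing order-line detail to one row per customer per month before load:

```
let
    // Collapse hypothetical order-line detail to one row per
    // customer per month before it is loaded into the model
    Monthly = Table.Group(
        Sales,
        {"CustomerID", "YearMonth"},
        {
            {"TotalAmount", each List.Sum([Amount]), type number},
            {"OrderLines", each Table.RowCount(_), Int64.Type}
        }
    )
in
    Monthly
```

Against a foldable source such as SQL Server, Table.Group typically folds to a GROUP BY, so the detail rows never reach Power BI.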


Again in SQL or M, filter out rows you don't need. Maybe only bring in the last x days/months of data.


Aim for a star schema if possible, joining on ID keys. Try adding a date dimension with your reporting month and joining it to your dates.
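A date dimension like that can be generated entirely in M; this is a minimal sketch, with the date range and the "ReportingMonth" column name chosen as assumptions:

```
let
    // A minimal date dimension over a hypothetical reporting range
    StartDate = #date(2015, 1, 1),
    DayCount  = Duration.Days(#date(2024, 12, 31) - StartDate) + 1,
    Dates     = List.Dates(StartDate, DayCount, #duration(1, 0, 0, 0)),
    AsTable   = Table.FromList(Dates, Splitter.SplitByNothing(), {"Date"}),
    Typed     = Table.TransformColumnTypes(AsTable, {{"Date", type date}}),
    // Reporting month to join fact tables on, e.g. "2024-04"
    WithMonth = Table.AddColumn(Typed, "ReportingMonth",
                    each Date.ToText([Date], "yyyy-MM"), type text)
in
    WithMonth
```

Relate the fact tables to this one dimension rather than letting each date column carry its own hidden calendar.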


Turn off the Time Intelligence Auto Date/Time option if you don't need it, as it creates a hidden date dimension in the background for every date field, spanning from the smallest to the largest date.


Also reduce the precision of times if you don't need them to be highly accurate.

Ideally, load date and time separately, both as integers, e.g. minutes after midnight for the time.
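A sketch of that split in M — "Events" and the [EventDateTime] column are hypothetical names:

```
let
    // "Events" and [EventDateTime] are hypothetical; the time part
    // compresses far better as an integer with at most 1,440 distinct
    // values (minutes after midnight) than as a full datetime
    WithDate = Table.AddColumn(Events, "EventDate",
                   each Date.From([EventDateTime]), type date),
    WithMins = Table.AddColumn(WithDate, "MinutesAfterMidnight",
                   each Time.Hour(DateTime.Time([EventDateTime])) * 60
                      + Time.Minute(DateTime.Time([EventDateTime])), Int64.Type)
in
    WithMins
```

Removing the original datetime column afterwards keeps only the two cheap integer/date columns in the model.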


I read that sorting the data can help compression if the data is very large (> 1 million rows).


There are also ways to look at what is using storage in your model.


If your model is still too big, you may need to use SSAS Tabular on a server, or try Azure Analysis Services (which can import your Power BI model).


How big is the final Power BI file? I have a couple of files that are 0.5 GB.



Hi, I ran into the same problem when refreshing only 67 MB of data. Do you have any idea why it failed?

Does it always fail? What about after a reboot?


Ensure you are using the 64-bit version to make best use of memory.


Are you on the latest version?

It always fails, and there is a daily reboot. We use the 64-bit version on professional server hardware. It is a problem in Power BI, not in the hardware.

Latest version, yes, of course.

Are you doing anything complex? E.g. lots of pivots/unpivots, merges, duplication, and functions.


Are you able to reduce the datasets or disable some of the queries, to see if it will complete and isolate where the problem may be?


It should easily cope with 67 MB of data, so it might be worth logging a support ticket to get help.


If you have lots of queries, does it get stuck running them all? I have sometimes had to run them one at a time due to timeouts.


Are you using DirectQuery? If so, try the Query Reduction options.


Also try different options for the Data Cache (under Data Load in Settings), though I've never needed to change these.




Not applicable

Greetings, to me this seems to be a bug in the memory management of the widgets. I have done this in past versions with no ill effect, but in the past two versions it has become an issue.


I have a lot of memory available, and I am simply pulling a list of dimension objects (SSAS MD cube) into a simple table widget. All works well when returning fewer than 100 rows and 4 columns of data. When I add one more fact column, the memory issue happens. Pull that fact item out and add another dimension item (now at 5 columns), and the problem recurs.


Yes, when I filter the data down to a few records, it returns fine. However, the intended use of the UI is to let the end user filter down to get a question answered, which implies the initial rendering would not be filtered and should have the capacity, up to some real memory limit, to handle a large dataset. It also fails in a similar way when the report is deployed to PBIRS and viewed via the browser.

@Anonymous Were you able to resolve this issue? I have the same problem with the table widget.

Yes, you have to change the structure and logic of your data model. I am not able to explain it better in English, but try making new merged tables with some logic, and then turn off "Enable load" on the old data connections. Or ask a DAX professional how to solve it 🙂

This is the solution if you are having issues using CSV or Excel as the data source, in my experience. My data source is Excel with 95k data rows. Power BI Desktop was having issues loading the data and returning a memory error.


Solution: I trimmed the data down to 35k rows and the issue was fixed!


Question though: why can't Power BI handle 95k data rows? That is very small in data analytics. Is Power BI just designed to handle small amounts of data?


There is absolutely no such limit at all.  I have dealt with much larger datasets with a lot of ease.

Ashish Mathur

That's what I thought too, but after several tries it didn't work until we trimmed down to 35k data rows.


Power BI Bug?

For me, my laptop has 4 GB of RAM, and before I run Power BI there is about 2 GB left. My dataset was about 80 MB. The first 70 MB loaded smoothly, but on the last 10 MB it suddenly consumed all the RAM and told me there was not enough memory. I only applied several basic filter and sort steps. The OS is Windows 10 and Power BI is the latest version.
