
rlansing
Resolver I

How to Optimize Memory Usage?

Hello,

I am running Windows 8.1 Pro with Power BI Desktop 64-bit (July 2016 update). I have also installed the On-Premises Enterprise Gateway and am having issues with my scheduled refreshes. I believe what is happening is that Windows is running out of memory and forcing the mashup program to close. This not only breaks the gateway, forcing me to uninstall, reinstall, and recover it, but also prevents me from refreshing a few large reports.

I have multiple questions:

1. Is there something I can do with my Power BI query / data load to prevent this memory usage? The file itself, after it has finished refreshing, is around 274 MB, but obviously Power BI is using more RAM than that just to load the data from Excel files.

By the way: the data is weekly sales data, so there is a large table with product characteristic information and other tables with the actual weekly sales. I am cleaning and prepping the product characteristics a little before loading them into the data tables in Power BI, but I don't know which of these steps is causing the most memory usage.

2. Are there configurations I can make on my computer to alleviate this issue? I don't necessarily care about speed, as I can schedule a refresh at night, as long as it will complete. I have bumped up my virtual memory (page file) to 50GB.

3. Is this an issue with the gateway or mashup program?

 

Thank you for your help!

Bobby

5 REPLIES
v-qiuyu-msft
Community Support

Hi @rlansing,

 

1. Is there something I can do with my Power BI query / data load to prevent this memory usage? The file itself, after it has finished refreshing, is around 274 MB, but obviously Power BI is using more RAM than that just to load the data from Excel files.

By the way: the data is weekly sales data, so there is a large table with product characteristic information and other tables with the actual weekly sales. I am cleaning and prepping the product characteristics a little before loading them into the data tables in Power BI, but I don't know which of these steps is causing the most memory usage.

 

In your scenario, you can follow these two articles to optimize your data model and improve performance:

Data Import Best Practices in Power BI

Power BI Performance Tips and Techniques

 

 

2. Are there configurations I can make on my computer to alleviate this issue? I don't necessarily care about speed, as I can schedule a refresh at night, as long as it will complete. I have bumped up my virtual memory (page file) to 50GB.

 

In my opinion, it would be better to refresh the report when the server (which hosts the On-Premises Data Gateway and the data source) is not under heavy load.

 

3. Is this an issue with the gateway or mashup program?

 

You can collect and analyze performance counters associated with the On-Premises Data Gateway by following this article: Monitor On-Premises Data Gateways. You can also check the refresh history to find out why the scheduled refresh failed, then follow this article to troubleshoot the issue: Troubleshooting the On-Premises Data Gateway.

 

If you have any questions, please feel free to ask.

 

Best Regards,
Qiuyun Yu

 

Community Support Team _ Qiuyun Yu
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Thank you. I have used the documents to attempt to optimize the report. I think the issue may simply require more RAM, so we will increase it to 32GB and see if that helps. I will report back if I find a definitive problem in the report that can be fixed.

 

I do think there may be an issue with the gateway, though. When I refresh the report in Power BI Desktop, it only uses about 35% of RAM while the CPU and disk jump around, but when refreshing through the gateway, it maxes out RAM as well and at times causes this issue.

This issue has returned. Can anyone explain why importing 2-3GB of CSV data would fill 32GB of RAM in less than two minutes? Once the memory is full, Windows starts using the page file, which dramatically affects the speed. I have added the issue to the community forum, but I need to know what factors could cause this, especially when I am not importing anywhere near that amount of data. Switching from Excel files to CSVs has improved loading times when the RAM doesn't get too full, but the issue still exists for large datasets.

The issue has been solved! It was a problem with the query steps where two queries were merged based on text columns. In large datasets, I think it required too much memory to perform the table scan necessary to complete the full outer join. I switched it to match on a whole number instead; it still takes a little time, but the RAM usage is much more reasonable and the computer does not become unresponsive.
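For anyone hitting the same wall, the fix described above might look roughly like this in the Query Editor (a sketch in Power Query M; the query and column names WeeklySales, ProductInfo, and ProductID are made up for illustration):

```
// Merge on a whole-number key instead of a text column.
// Typing the key as Int64 on both sides keeps the join comparison cheap.
let
    Sales = WeeklySales,
    TypedSales = Table.TransformColumnTypes(Sales, {{"ProductID", Int64.Type}}),
    Merged = Table.NestedJoin(
        TypedSales, {"ProductID"},
        ProductInfo, {"ProductID"},
        "Product",
        JoinKind.LeftOuter
    )
in
    Merged
```

If neither table has a natural numeric key, Table.AddIndexColumn can create a surrogate key on the dimension table before merging.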

Hi @rlansing, another tip you might follow is to check your data types in the Data Model. I've noticed that a stray number that appears as Text in the Data Model consumes a lot more memory than it should. It is also worth considering making your "joins" in the data model, or rather migrating some of this work into there.
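As a sketch of that tip (the column names here are assumptions), explicitly setting types in the query prevents numeric columns from landing in the model as Text:

```
// Force column types so numbers are stored as numbers, not text.
let
    Source = WeeklySales,
    Typed = Table.TransformColumnTypes(Source, {
        {"Week", type date},
        {"Units", Int64.Type},
        {"Revenue", Currency.Type}
    })
in
    Typed
```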

Also, on the query side, some functions have the ability to "stream" data. I know at least one binary function that does that 🙂 BinaryFormat.Choice
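A minimal sketch of how BinaryFormat.Choice can stream data, following the length-prefixed pattern from the M documentation (the three-byte input here is just a toy example):

```
// The first byte says how many bytes follow; Choice picks the list format
// based on that value, so the binary can be read sequentially as a stream
// rather than buffered whole.
let
    listFormat = BinaryFormat.Choice(
        BinaryFormat.Byte,
        (length) => BinaryFormat.List(BinaryFormat.Byte, length)
    )
in
    listFormat(#binary({2, 10, 20}))  // first byte is 2, so two bytes are read
```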
