Anonymous
Not applicable

Calculate required memory

Hello Experts,

 

Can anyone tell me if there is a way to calculate how much memory is required to import and work with a dataset?

 

For example, suppose I want to import a dataset that contains 1 million rows. Is there a way to calculate, at least approximately, how much memory my machine will need?

 

Thank you in advance!

1 ACCEPTED SOLUTION
Anonymous
Not applicable

Hi @Anonymous,

Memory use depends on your data volume and on the complexity of your calculations. If your report contains DAX formulas that nest multiple iterator functions, they will hurt report performance and consume more memory during evaluation.

Optimizing nested iterators in DAX

Also, if you perform any heavy operations in the Query Editor, you can consider adding Table.Buffer/List.Buffer to reduce memory consumption.

How and When to use List & Table Buffer?
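A minimal sketch of the buffering pattern in Power Query M (the source, table, and column names here are hypothetical):

```
let
    // Hypothetical source; substitute your own connection
    Source = Sql.Database("MyServer", "MyDb"),
    Sales = Source{[Schema = "dbo", Item = "Sales"]}[Data],
    // Table.Buffer pins the table in memory so the steps below
    // reuse one copy instead of re-evaluating the source query
    BufferedSales = Table.Buffer(Sales),
    Sorted = Table.Sort(BufferedSales, {{"Amount", Order.Descending}}),
    Top100 = Table.FirstN(Sorted, 100)
in
    Top100
```

Note that buffering itself holds the whole table in memory, so it only pays off when later steps would otherwise re-read or re-evaluate the source repeatedly.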

Regards,

Xiaoxin Sheng


3 REPLIES

Kerrymr
Helper I

Hi Hendrti,

There are a lot of factors involved in how much memory is consumed and how fast visuals are refreshed.

I'm not sure there is a specific answer.

 

My current report utilizes a single table with 10 million rows, 13 columns, and 13 complex visuals on a single report page. 

It uses about 1GB of memory and all my visuals refresh in less than 3 seconds.
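As a very rough back-of-envelope check (purely illustrative: the average-bytes figure is an assumption, and VertiPaq compression, column cardinality, and data types all change the real number substantially), you can estimate an uncompressed footprint as rows × columns × average bytes per value:

```
let
    Rows = 10000000,          // 10 million rows
    Columns = 13,
    AvgBytesPerValue = 8,     // assumed average; varies widely by data type
    RawBytes = Rows * Columns * AvgBytesPerValue,
    ApproxMB = Number.Round(RawBytes / (1024 * 1024), 1)
in
    ApproxMB                  // about 992 MB uncompressed
```

That lands near the ~1GB figure mentioned above, though in practice compression can make the in-memory model far smaller than the raw estimate.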

 

When you have a small number of rows and columns, the quality of your model matters less.

As you build larger datasets and more complex models, more development time goes into streamlining the model for speed.

 

You can significantly reduce memory use (and increase speed) by eliminating unneeded columns from your model and by using DAX Studio to streamline your calculations.
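For example, removing columns early in the Query Editor (the file path and column names are hypothetical) keeps them out of the model entirely:

```
let
    // Hypothetical CSV source; substitute your own
    Source = Csv.Document(File.Contents("C:\data\sales.csv"), [Delimiter = ",", Encoding = 65001]),
    Promoted = Table.PromoteHeaders(Source),
    // Keep only the columns the report actually uses;
    // everything else never reaches the data model
    Trimmed = Table.SelectColumns(Promoted, {"Date", "Product", "Amount"})
in
    Trimmed
```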

 

I hope this helps some.

Kerry

 

Anonymous
Not applicable

Hello Kerry,

This was very helpful indeed, thank you!
