Hello Experts,
Can anyone tell me if there is a way to calculate how much memory is required to import and work with a dataset?
For example, suppose I want to import a dataset that contains 1 million rows. Is there any way to calculate, at least approximately, how much memory my machine will need?
Thank you in advance!
Hi @Anonymous,
It depends on your data volume and calculation complexity. If your report contains DAX formulas that nest multiple iterator functions, they will hurt report performance and consume more memory during evaluation.
Optimizing nested iterators in DAX
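To make the pattern concrete, here is a rough DAX sketch (the Sales/Product table and column names are just placeholders, and it assumes a normal relationship between the two tables):

-- Nested iterators: for every row of Sales, FILTER rescans the whole Product table.
Sales Amount Slow =
SUMX (
    Sales,
    Sales[Quantity]
        * SUMX (
            FILTER ( Product, Product[ProductKey] = Sales[ProductKey] ),
            Product[Unit Price]
        )
)

-- Single iterator: RELATED follows the existing relationship, so Sales is scanned only once.
Sales Amount Fast =
SUMX ( Sales, Sales[Quantity] * RELATED ( Product[Unit Price] ) )

Both forms return the same result when each Sales row matches exactly one Product row, but the second avoids the nested scan.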
BTW, if you have any heavy transformations in the query editor, you can consider adding Table.Buffer/List.Buffer so that an intermediate result is evaluated once and reused instead of being recomputed by later steps.
How and When to use List & Table Buffer?
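As a minimal M sketch of the buffering idea (the file path, column names, and steps are only placeholders; buffering mainly helps when the same intermediate result would otherwise be re-evaluated several times):

let
    Source = Csv.Document(File.Contents("C:\data\sales.csv"), [Delimiter = ",", Encoding = 65001]),
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    Typed = Table.TransformColumnTypes(Promoted, {{"ProductKey", Int64.Type}, {"Quantity", Int64.Type}}),
    // Buffer once: later steps that reference this table reuse the in-memory copy
    // instead of re-reading and re-parsing the source file.
    Buffered = Table.Buffer(Typed),
    Filtered = Table.SelectRows(Buffered, each [Quantity] > 0),
    Grouped = Table.Group(Filtered, {"ProductKey"}, {{"Total Qty", each List.Sum([Quantity]), type number}})
in
    Grouped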
Regards,
Xiaoxin Sheng
Hi Hendrti,
There are a lot of factors involved in how much memory is consumed and how fast visuals are refreshed.
I'm not sure there is a specific answer.
My current report utilizes a single table with 10 million rows, 13 columns, and 13 complex visuals on a single report page.
It uses about 1GB of memory and all my visuals refresh in less than 3 seconds.
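Just to put very rough numbers on that (an illustration only, scaled from the figures above; real memory use depends heavily on data types, column cardinality, and how well VertiPaq compresses the data):

1 GB / (10,000,000 rows × 13 columns) ≈ 8 bytes per stored value
1,000,000 rows × 13 columns × 8 bytes ≈ 100 MB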
When you have a small number of rows and columns, the quality of your model is less important.
As you build bigger datasets and more complex models, more development time goes into streamlining the model for speed.
You can significantly reduce your memory use (and increase the speed) by eliminating unneeded columns from your model and by using DAX Studio to streamline the calculations.
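For example, the column trimming can be done right in Power Query, before the data is ever loaded into the model (the file path and column names below are just placeholders):

let
    Source = Excel.Workbook(File.Contents("C:\data\sales.xlsx"), null, true),
    SalesSheet = Source{[Item = "Sales", Kind = "Sheet"]}[Data],
    Promoted = Table.PromoteHeaders(SalesSheet, [PromoteAllScalars = true]),
    // Keep only the columns the report actually uses; everything else is never loaded.
    Trimmed = Table.SelectColumns(Promoted, {"Date", "ProductKey", "Quantity", "Sales Amount"})
in
    Trimmed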
I hope this helps some.
Kerry
Hello Kerry,
This was very helpful indeed, thank you!