When working with advanced DAX code in tools like Power BI, a high-performance machine is crucial for smooth data processing and report development. Here's an updated list of ideal PC specs for developing big-data reports with heavy DAX code:
Less than 500,000 rows:
500,000 to 1 million rows:
1 million to 5 million rows:
5 million to 10 million rows:
Above 10 million rows:
These specifications are tailored to the demands of developing big-data reports with heavy DAX code in tools like Power BI. The combination of high RAM, a powerful multi-core processor, fast clock speeds, and ample storage ensures that your machine can efficiently process complex data transformations and calculations, giving you optimized performance when working with advanced DAX.
Hi Miguel Félix,
100 million rows on a computer with 32 GB of RAM and an i7 - maybe if you are only viewing the dataset.
I have used 32 GB on an i5 and it can handle dataset viewing.
I am talking about roughly 600 lines of DAX - measures and table code.
With 32 GB the refresh takes over 30 minutes.
So will my configuration above work, or are there alternatives to it for a large-scale dataset with many DAX calculations?
Thanks
Hi @PowerBI_LOVER,
Even if you go to the service, the refresh of 100M rows will take longer than 30 minutes, especially if you are doing a lot of transformations and building tables with DAX.
If you are trying to load 100M rows into Desktop, as I mentioned previously, as a best practice you should not do that.
The DAX measures have no impact on refresh, since they are only calculated at the time you call them in your visuals.
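For example, a measure like the one below (a minimal sketch; the Sales table and Amount column are assumed names) is only a stored definition - it is evaluated when a visual queries it, not while the dataset refreshes:

    // Hypothetical measure - evaluated at query time, not during refresh
    Total Sales = SUM ( Sales[Amount] )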
Although I understand what you want to achieve with this type of configuration, a large semantic model should be set up with only part of the data, with the full refresh pushed to the service.
On top of this, with such large semantic models you should also consider using incremental refresh or aggregation tables; this will reduce the refresh time and also improve performance when building your reports.
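As a rough illustration of the aggregation-table idea, here is a DAX calculated-table sketch; the Sales and 'Date' tables and their columns are assumptions, and in a truly large model you would usually build the aggregation upstream (or pair it with incremental refresh on the detail table), since a calculated table is still computed from the imported detail rows:

    // Hypothetical aggregation table grouped at Year/Month level
    Sales Agg =
    SUMMARIZECOLUMNS (
        'Date'[Year],
        'Date'[Month],
        "Total Amount", SUM ( Sales[Amount] ),
        "Row Count", COUNTROWS ( Sales )
    )

Visuals that only need monthly totals can then read the small aggregation table instead of scanning the full fact table.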
Regards
Miguel Félix
Proud to be a Super User!
Check out my blog: Power BI em Português

Hi @PowerBI_LOVER,
Not sure where you got this setup and number-of-rows comparison, but I can tell you from experience that I have worked with models of up to 100 million rows on a computer with 32 GB of RAM and an i7.
This is not only dependent on the computer's performance, but also on the way you build your model and the calculations you do.
Of course, if you are working with millions of rows the loading can take some time, but not the DAX.
However, as a best practice, if you have models with 10M rows I would not load everything into Power BI Desktop; I would use some parameters to crop the data and then do the full refresh on the service.
For the DAX measures, depending on what you are doing, I also suggest using external tools (Tabular Editor) to avoid the waiting time every time you confirm a formula in the DAX formula bar.
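Just to illustrate the cropping idea, here is a DAX calculated-table sketch that keeps a one-year slice for development; the names (Sales, OrderDate) are assumptions, and in practice the filter would normally be applied with a Power Query parameter before the data is loaded, so the full rows never reach Desktop:

    // Hypothetical development slice - in a real model, crop in Power Query instead
    Sales Dev Sample =
    CALCULATETABLE (
        Sales,
        Sales[OrderDate] >= DATE ( 2024, 1, 1 )
    )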
Regards
Miguel Félix
Proud to be a Super User!
Check out my blog: Power BI em Português