PowerBI_LOVER
New Member

PC Advice: Specification for Developing Dashboards for "Big Data" with Multiple DAX Codes

When working with advanced DAX code in tools like Power BI, a high-performance machine is crucial for smooth data processing and report development. Here's an updated list of ideal PC specs for developing Big Data reports with heavy DAX code:

  1. Less than 500,000 rows:

    • RAM Requirement: 64GB RAM
    • Processor Core Requirement: 8-core (e.g., Intel Core i7 or AMD Ryzen 7)
    • Processor Clock Speed: 3.0GHz or higher
    • Storage: 1TB NVMe SSD
  2. 500,000 to 1 million rows:

    • RAM Requirement: 128GB RAM
    • Processor Core Requirement: 12-core (e.g., Intel Core i9 or AMD Ryzen 9)
    • Processor Clock Speed: 3.5GHz or higher
    • Storage: 2TB NVMe SSD
  3. 1 million to 5 million rows:

    • RAM Requirement: 256GB RAM
    • Processor Core Requirement: 16-core (e.g., Intel Xeon or AMD Ryzen Threadripper)
    • Processor Clock Speed: 4.0GHz or higher
    • Storage: 4TB NVMe SSD
  4. 5 million to 10 million rows:

    • RAM Requirement: 512GB RAM
    • Processor Core Requirement: 18-core (e.g., AMD EPYC or Intel Xeon Scalable)
    • Processor Clock Speed: 4.5GHz or higher
    • Storage: 8TB NVMe SSD
  5. Above 10 million rows:

    • RAM Requirement: 1TB RAM
    • Processor Core Requirement: Multi-socket server-grade processors
    • Processor Clock Speed: 5.0GHz or higher
    • Storage: 16TB NVMe SSD in RAID configuration

These specifications are tailored to the demands of developing Big Data reports with heavy DAX code in tools like Power BI. The combination of high RAM, powerful multi-core processors, fast clock speeds, and ample storage capacity ensures that your machine can efficiently process complex data transformations and calculations, giving optimized performance when working with advanced DAX code.

3 REPLIES
PowerBI_LOVER
New Member

Hi Miguel Félix,

100 million rows with a computer with 32GB of RAM and an i7? Maybe if you are only viewing the dataset.

I have used 32GB on an i5 and it can cope with viewing a dataset.

 

I am talking about 600 lines of DAX - measures and calculated table code.

With 32GB, the refresh takes over 30 minutes.

 

So will my configuration above work, or are there alternatives to it for a large-scale dataset and multiple DAX codes?

Thanks

Hi @PowerBI_LOVER,

 

Even if you go for the service, the refresh of 100M rows will take longer than 30 minutes, especially if you are doing a lot of transformations and building tables using DAX.

 

If you are trying to load 100M rows into Desktop, as I mentioned previously, as a best practice you should not do that.

 

The DAX measures have no impact on refresh time, since they are only calculated at the moment you call them in your visuals.
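To make that concrete, here is a minimal DAX sketch (the Sales table and the column names are made-up examples, not from this thread): a measure only runs when a visual queries it, whereas a calculated table written in DAX is recomputed and stored on every refresh, which is why hundreds of lines of "table code" hurt refresh time while measures do not.

    // Measure: evaluated only at query time, so it adds nothing to refresh duration
    Total Sales = SUM ( Sales[Amount] )

    // Calculated table: materialised on every refresh; large amounts of this
    // kind of DAX are what push a refresh past 30 minutes
    Sales 2024 =
        FILTER ( Sales, YEAR ( Sales[OrderDate] ) = 2024 )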

 

Although I understand what you want to achieve with this type of configuration, a large semantic model should be set up with only part of the data in Desktop, and the full refresh pushed to the service.

 

On top of this, with such large semantic models you should also consider using incremental refresh or aggregation tables; this will not only reduce the refresh time but also improve performance when building your reports.
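As a rough sketch of the aggregation-table idea (the table, column, and relationship names below are assumptions for illustration, not part of this thread), the point is to pre-summarise the fact table at a coarser grain so most visuals read a much smaller table:

    // Aggregated version of the fact table at month / product-category grain;
    // visuals that can be answered at this grain scan far fewer rows
    Sales Agg =
        SUMMARIZECOLUMNS (
            'Date'[Year Month],
            'Product'[Category],
            "Total Amount", SUM ( Sales[Amount] ),
            "Order Count", COUNTROWS ( Sales )
        )

In practice you would normally build this summary in the data source or in Power Query, so the aggregation itself does not add DAX work to the refresh; incremental refresh is configured separately through the RangeStart/RangeEnd parameters and the table's refresh policy.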


Regards

Miguel Félix


Did I answer your question? Mark my post as a solution!

Proud to be a Super User!

Check out my blog: Power BI em Português



MFelix
Super User

Hi @PowerBI_LOVER ,

 

Not sure where you got this setup and row-count comparison, but I can tell you from experience that I have worked with models with up to 100 million rows on a computer with 32GB of RAM and an i7.

 

This is not only dependent on the computer's performance, but also on the way you build your model and the calculations you do.

 

Of course, if you are working with millions of rows the loading can take some time, but the DAX will not.

 

However, as a best practice, if you have models with 10M rows I would not load everything into Power BI; I would use some parameters to crop the data and then do the full refresh on the service.

 

For the DAX measures, depending on what you are doing, I also suggest using external tools (Tabular Editor) to avoid the waiting time when you press OK in the DAX formula bar.


Regards

Miguel Félix


Did I answer your question? Mark my post as a solution!

Proud to be a Super User!

Check out my blog: Power BI em Português


