Hi All,
For large datasets in a semantic model, are there any settings we can optimize under Options and settings in Power BI Desktop?
Based on what parameters should we set the values below under Parallel loading of tables?
Maximum number of simultaneous evaluations
Maximum memory used per simultaneous evaluation (MB)
If anyone can briefly explain with an example, I would appreciate it.
Thanks,
Sri
Answer:
Power BI Desktop – Parallel Loading Settings
When you load large datasets in Power BI Desktop, you can control how many table queries run at the same time and how much memory each query can use. This can speed up refreshes and prevent your computer or database from being overloaded.
1. Maximum number of simultaneous evaluations: fewer evaluations mean a slower, but safer, refresh.
2. Maximum memory per simultaneous evaluation (MB): caps the RAM each evaluation can use.
Hi @Koritala
For large datasets in Power BI Desktop, optimizing the Data Load settings can significantly improve refresh stability and performance. However, these settings are primarily about how Power BI Desktop utilizes your machine's resources (CPU and RAM) during the ingestion phase.
Here is a detailed breakdown of the settings and the best practices for handling large semantic models.
1. Global Options: Data Load
In Power BI Desktop, go to File > Options and settings > Options > Global > Data Load.
Maximum number of simultaneous evaluations
This determines how many Power Query queries are processed in parallel.
The Parameter: By default, this matches the number of logical CPU cores on your machine.
When to increase: If you have a high-end CPU (e.g., 16+ cores) and your data source (like a robust SQL Server) can handle many concurrent connections, increasing this can speed up the refresh.
When to decrease: If your machine freezes or "chokes" during refresh, or if you receive a "Data Source Error" caused by too many concurrent connections to an API or a limited database.
Maximum memory used per simultaneous evaluation (MB)
This sets a RAM ceiling for each individual query container.
The Parameter: The default is 432 MB.
When to increase: If you encounter "Memory Limit Exceeded" errors or if you are performing heavy transformations (Nested Joins, Group By on millions of rows, or complex Unpivots).
Example: If you have 32GB of RAM and you are processing a 10-million-row table with complex merges, increasing this to 1024 MB or 2048 MB can prevent crashes.
Caution: Ensure that (Simultaneous Evaluations × Memory per Evaluation) does not exceed 90% of your available RAM.
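The caution above is simple arithmetic, and it can help to sanity-check your settings before a long refresh. Below is a minimal Python sketch of that check; the function name and the 90% headroom factor are illustrative, not part of any Power BI API.

```python
# Hypothetical sanity check for the caution above: the product of
# simultaneous evaluations and per-evaluation memory should stay
# under ~90% of available RAM. Names and numbers are illustrative.

def max_safe_memory_per_eval(ram_gb: float, simultaneous_evals: int,
                             headroom: float = 0.90) -> int:
    """Largest per-evaluation memory limit (MB) such that
    simultaneous_evals * limit stays within `headroom` of total RAM."""
    total_mb = ram_gb * 1024
    return int(total_mb * headroom / simultaneous_evals)

# Example from the text: 32 GB of RAM with 8 simultaneous evaluations.
limit = max_safe_memory_per_eval(32, 8)
print(limit)  # 3686 -- so raising the default 432 MB to 2048 MB is safe here
```

With a 3686 MB ceiling per evaluation, the suggested 1024 MB or 2048 MB values fit comfortably on a 32 GB machine at 8 parallel evaluations.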
2. Current File: Data Load
Go to Options > Current File > Data Load.
Parallel loading of tables:
Default: Power BI decides based on your CPU.
Custom: You can manually set this (up to 30). This is useful if your model has many small/medium tables and you want them to load all at once rather than waiting in a queue.
3. Critical Alternatives for Large Datasets
While the settings above help with the "loading" process, they won't fix a model that is fundamentally too large for the engine. For large-scale BI, consider these strategies:
A. Import Mode vs. DirectQuery
Import Mode (Recommended): Uses the VertiPaq engine to compress data. It is significantly faster for DAX calculations and user interactivity.
DirectQuery: Only use this if you need Real-Time data or if the dataset is far too large to fit into Power BI Capacity RAM (e.g., Petabytes). DirectQuery often results in slower report performance because every visual triggers a SQL query.
B. Query Folding (The "Golden Rule")
Ensure that your Power Query steps are being "folded" back to the source. If you see "View Native Query" grayed out after a complex step, Power BI is doing the heavy lifting in its own memory instead of letting the SQL Server do it. This is the #1 cause of slow refreshes.
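The difference between a folded and an unfolded query can be illustrated in miniature. The sketch below uses Python's built-in sqlite3 as a stand-in for a real SQL source; the table and data are made up for the demo, and Power Query itself is not involved — this only shows why pushing the filter to the source matters.

```python
# Conceptual illustration of query folding, using sqlite3 as a toy
# stand-in for a SQL Server source. Table and rows are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INT)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("West", 100), ("East", 200), ("West", 300)])

# "Folded": the filter is part of the native query, so the source
# returns only the rows we actually need.
folded = conn.execute(
    "SELECT region, amount FROM sales WHERE region = ?", ("West",)).fetchall()

# "Not folded": every row is pulled back and filtered client-side --
# the pattern that makes large refreshes slow.
all_rows = conn.execute("SELECT region, amount FROM sales").fetchall()
unfolded = [row for row in all_rows if row[0] == "West"]

print(folded == unfolded)  # True -- same result, very different cost at scale
```

Both paths return identical results; at millions of rows, only the folded path avoids dragging the whole table across the wire into Power Query's memory.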
C. Incremental Refresh
Instead of reloading the entire dataset every time, set up Incremental Refresh. This allows you to refresh only the latest data (e.g., the last 3 days) while keeping years of historical data static in the service.
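The partitioning idea behind Incremental Refresh can be sketched in a few lines. This is not Power BI's implementation — just a hedged illustration of "refresh only the rolling window"; the function name, window size, and dates are all invented for the example.

```python
# Sketch of the incremental-refresh idea: data is partitioned by date,
# and only partitions inside the rolling window are re-loaded.
from datetime import date, timedelta

def partitions_to_refresh(partition_dates, today, window_days=3):
    """Return only the daily partitions inside the rolling refresh window."""
    cutoff = today - timedelta(days=window_days)
    return [d for d in partition_dates if d > cutoff]

# A year of daily partitions; only the trailing window gets refreshed.
history = [date(2024, 1, 1) + timedelta(days=i) for i in range(365)]
today = date(2024, 12, 30)
to_refresh = partitions_to_refresh(history, today)
print(len(to_refresh))  # 3 -- the rest of the year stays static in the service
```

In the real feature, the service manages the partitions for you once you define RangeStart/RangeEnd parameters and a refresh policy; the sketch only shows why the refresh cost stops growing with history size.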
D. Aggregations (Composite Models)
Keep the granular, "atomic" data in DirectQuery and create an Imported Aggregation Table for common high-level visuals (e.g., Sales by Year/Month). Power BI will automatically route the query to the fast Imported table whenever possible.
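The routing behavior described above can be illustrated with a toy sketch: if a query's grain is covered by the imported aggregate table, answer from it; otherwise fall back to the detailed DirectQuery table. All names and data below are made up — this is the concept, not Power BI's engine.

```python
# Toy illustration of aggregation-aware routing in a composite model.
# Columns and data are invented for the example.
agg_columns = {"year", "month"}                 # grain of the imported agg table
detail_columns = {"year", "month", "day", "customer", "amount"}

def route_query(requested_columns):
    """Pick the cheapest table that can satisfy the requested grain."""
    if set(requested_columns) <= agg_columns:
        return "aggregate"   # fast: answered from the imported table
    return "detail"          # fallback: DirectQuery against granular data

print(route_query({"year", "month"}))         # aggregate
print(route_query({"year", "month", "day"}))  # detail
```

A "Sales by Year/Month" visual hits the fast imported table, while a drill-down to daily or customer level transparently falls back to the source.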
If this post helps, please consider Accepting it as the solution to help other members find it more quickly.