I am facing a DAX performance issue on a Price / Volume / Mix analysis with fixed business rules. The model is based on a fact table with several million rows and a fine-grained level of approximately 87,000 items. Base measures (Revenue, Volume, YTD, Prior Year), including dynamic multi-currency conversion, perform reasonably well (~4–5 seconds). However, the Price, Volume, and Mix effect measures, which rely on SUMMARIZECOLUMNS at this granularity and evaluate multiple YTD measures inside a SUMX, take around 20 seconds per effect, making the user experience unacceptable. The results are correct and validated (sum of effects equals total variance), and the calculation logic cannot be changed. Standard optimizations have already been explored (extensive use of variables, no calculated columns or aggregated tables possible). I am therefore looking for DAX optimization patterns or modeling techniques that could significantly improve performance (target < 5 seconds per measure) in this type of high-volume, dynamic calculation scenario.
Base measures:
Revenue_Current =
VAR AggTable =
    SUMMARIZECOLUMNS(
        'Orders'[Period],
        'Orders'[Currency_Code],
        "LocalAmount", SUM('Fact_Transactions'[Amount_Local])
    )
VAR AggWithRate =
    ADDCOLUMNS(
        AggTable,
        "ExchangeRate",
        CALCULATE(
            [Dynamic_Rate],
            ALLEXCEPT('Orders', 'Orders'[Period], 'Orders'[Currency_Code])
        )
    )
RETURN
    SUMX(
        AggWithRate,
        DIVIDE([LocalAmount], [ExchangeRate])
    )
Revenue_PriorYear =
VAR AggTablePY =
    CALCULATETABLE(
        SUMMARIZECOLUMNS(
            'Orders'[Period],
            'Orders'[Currency_Code],
            "PYLocalAmount", SUM('Fact_Transactions'[Amount_Local])
        ),
        SAMEPERIODLASTYEAR('Dim_Calendar'[Date])
    )
VAR AggWithRate =
    ADDCOLUMNS(
        AggTablePY,
        "ExchangeRate",
        CALCULATE(
            [Dynamic_Rate],
            ALLEXCEPT('Orders', 'Orders'[Period], 'Orders'[Currency_Code])
        )
    )
RETURN
    SUMX(
        AggWithRate,
        DIVIDE([PYLocalAmount], [ExchangeRate])
    )
// Quantity
Volume_Units = SUM('Fact_Transactions'[Units])
Volume_Units_PY = CALCULATE([Volume_Units], SAMEPERIODLASTYEAR('Dim_Calendar'[Date]))
// YTD
YTD_Revenue_Current = TOTALYTD([Revenue_Current], 'Dim_Calendar'[Date])
YTD_Revenue_PriorYear = TOTALYTD([Revenue_PriorYear], 'Dim_Calendar'[Date])
YTD_Volume_Current = TOTALYTD([Volume_Units], 'Dim_Calendar'[Date])
YTD_Volume_PriorYear = TOTALYTD([Volume_Units_PY], 'Dim_Calendar'[Date])
Measures
Avg_Unit_Price_Current =
VAR _rev = [YTD_Revenue_Current]
VAR _vol = [YTD_Volume_Current]
RETURN
    IF(
        _vol <> 0 && NOT ISBLANK(_vol),
        DIVIDE(_rev, _vol, 0),
        BLANK()
    )
PROBLEMATIC MEASURES (Degraded performance: ~20 seconds)
Avg_Unit_Price_PriorYear =
VAR _rev = [YTD_Revenue_PriorYear]
VAR _vol = [YTD_Volume_PriorYear]
RETURN
    IF(
        _vol <> 0 && NOT ISBLANK(_vol),
        DIVIDE(_rev, _vol, 0),
        BLANK()
    )
// === Effect PRICE ===
Effect_Price =
VAR DataTable =
    SUMMARIZECOLUMNS(
        'Fact_Transactions'[Item_ID],
        "RevCurr", [YTD_Revenue_Current],
        "RevPrior", [YTD_Revenue_PriorYear],
        "VolCurr", [YTD_Volume_Current],
        "VolPrior", [YTD_Volume_PriorYear]
    )
RETURN
    SUMX(
        DataTable,
        VAR _priceCurr = DIVIDE([RevCurr], [VolCurr])
        VAR _pricePrior = DIVIDE([RevPrior], [VolPrior])
        VAR _isValid =
            NOT ISBLANK(_priceCurr) &&
            NOT ISBLANK(_pricePrior) &&
            [VolCurr] <> 0 &&
            [VolPrior] <> 0
        RETURN
            IF(
                _isValid,
                (_priceCurr - _pricePrior) * [VolPrior],
                BLANK()
            )
    )
Pct_Effect_Price =
VAR BasePY = [YTD_Revenue_PriorYear]
VAR PriceEffect = [Effect_Price]
RETURN DIVIDE(PriceEffect, BasePY, 0)
// === Effect VOLUME ===
Effect_Volume =
VAR DataTable =
    SUMMARIZECOLUMNS(
        'Fact_Transactions'[Item_ID],
        "RevCurr", [YTD_Revenue_Current],
        "RevPrior", [YTD_Revenue_PriorYear],
        "VolCurr", [YTD_Volume_Current],
        "VolPrior", [YTD_Volume_PriorYear]
    )
RETURN
    SUMX(
        DataTable,
        VAR _priceCurr = DIVIDE([RevCurr], [VolCurr])
        VAR _pricePrior = DIVIDE([RevPrior], [VolPrior])
        VAR _isValid = NOT ISBLANK(_pricePrior) && [VolPrior] <> 0 && NOT ISBLANK(_priceCurr)
        RETURN
            IF(
                _isValid,
                ([VolCurr] - [VolPrior]) * _pricePrior,
                [RevCurr] - [RevPrior] // Fallback when there is no valid price
            )
    )
Pct_Effect_Volume =
VAR BasePY = [YTD_Revenue_PriorYear]
VAR VolumeEffect = [Effect_Volume]
RETURN DIVIDE(VolumeEffect, BasePY, 0)
// === Effect MIX ===
Effect_Mix =
VAR DataTable =
    SUMMARIZECOLUMNS(
        'Fact_Transactions'[Item_ID],
        "RevCurr", [YTD_Revenue_Current],
        "RevPrior", [YTD_Revenue_PriorYear],
        "VolCurr", [YTD_Volume_Current],
        "VolPrior", [YTD_Volume_PriorYear]
    )
RETURN
    SUMX(
        DataTable,
        VAR _priceCurr = DIVIDE([RevCurr], [VolCurr])
        VAR _pricePrior = DIVIDE([RevPrior], [VolPrior])
        VAR _isValid =
            NOT ISBLANK(_priceCurr) &&
            NOT ISBLANK(_pricePrior) &&
            [VolCurr] <> 0 &&
            [VolPrior] <> 0
        RETURN
            IF(
                _isValid,
                (_priceCurr - _pricePrior) * ([VolCurr] - [VolPrior]),
                BLANK()
            )
    )
Pct_Effect_Mix =
VAR BasePY = [YTD_Revenue_PriorYear]
VAR MixEffect = [Effect_Mix]
RETURN DIVIDE(MixEffect, BasePY, 0)
Total_Variance =
VAR DataTable =
    SUMMARIZECOLUMNS(
        'Fact_Transactions'[Item_ID],
        "RevCurr", [YTD_Revenue_Current],
        "RevPrior", [YTD_Revenue_PriorYear]
    )
RETURN
    SUMX(
        DataTable,
        [RevCurr] - [RevPrior]
    )
Thank you!
Thank you for your proposal.
I took the time to test the measures you suggested and noticed that, initially, retrieving values directly from the fact table using VALUES did not return correct results in my case. This led me to use SUMMARIZECOLUMNS instead, in order to work at a level of aggregation consistent with the business requirement.
In addition, I observed that the proposed measures degrade performance even further compared to my initial implementation.
I would also like to point out that these measures are mainly used in matrix-type visualizations, combined with other dimension tables (for example countries, customers, etc.), and are therefore subject to various filters. In this context, performance remains acceptable as soon as a filter is applied: for instance, when one or more customers are selected, or when the date range is restricted, the response time is better.
So I don't know what to do.
Hi @akim_no ,
Thank you @krishnakanth240, @cengizhanarslan, @burakkaragoz for your inputs.
I just wanted to check if the issue has been resolved on your end, or if you require any further assistance. Please feel free to let us know; we're happy to help!
Thank you
Chaithra E.
Hi @akim_no
1. Materialize the calculation table with variables
Instead of calling SUMMARIZECOLUMNS three times (once per effect), create a single materialized table variable and reuse it. A combined base-data measure (shown here for the Price effect):
Effect_Price_FromBase =
VAR BaseData =
    CALCULATETABLE(
        SUMMARIZECOLUMNS(
            'Fact_Transactions'[Item_ID],
            "RevCurr", [YTD_Revenue_Current],
            "RevPrior", [YTD_Revenue_PriorYear],
            "VolCurr", [YTD_Volume_Current],
            "VolPrior", [YTD_Volume_PriorYear]
        ),
        ALLSELECTED() // or your specific filter context
    )
// Then reuse BaseData in all three effects
VAR PriceEffect =
    SUMX(
        BaseData,
        VAR _priceCurr = DIVIDE([RevCurr], [VolCurr])
        VAR _pricePrior = DIVIDE([RevPrior], [VolPrior])
        RETURN
            IF(
                NOT ISBLANK(_priceCurr) && NOT ISBLANK(_pricePrior),
                (_priceCurr - _pricePrior) * [VolPrior],
                BLANK()
            )
    )
RETURN
    PriceEffect
2. Pre-compute unit prices in the SUMMARIZECOLUMNS
Add pre-calculated unit prices to reduce repeated division operations:
Effect_Price_Optimized =
VAR DataTable =
    CALCULATETABLE(
        ADDCOLUMNS(
            SUMMARIZECOLUMNS(
                'Fact_Transactions'[Item_ID],
                "RevCurr", [YTD_Revenue_Current],
                "RevPrior", [YTD_Revenue_PriorYear],
                "VolCurr", [YTD_Volume_Current],
                "VolPrior", [YTD_Volume_PriorYear]
            ),
            "@PriceCurr", DIVIDE([RevCurr], [VolCurr]),
            "@PricePrior", DIVIDE([RevPrior], [VolPrior]),
            "@IsValid",
                NOT ISBLANK([RevCurr]) && NOT ISBLANK([VolCurr]) &&
                NOT ISBLANK([RevPrior]) && NOT ISBLANK([VolPrior]) &&
                [VolCurr] <> 0 && [VolPrior] <> 0
        )
    )
RETURN
    SUMX(
        DataTable,
        IF([@IsValid], ([@PriceCurr] - [@PricePrior]) * [VolPrior], BLANK())
    )
3. Use physical relationships instead of calculated rates
The currency conversion in your base measures is expensive. Consider:
- Adding an exchange-rate column directly to the fact table, if possible
- Using a bridge table between Orders and the exchange rates
- Pre-computing converted amounts in Power Query (sketched below)
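As an illustration of the third option, here is a minimal sketch assuming a hypothetical 'Fact_Transactions'[Amount_Converted] column has been pre-computed upstream (this column does not exist in the current model):
// Sketch only: assumes a hypothetical Amount_Converted column,
// pre-converted in Power Query or at the source during refresh
Revenue_Current_PreConverted = SUM('Fact_Transactions'[Amount_Converted])
With conversion done once at refresh time, the per-query rate lookup disappears entirely; the trade-off is that rates are no longer dynamic at query time.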
4. Optimize base measures with variables
Your base measures can be optimized:
Revenue_Current_Optimized =
VAR LocalAmounts =
    SUMMARIZE(
        'Fact_Transactions',
        'Orders'[Period],
        'Orders'[Currency_Code],
        "@LocalSum", SUM('Fact_Transactions'[Amount_Local])
    )
RETURN
    SUMX(
        LocalAmounts,
        DIVIDE(
            [@LocalSum],
            CALCULATE(
                [Dynamic_Rate],
                ALLEXCEPT('Orders', 'Orders'[Period], 'Orders'[Currency_Code])
            )
        )
    )
5. Consider using DAX Studio to analyze performance
Use DAX Studio (a sample query is shown below) to:
- Check storage engine vs formula engine time
- Identify bottleneck queries
- Analyze VertiPaq cache usage
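For example, you can isolate a single effect measure with a simple query, run with Server Timings enabled to split storage engine vs formula engine time:
// Run in DAX Studio with Server Timings enabled
EVALUATE
ROW("Effect_Price", [Effect_Price])
If most of the ~20 seconds is formula engine time, the iteration pattern is the bottleneck; if it is storage engine time, look at the model itself (cardinality, relationships, the currency conversion).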
Please mark this as the solution and give kudos if it helps you. Thank you!
I don't have example data to test with, but I believe that forcing materialization with ADDCOLUMNS over a base set of items, and then using variables that reference those columns inside the iterator, would be a faster option.
Effect_Price =
VAR Items =
    VALUES( 'Fact_Transactions'[Item_ID] )
VAR T =
    ADDCOLUMNS(
        Items,
        "__RevCurr", [YTD_Revenue_Current],
        "__RevPrior", [YTD_Revenue_PriorYear],
        "__VolCurr", [YTD_Volume_Current],
        "__VolPrior", [YTD_Volume_PriorYear]
    )
RETURN
    SUMX(
        T,
        VAR RevCurr = [__RevCurr]
        VAR RevPrior = [__RevPrior]
        VAR VolCurr = [__VolCurr]
        VAR VolPrior = [__VolPrior]
        VAR PriceCurr = DIVIDE( RevCurr, VolCurr )
        VAR PricePrior = DIVIDE( RevPrior, VolPrior )
        RETURN
            IF(
                VolCurr <> 0 && VolPrior <> 0
                    && NOT ISBLANK(PriceCurr)
                    && NOT ISBLANK(PricePrior),
                (PriceCurr - PricePrior) * VolPrior
            )
    )
If you can add a pre-aggregated table, do it. The idea is to reduce the Price / Volume grain from transactions to the lowest grain the business rules allow.
Typical aggregate table shape:
- Period (month / week / fiscal period)
- Item_ID
- Currency_Code (if conversion is dynamic)
- LocalAmount
- Units
Then your base measures read from the aggregate table (far fewer rows), as sketched below.
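A minimal sketch, assuming a hypothetical pre-aggregated table named 'Agg_PVM' with the shape above, and assuming [Dynamic_Rate] can be resolved from the aggregate table's Period and Currency_Code:
// Sketch only: 'Agg_PVM' is a hypothetical pre-aggregated table
Volume_Units_Agg = SUM('Agg_PVM'[Units])

Revenue_Current_Agg =
SUMX(
    SUMMARIZE('Agg_PVM', 'Agg_PVM'[Period], 'Agg_PVM'[Currency_Code]),
    VAR _rate = [Dynamic_Rate] // evaluated once per Period/Currency pair
    VAR _local = CALCULATE(SUM('Agg_PVM'[LocalAmount])) // context transition to the current pair
    RETURN
        DIVIDE(_local, _rate)
)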
If you can move work to the source, do it there once instead of recomputing it in DAX on every query. If rates must stay "dynamic", you can still push the aggregation upstream and keep only the final currency conversion in DAX.
Also watch column cardinality. Dates/timestamps stored down to the second or millisecond are a classic killer, and currency rates stored as high-precision decimals can also create lots of distinct values. Round at ingestion (Power Query / SQL) to a sensible precision (e.g., cents for money, 3 decimals for units) and use fixed decimal types where possible (the Currency type in Power BI is fixed at 4 decimals).
Hi @akim_no ,
You are facing a classic "Nested Iterator" performance bottleneck. Calculating PVM (Price-Volume-Mix) at a granularity of 87,000 items involves millions of context transitions when you account for the currency conversion logic inside your base measures.
The primary issue is the use of SUMMARIZECOLUMNS inside a measure that is being iterated 87,000 times. SUMMARIZECOLUMNS is optimized for top-level queries (DAX Queries) and often forces the engine into inefficient query plans when used inside row-level iterations (Context Transition). It also tends to ignore external filter contexts unless wrapped carefully, leading to unnecessary full-table scans.
Here is the optimization strategy to bring this under 5 seconds.
We need to make two critical changes:
Switch to iterating VALUES(...): this pattern is context-aware and significantly faster for iterating over dimension attributes within a measure.
Hoist the Date Logic: Your current code executes TOTALYTD (which generates a date filter) 87,000 times. We will generate the YTD date filter once in a variable and inject it into the loop.
Here is how you should rewrite the Effect_Price measure; the same pattern applies to Volume and Mix (a Volume sketch follows the explanation below).
Effect_Price_Optimized =
-- 1. Hoist the date filter OUTSIDE the loop.
-- This prevents the engine from calculating the YTD range 87,000 times.
VAR _YTD_Dates = DATESYTD('Dim_Calendar'[Date])
VAR _YTD_Dates_PY = SAMEPERIODLASTYEAR('Dim_Calendar'[Date])
-- 2. Use VALUES instead of SUMMARIZECOLUMNS for the iteration grain.
-- This respects the current filter context much more efficiently.
VAR _Items = VALUES('Fact_Transactions'[Item_ID])
RETURN
    SUMX(
        _Items,
        -- 3. Calculate base metrics in the current Item context.
        -- We inject the hoisted date filters using CALCULATE.
        VAR _volCurr = CALCULATE([Volume_Units], _YTD_Dates)
        VAR _volPrior = CALCULATE([Volume_Units], _YTD_Dates_PY)
        -- Optimization check: only proceed if there is valid volume,
        -- to avoid the expensive Revenue calculation on empty rows.
        RETURN
            IF(
                _volCurr <> 0 && _volPrior <> 0,
                VAR _revCurr = CALCULATE([Revenue_Current], _YTD_Dates)
                VAR _revPrior = CALCULATE([Revenue_PriorYear], _YTD_Dates) -- assuming this measure only needs the current date range
                VAR _priceCurr = DIVIDE(_revCurr, _volCurr)
                VAR _pricePrior = DIVIDE(_revPrior, _volPrior)
                RETURN
                    IF(
                        NOT ISBLANK(_priceCurr) && NOT ISBLANK(_pricePrior),
                        (_priceCurr - _pricePrior) * _volPrior
                    )
            )
    )

Eliminating SUMMARIZECOLUMNS overhead: when SUMMARIZECOLUMNS is used inside a SUMX iterator, the formula engine often cannot push the operation down to the storage engine (VertiPaq) efficiently. By using VALUES('Fact_Transactions'[Item_ID]), you trigger a simple scan of the column dictionary, which is practically instant; the iterator then requests the specific metric values for those items in a batch.
Context Freezing: By defining VAR _YTD_Dates, you freeze the list of dates for the YTD period into memory once. When you pass this variable into CALCULATE inside the loop, the engine doesn't have to re-evaluate DATESYTD for every single item.
Short-Circuiting: The IF(_volCurr <> 0 ...) check inside the loop prevents the engine from spending resources calculating Revenue (and its expensive currency conversion) for items that have no volume in the current or prior period, which often drastically reduces the workload in sparse datasets.
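For completeness, here is a sketch of the same pattern applied to the Volume effect, keeping the fallback logic from your original measure (same assumptions as the Effect_Price rewrite above):
Effect_Volume_Optimized =
VAR _YTD_Dates = DATESYTD('Dim_Calendar'[Date])
VAR _YTD_Dates_PY = SAMEPERIODLASTYEAR('Dim_Calendar'[Date])
VAR _Items = VALUES('Fact_Transactions'[Item_ID])
RETURN
    SUMX(
        _Items,
        VAR _volCurr = CALCULATE([Volume_Units], _YTD_Dates)
        VAR _volPrior = CALCULATE([Volume_Units], _YTD_Dates_PY)
        VAR _revCurr = CALCULATE([Revenue_Current], _YTD_Dates)
        VAR _revPrior = CALCULATE([Revenue_PriorYear], _YTD_Dates)
        VAR _priceCurr = DIVIDE(_revCurr, _volCurr)
        VAR _pricePrior = DIVIDE(_revPrior, _volPrior)
        RETURN
            IF(
                _volPrior <> 0 && NOT ISBLANK(_pricePrior) && NOT ISBLANK(_priceCurr),
                (_volCurr - _volPrior) * _pricePrior,
                _revCurr - _revPrior -- fallback when there is no valid price
            )
    )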
If the above is still not fast enough, your bottleneck is the Revenue_Current measure itself. Since it iterates Orders to convert currency, doing this 87,000 times is heavy.
Recommendation: Ensure your Revenue_Current measure also avoids SUMMARIZECOLUMNS. Refactor it to:
Revenue_Current =
SUMX(
    SUMMARIZE(
        'Orders',
        'Orders'[Period],
        'Orders'[Currency_Code]
    ),
    VAR _Rate = [Dynamic_Rate] -- calculated once per Period/Currency
    VAR _LocalAmount = CALCULATE(SUM('Fact_Transactions'[Amount_Local]))
    RETURN
        DIVIDE(_LocalAmount, _Rate)
)

Note: SUMMARIZE works perfectly fine here for grouping existing columns. Avoid SUMMARIZE for adding calculated columns (clustering), but for pure grouping it is performant.
Applying the Values + Hoisted Filters pattern to your Price, Volume, and Mix measures should bring your calculation time down from ~20s to the <5s range.
If my response resolved your query, kindly mark it as the Accepted Solution to assist others. Additionally, I would be grateful for a 'Kudos' if you found my response helpful.
This response was assisted by AI for translation and formatting purposes.