Hello everybody,
I need to calculate a Pareto analysis of my products, but I am unable to do so: Power BI Desktop runs out of memory. I've already tried the calculation several different ways, with and without variables, with calculated columns, and so on, but I keep failing.
Download pbix here
One of the DAX measures I tried is the one below, which does not work.
Paretto% =
VAR vSales = [Sales US$]
VAR vSalesTotal =
    CALCULATE ( [Sales US$], ALL ( 'dProducts' ) )
RETURN
    CALCULATE (
        [Sales US$],
        FILTER (
            ALL ( 'dProducts' ),
            [Sales US$] >= vSales
        )
    ) / vSalesTotal
I've already researched blogs and this forum, but no solution has helped me. Does anyone have a working DAX measure to calculate Pareto?
My data model is very simple: a products table related to a sales table. The products table has 200,000 rows and the sales table has 33 million rows.
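For context, the classic Pareto pattern ranks each product by sales and accumulates everything ranked at or above it. A minimal sketch of that pattern, assuming the [Sales US$] measure above and a hypothetical 'dProducts'[ID_PRODUCT] key column:

```dax
-- Sketch only: ID_PRODUCT is an assumed key column; adapt names to the model.
Paretto% (classic) =
VAR vRank =
    RANKX ( ALL ( 'dProducts'[ID_PRODUCT] ), [Sales US$], , DESC )
VAR vTotal =
    CALCULATE ( [Sales US$], ALL ( 'dProducts' ) )
VAR vRunning =
    -- Sum the sales of the top vRank products, i.e. everything
    -- selling at least as much as the current product.
    SUMX (
        TOPN ( vRank, ALL ( 'dProducts'[ID_PRODUCT] ), [Sales US$], DESC ),
        [Sales US$]
    )
RETURN
    DIVIDE ( vRunning, vTotal )
```

Note that, like the original measure, this evaluates [Sales US$] repeatedly over the 200,000-product table, so at this scale it may hit the same memory wall.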
Solved! Go to Solution.
Hi @Rai_BI ,
Please try the following measure, which has the best performance and takes about 90 seconds.
Paretto% - optimized2 =
VAR vIndex =
    ROWNUMBER (
        ALL ( 'dProducts'[NAME_PRODUCT], 'dProducts'[ID_PRODUCT] ),
        ORDERBY ( [Sales Amount], DESC )
    )
VAR vSales = [Sales Amount]
VAR vTempTable =
    ADDCOLUMNS (
        ALL ( 'dProducts'[NAME_PRODUCT], 'dProducts'[ID_PRODUCT] ),
        "Sales", [Sales Amount]
    )
VAR vSalesTotal =
    SUMX ( vTempTable, [Sales] )
RETURN
    SUMX (
        WINDOW ( 0, ABS, vIndex, ABS, vTempTable, ORDERBY ( [Sales], DESC ) ),
        [Sales]
    ) / vSalesTotal
The other optimised measure takes about ten minutes.
Starting from your original measure, which exceeded the available memory, the first optimisation brought it down to about 10 minutes and the second to about 90 seconds. I feel the limit of optimisation has been reached; if 90 seconds doesn't meet the requirement, there's nothing more I can do.
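One option not covered in this thread, if the Pareto share does not need to react to slicers on the fact table: precompute each product's total as a calculated column at refresh time, so the measure only scans the 200,000-row product table and never touches the 33-million-row fact table at query time. A sketch, assuming the [Sales US$] measure from the question and a hypothetical 'dProducts'[Product Sales] column:

```dax
-- Calculated column on dProducts (computed once, at data refresh).
-- Context transition makes [Sales US$] return this row's product total.
Product Sales = CALCULATE ( [Sales US$] )

-- Measure that reads only the precomputed column.
Paretto% (precomputed) =
VAR vSales = SELECTEDVALUE ( 'dProducts'[Product Sales] )
VAR vTotal =
    CALCULATE ( SUM ( 'dProducts'[Product Sales] ), ALL ( 'dProducts' ) )
VAR vRunning =
    CALCULATE (
        SUM ( 'dProducts'[Product Sales] ),
        ALL ( 'dProducts' ),
        'dProducts'[Product Sales] >= vSales
    )
RETURN
    DIVIDE ( vRunning, vTotal )
```

The trade-off: the column is fixed at refresh time, so slicers that filter the sales table will not change the Pareto result.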
Did I answer your question? If yes, please mark my post as a solution and give it Kudos!
Thank you~
Hello,
You can always use the custom visuals "Pareto by sio2Graphs" or "Pareto+" to quickly build Pareto charts without writing any measures or DAX. "Pareto+" also builds a legend and table along with the Pareto chart.
😀
(Screenshots: Pareto charts built from your data with sio2Graphs, rendering in seconds.)
Hi @Rai_BI ,
I've optimized the measure for you in two ways:
#1:
Paretto% =
VAR vSales = [Sales US$]
VAR vTempTable =
    ADDCOLUMNS ( ALL ( 'dProducts' ), "Sales", [Sales US$] )
VAR vSalesTotal =
    SUMX ( vTempTable, [Sales] )
RETURN
    SUMX ( FILTER ( vTempTable, [Sales] >= vSales ), [Sales] ) / vSalesTotal
#2: Since it is not clear what your evaluation context is, this method may not work:
Paretto% =
VAR vIndex =
    ROWNUMBER (
        ALLSELECTED ( 'dProducts'[Product Name] ),
        ORDERBY ( [Sales US$], DESC )
    )
VAR vSales = [Sales US$]
VAR vTempTable =
    CALCULATETABLE (
        ADDCOLUMNS ( VALUES ( 'dProducts'[Product Name] ), "Sales", [Sales US$] ),
        ALLSELECTED ()
    )
VAR vSalesTotal =
    SUMX ( vTempTable, [Sales] )
RETURN
    SUMX (
        WINDOW ( 0, ABS, vIndex, ABS, vTempTable, ORDERBY ( [Sales], DESC ) ),
        [Sales]
    ) / vSalesTotal
In addition to optimizing the measure, you can also increase the memory available to queries by changing the Query Limit Simulation option to Unlimited in Power BI Desktop's settings.
Did I answer your question? If yes, please mark my post as a solution and give it Kudos!
Thank you~
Thank you, but 10 minutes is too long. I need to find another way that loads in at most 10 seconds.
The best performance is about 90 seconds, not 10 minutes. That is the limit of optimization; if you don't find any other solution, please consider marking mine as the solution! Thanks.