I have tried this several different ways and always end up crashing my program. I am in need of guidance.
I have a Sales Performance table that lists total sales for the company. It is related to the Item table by the Code field. I would like to create a visual that shows which items (by Code) make up 80% of total sales. As all of my calculation attempts have resulted in a crash, I am not sure which to include here; I am 7 attempts in, and all of those measures and fields are still in the data set.
Any advice?
I used this as a foundation: http://dutchdatadude.com/power-bi-pro-tip-pareto-analysis-with-dax/
The measures I adapted are:
Total Amount = SUM(SalesPerformance[LineSales])
Amount All Products = CALCULATE([Total Amount], ALLNOBLANKROW('Item'[Code]), SalesPerformance[LineType] = "Item")
Item Rank = RANKX(ALLNOBLANKROW('Item'[Code]), [Total Amount])
Pareto%Item = DIVIDE([ParetoValueItem], [Amount All Products], BLANK())
ParetoValueItem = SUMX(TOPN([Item Rank], ALLNOBLANKROW('Item'[Code]), [Total Amount]), [Total Amount])
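A side note on the measures as adapted: the denominator ([Amount All Products]) filters to LineType = "Item", but the numerator ([ParetoValueItem]) does not, so the ratio can drift above 100%. A sketch of a symmetric version follows - column and table names are assumed from the post and not tested against the actual model:

```dax
-- Base measure restricted to item lines, so numerator and denominator agree
Total Item Amount =
CALCULATE ( SUM ( SalesPerformance[LineSales] ), SalesPerformance[LineType] = "Item" )

Amount All Products =
CALCULATE ( [Total Item Amount], ALLNOBLANKROW ( 'Item'[Code] ) )

Item Rank =
RANKX ( ALLNOBLANKROW ( 'Item'[Code] ), [Total Item Amount] )

-- Sum of the top-N items, where N is the current item's rank => a running total
ParetoValueItem =
SUMX (
    TOPN ( [Item Rank], ALLNOBLANKROW ( 'Item'[Code] ), [Total Item Amount] ),
    [Total Item Amount]
)

Pareto%Item =
DIVIDE ( [ParetoValueItem], [Amount All Products], BLANK () )
```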
Any advice, @Sean or @greggyb?
Proud to be a Super User!
@kcantor So I think we got it => follow the pictures and let me know if this works for you too...
Start by => clicking New Table => entering the formula => this will create the first 2 columns
=> then create the other 4 calculated COLUMNS => 5 and 6 are the same (just wanted to see if one would be faster than the other)
=> then the chart gets created instantaneously!!! @ThomasDay
and the Chart - use the Pareto % Item in the Visual Level Filters if you want...
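The screenshots aren't reproduced here, but going by the description (one New Table plus four calculated columns), the calculated-table approach might look roughly like this - a sketch with assumed names, since the actual formulas are only visible in the pictures:

```dax
-- "New Table" => one row per item code with its total sales (the first 2 columns)
ParetoTable =
ADDCOLUMNS (
    ALLNOBLANKROW ( 'Item'[Code] ),
    "Amount", CALCULATE ( SUM ( SalesPerformance[LineSales] ) )
)

-- Calculated columns on ParetoTable
Rank = RANKX ( ParetoTable, ParetoTable[Amount] )

Cumulative =
SUMX (
    FILTER ( ParetoTable, ParetoTable[Rank] <= EARLIER ( ParetoTable[Rank] ) ),
    ParetoTable[Amount]
)

Pareto % Item = DIVIDE ( ParetoTable[Cumulative], SUM ( ParetoTable[Amount] ) )
```

Because the ranking and running total are materialized at refresh time, the visual only has to filter and plot, which would explain why the chart renders instantly.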
Hoping you can help me too. As per the previous post, I have a dataset to be filtered by multiple fields on a table, and I need a Pareto that works off the total count of another value on the same table, not a sum. Firstly, am I looking at using COUNTX instead of the SUMX in your formula, or something more complex? And to enable the filters by the other values, would they have to be incorporated into this new table too?
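A note on the count-based variant (an assumption, since the actual tables aren't shown): swapping the sum for COUNTROWS/COUNTX is indeed the usual route. The other filter fields only need to be baked into a new table if you take the calculated-table approach, because a calculated table is fixed at refresh and will not respond to slicers; measures, by contrast, recompute under whatever filters are applied. Roughly:

```dax
-- Count of rows instead of a sum of sales
Total Count = COUNTROWS ( SalesPerformance )

Count All Products =
CALCULATE ( [Total Count], ALLNOBLANKROW ( 'Item'[Code] ) )

Item Count Rank =
RANKX ( ALLNOBLANKROW ( 'Item'[Code] ), [Total Count] )

ParetoCountItem =
SUMX (
    TOPN ( [Item Count Rank], ALLNOBLANKROW ( 'Item'[Code] ), [Total Count] ),
    [Total Count]
)

Pareto%Count = DIVIDE ( [ParetoCountItem], [Count All Products], BLANK () )
```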
Hello All,
Looking for advice. Task: to use Pareto, based on profit. I have one calculated column as following:
TotalProfit =
CALCULATE (
SUMX (
'Sales Stats',
'Sales Stats'[Quantity]
* ( 'Sales Stats'[Selling price] - RELATED ( StdCost[STD cost] ) )
)
)
Then I need to calculate Incremental profit, which I'm trying to perform with following:
Incremental Profit =
VAR CurrentProfit = 'Sales Stats'[TotalProfit]
RETURN
SUMX (
FILTER (
'Sales Stats',
'Sales Stats'[TotalProfit] >= CurrentProfit
),
'Sales Stats'[TotalProfit]
)
However, I'm receiving the following error: A circular dependency was detected: Sales Stats[Total Profit], Sales Stats[Incremental], Sales Stats[Total Profit].
I've tried to get rid of the VAR and replace it with EARLIER, but the error remains the same.
Any ideas how to solve? Thanks.
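One likely cause, assuming both TotalProfit and Incremental Profit are calculated columns on the same table: the CALCULATE wrapper in TotalProfit triggers a context transition, which makes that column depend on every column of 'Sales Stats' - including the new Incremental Profit column - hence the circular dependency. Since the intent is a per-row profit, the CALCULATE (and the SUMX over the whole table) is unnecessary; dropping it usually breaks the cycle. A sketch:

```dax
-- Plain row-context expression: no CALCULATE, so no context transition
TotalProfit =
'Sales Stats'[Quantity]
    * ( 'Sales Stats'[Selling price] - RELATED ( StdCost[STD cost] ) )

-- Running total of profit for all rows at or above the current row's profit
Incremental Profit =
VAR CurrentProfit = 'Sales Stats'[TotalProfit]
RETURN
    SUMX (
        FILTER ( ALL ( 'Sales Stats' ), 'Sales Stats'[TotalProfit] >= CurrentProfit ),
        'Sales Stats'[TotalProfit]
    )
```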
For a static analysis, this pattern is much easier: http://www.daxpatterns.com/abc-classification/
Imke Feldmann (The BIccountant)
If you liked my solution, please give it a thumbs up. And if I did answer your question, please mark this post as a solution. Thanks!
How to integrate M-code into your solution -- How to get your questions answered quickly -- How to provide sample data -- Check out more PBI- learning resources here -- Performance Tipps for M-queries
I can do a static analysis. A pareto has been requested by the PTB.
Just to make sure we don't misunderstand: Pareto analysis is a subgroup of ABC analysis (actually an AB analysis: https://en.wikipedia.org/wiki/ABC_analysis)
The pattern I've posted is the standard (static) one, just simpler than the source you've used.
Then there's also a dynamic one: http://www.daxpatterns.com/abc-classification-dynamic/
There the user can slice years or product groups or whatever to play around further.
You can also do it using M-code:
http://www.poweredsolutions.co/2015/06/15/abc-analysis-with-power-query/
@kcantor I'm gonna try this out... let's see if it crashes
Can you provide some sample data? Or at least give more of an idea what the tables look like?
Sales Performance: LineType, LineSales
Item Table: Code
Thanks!
Sorry it took so long to answer. Had not done Pareto Charts before.
However, I don't seem to experience any problems with this. I don't have blank rows but I don't think this would be an issue...
I'm running the March 2016 64-bit PBI on Win 10 Pro with 16GB RAM.
@thomasday I know you try to push the limits of PBI - have you had any problems creating Pareto Charts with your data sets?
also found this post (Pareto taking too long)
EDIT: Noticed one thing in your formulas...
You are applying the LineType filter only to the overall total => this should push your line over 100%
I don't know what your data table looks like, but it seems this would end up
filtering values only out of the denominator while those same values are still included in the numerator
Hey @Sean I've not really gotten into the visualization side of the world... I'm working my way to it. I have used visualizations for massive exports, so I do know they can get tangled, and when they did I always got a message like "too much data - may use sampling" or some such thing. At that point all bets were off - the result could be blanks or many identical rows.
That said, it only happened when one of the measures was actively ranking other measures in the visualization, which made things complicated: the formula had RANKX in it, like yours does, so it was doing that calculation for every data point in the column - i.e., running a calc for each one that involved every other data row. So THAT's an efficiency problem, and potentially a memory issue, especially when the visualization itself is data-heavy. Here there are RANKX, SUMX, and TOPN, all of which probably build internal work tables to do the calcs, and who knows when a visualization clears them out - so it's not like using a calculated column, a query, or some such thing.
Anyway, that's the best I can do here, but if your files are BIG, perhaps the lesson is: if there's a sort - or pre-ranking of some sort - that can happen first, then perhaps you can avoid the calc-heavy RANKX AND the SUMX part. This is consistent with @ImkeF's suggestions.
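In that spirit, one way to take RANKX out of the visual entirely is to materialize the rank and running total as calculated columns on the Item table, so they are computed once per data refresh rather than per visual render. A sketch, with column names assumed from the thread and assuming 'Item' sits on the one side of the relationship:

```dax
-- Calculated columns on 'Item', evaluated once at data refresh
Item Sales =
CALCULATE ( SUM ( SalesPerformance[LineSales] ) )

Item Rank Static =
RANKX ( 'Item', 'Item'[Item Sales] )

Cumulative Sales =
SUMX (
    FILTER ( 'Item', 'Item'[Item Rank Static] <= EARLIER ( 'Item'[Item Rank Static] ) ),
    'Item'[Item Sales]
)
```

The visual then just plots Cumulative Sales (or its share of the grand total) against the static rank, with no iterators running at render time.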
Tom
@kcantor, @ThomasDay and @ImkeF this was created using Chapter03.pbix from http://www.sqlbi.com/books/the-definitive-guide-to-dax/companion/
@kcantor I have no idea how big the file you are working on is - but these charts get generated instantaneously...
This Sales table has a little over 100,000 rows - you probably have a lot more... Are you using the 64-bit PBI, and what OS?
You could aggregate the data first but I just don't think you should have to - this is why we use PBI - Power
I have created a redacted file that I can email. @Sean @ThomasDay, and @ImkeF please let me know the best way to share this data with you and I will get it sent over.
This works at the manufacturer level but not at the item level. I am using 64-bit and it still maxes my computer out at 100% CPU and memory before crashing.
Amount All Items = CALCULATE([Total Amount], ALLNOBLANKROW('Item'[Code]), SalesPerformance[LineType] = "Item")
Amount All Manufacturers = CALCULATE([Total Amount], ALLNOBLANKROW('Item'[mfgName]), SalesPerformance[LineType] = "Item")
Item Rank = RANKX(ALLNOBLANKROW('Item'[Code]), [Total Amount])
Manufacturer Rank = RANKX(ALLNOBLANKROW('Item'[mfgName]), [Total Amount])
Pareto%Item = DIVIDE([ParetoValueItem], [Amount All Items], BLANK())
Pareto%Manufacturer = DIVIDE([ParetoValueManufacturer], [Amount All Manufacturers], BLANK())
ParetoValueItem = SUMX(TOPN([Item Rank], ALLNOBLANKROW('Item'[Code]), [Total Amount]), [Total Amount])
ParetoValueManufacturer = SUMX(TOPN([Manufacturer Rank], ALLNOBLANKROW('Item'[mfgName]), [Total Amount]), [Total Amount])
Total Amount = SUM(SalesPerformance[LineSales])
What I am looking for is the cumulative running total, as a percentage of total sales, that each item produces. According to what I have managed to get functional, the 80% cutoff would fall at around item rank 2600. The RANKX works for me; the Pareto % value does not. This will be used in paring down the current offerings.
@kcantor as much as I don't like aggregating the data - you'll have to summarize it somehow
Even a rank of 2600 seems like too many... the values just get so close together...
To calculate the cumulative % it has to iterate over the entire data set for each item - 130,371 rows!
The mfgCode chart gets generated instantly (then again there are only 219)
My computer ran for about 25 minutes trying to calculate the Pareto%Item and then I got an out-of-memory warning...
How are you gonna do this - have you decided?
I was looking at this post last night
http://www.sqlbi.com/articles/best-practices-using-summarize-and-addcolumns/
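For anyone landing here later, the gist of that article is to use SUMMARIZE only for grouping and to put the aggregations in ADDCOLUMNS, e.g. (table and column names taken from this thread):

```dax
-- Group with SUMMARIZE, aggregate with ADDCOLUMNS, per the linked best practice
ItemSummary =
ADDCOLUMNS (
    SUMMARIZE ( SalesPerformance, 'Item'[Code] ),
    "Amount", CALCULATE ( SUM ( SalesPerformance[LineSales] ) )
)
```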
Let us know how you resolve this!
I hate to admit it, but I feel better knowing it didn't work for you either. For now I am keeping it within PowerPivot and using the built-in aggregations by displaying a second copy of the total sales column in the pivot with the "Running Total in %" option. This allows the data to be manipulated, but it is still too large to display on a dashboard using the pin manager. While not a perfect solution, it lets the data work for us as we pare down our offerings to reduce that massive Item table to a more manageable size.
The link you shared is interesting, however, and I will spend more time with it as soon as I get a few minutes to spare.
Thank you, by the way, for all the help you offer to everyone here . . . including me.
Hi @Sean ,
I am using the same logic, but the summary table is not slicing down with any dimension in the main table. Is there anything I am missing here?
Can anyone help me with this. Adding the file you can download : https://we.tl/t-Rn8KOEY6cE
Thanks, this helped me a lot.
Performance is better.
Regards.
Hi @Renan-Arrieiro ,
I don't understand you. Please use the English language.