Hi,
Please help with the correct DAX to achieve the below:
I want to identify high-risk customers by counting the number of times the cumulative arrears total for each customer increases over consecutive months in the "over 120 days" category.
Each month I receive a dataset of customers with their amounts in arrears. The arrear amounts are grouped into one of four progressive groups, including an "over 120 days" category.
If the arrears total in the “over 120 days” keeps on increasing month after month, it means the customer is not paying and should be monitored more closely.
Question: Please help with the DAX to count, for each customer, the number of times the cumulative total in the "over 120 days" arrears category increases over consecutive months, where each consecutive increase counts as 1. (For example, monthly totals of 1,000; 1,200; 1,200; 1,500; 1,400; 1,600 would give a count of 3.)
The hope is to create a table ranking the customers by count per year, sorted from highest count to lowest. See the ideal outcome below, as per the attached pbix file with the data.
For John, the calculation logic is as follows:
For Peter, the calculation logic is as follows:
Here is the link to the PBIX file with the example data as copied below. It has a date table.
Thank you very much!
W
Thank you very much, @SachinNandanwar, for your continued support and help trying to solve my problem. It is really amazing that you would spend your time to help me. I really appreciate it!
Your solution works very well and is light on computing resources. It also offers additional options to summarize further and count overall totals.
In exchange for computing elegance, it does lose its link to the main table and the additional possible drill-downs there. I must still figure out how to do some of these.
But overall, a brilliant solution that offers valuable benefits.
Thank you again for your time and help.
Best wishes,
W
PS: Just for the record, below are the DAX and steps in case they can help someone else:
==================================Step 1
Tbl_Union =
Thank you very much, @SachinNandanwar! I can't wait to try this tomorrow once back in office. Will give you feedback as soon as possible, but I have a good feeling!
Hi @Water
I guess it would be much easier to create a calculated column that tracks the changes. Attached is a sample file with the proposed solution.
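For anyone reading along without the attached file, the idea is roughly a calculated column that looks up the same customer's previous month and flags an increase. A minimal sketch only (the table name Arrears and the column names [Customer Name], [Date] and [Over 120 Days] are assumptions, substitute your own):

Over 120 Days Increase =
VAR _customer = Arrears[Customer Name]
VAR _date = Arrears[Date]
VAR _previousDate =
    MAXX (
        FILTER (
            Arrears,
            Arrears[Customer Name] = _customer
                && Arrears[Date] < _date
        ),
        Arrears[Date]
    )
VAR _previousValue =
    SUMX (
        FILTER (
            Arrears,
            Arrears[Customer Name] = _customer
                && Arrears[Date] = _previousDate
        ),
        Arrears[Over 120 Days]
    )
RETURN
    -- 1 when this month's total is higher than last month's (a missing previous month is treated as 0)
    IF ( Arrears[Over 120 Days] > COALESCE ( _previousValue, 0 ), 1, 0 )

A measure such as SUM ( Arrears[Over 120 Days Increase] ) then gives the count per customer and per year in a visual.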
Thank you very much, @tamerj1! Your solution works very well.
Pros: Since it is part of the main data table, it allows additional drilldowns, slicers, and analysis without any problems.
Cons: It gave the 'exceeded available resources' error a couple of times. I found that if I remove the final measure with the counts from my tables before a refresh, and then add it back afterwards, it worked. Also, if I download a copy, refresh, and then re-upload, I could work around it. Not ideal, and I am not sure auto-refreshes on Power BI Online will work. Must still test that.
But overall, a great alternative that offers valuable benefits.
Thank you again for your time and help.
Best wishes,
W
@Water
It is difficult to tell what could be the reason without testing the measure on the real data model. However, SUMMARIZE sometimes has to cluster the data in order to perform the aggregation, which results in a large number of storage engine queries. I would suggest trying the following:
Count Over 120 Days Increase =
SUMX (
    ADDCOLUMNS (
        SUMMARIZE (
            Japie,
            Japie[Date],
            Japie[Customer Name]
        ),
        "@Count", CALCULATE ( MAX ( Japie[Over 120 Days Increase] ) )
    ),
    [@Count]
)
Thank you very much, @tamerj1! I will test your solution tomorrow and give you feedback asap. Best wishes.
Why should Sue have a count of 1? She doesn't have a series of continuous arrears?
Good question! Initially I did not give Sue any count, but then I wondered about the initial theoretical increase from August 2023 (a theoretical $0) to the $1000 in September 2023. For all customers there will be an increase (count) from nothing to something at some stage if they start to accrue in the "Over 120 Days" category. However, it doesn't really matter whether that initial count is included or not. The focus will be on the top/highest counts, and therefore one additional count per customer will not matter. So if you can suggest a solution where Sue doesn't get a count, it will still be good! Thanks!
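And if the first appearance should not be counted after all, then in a calculated column like the sketch earlier in the thread (same assumed names) the final test only needs to insist that a previous-month value actually exists:

RETURN
    -- only count an increase when the customer already had an "over 120 days" value the month before
    IF (
        NOT ISBLANK ( _previousValue )
            && Arrears[Over 120 Days] > _previousValue,
        1,
        0
    )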
I admire your brilliance, @SachinNandanwar! Thank you so much again for your time and help.
There is a situation where your solution doesn't work, and only because I haven't included it in my example dataset. However, I believe this will be an easy fix for you. Please help me?
Quite a substantial number of customers have more than one entry per month, under various Rate Codes or Contract numbers, and so more than one amount in the "Over 120 Days" category. These amounts should be summed each month into one overall monthly total, and the monthly totals should then be used in the calculation logic.
If I use your current solution as is, it compares the various smaller amounts each month, meaning I get an incorrectly inflated count.
See e.g. Japie in 2023. The below should be summed each time for one total for each month.
Here are the correct 2023 totals per month for Japie.
These totals should be compared. (Unfortunately in my initial example dataset I have included only one total per customer per month, and not this situation where various amounts had to be summed for one monthly total.)
Is it possible to please adjust your DAX to accomplish this, i.e. use the monthly summed totals to do the comparison and count?
I attach here the pbix file with the new dataset and your current solution applied. (The raw Excel data is here.) I have added a YearMonth field in the table, which you might need to get a monthly sum(?).
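To make the intention concrete, the comparison should run on one summed row per customer per month, roughly something like the following hypothetical summary (the table name is assumed and should be substituted; the YearMonth field is the one just mentioned):

Tbl_MonthlyTotals =
ADDCOLUMNS (
    SUMMARIZE ( Arrears, Arrears[Customer Name], Arrears[YearMonth] ),
    "Over 120 Days Total", CALCULATE ( SUM ( Arrears[Over 120 Days] ) )
)

The month-over-month comparison and count would then work on these summed totals instead of the individual Rate Code / Contract rows.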
I left the current incorrect output in the file:
The correct output should be (which I hope would be possible with a small tweak to your current DAX as included in this file):
Thank you so much again for your kind help!
W
Sorry for the delay in responding; we must be in quite different time zones. It is now early morning here.
I had to look again myself, but it is because there is no increase for Japie between August 2023 and September 2023.
Best regards,
W
Can you please let me know how the performance of the solution is, given that your dataset is in the millions? If the performance isn't satisfactory then there is no point in continuing with the approach.
Sorry again for the delayed response. Timezone differences.
Sad news indeed. "There's not enough memory to complete this operation. Please try again later when there may be more memory available."
I get this with the Tbl_Union first step already without having created the Final_Table at all.
Clever of you to check this.
I don't understand why I would get this error now, while your other solution works well on the same dataset. Isn't the logic the same, where you compare two things and get a count?
Do you think there is a way to break the steps down into separate phases to limit the amount of calculation done at once? Could one e.g. create an initial and separate smaller summary table with only the monthly totals, and then apply the comparison calculations in a separate measure?
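Concretely, the kind of split I am imagining (hypothetical names, starting from a small monthly summary table like the Tbl_MonthlyTotals sketch earlier in the thread): the increase flag becomes a calculated column on that much smaller table, and the visual then only needs a plain SUM.

Increase Flag =
VAR _customer = Tbl_MonthlyTotals[Customer Name]
VAR _month = Tbl_MonthlyTotals[YearMonth]
VAR _previousMonth =
    MAXX (
        FILTER (
            Tbl_MonthlyTotals,
            Tbl_MonthlyTotals[Customer Name] = _customer
                && Tbl_MonthlyTotals[YearMonth] < _month
        ),
        Tbl_MonthlyTotals[YearMonth]
    )
VAR _previousValue =
    MAXX (
        FILTER (
            Tbl_MonthlyTotals,
            Tbl_MonthlyTotals[Customer Name] = _customer
                && Tbl_MonthlyTotals[YearMonth] = _previousMonth
        ),
        Tbl_MonthlyTotals[Over 120 Days Total]
    )
RETURN
    -- 1 when the summed monthly total increased compared with the previous month
    IF ( Tbl_MonthlyTotals[Over 120 Days Total] > COALESCE ( _previousValue, 0 ), 1, 0 )

Count of Increases = SUM ( Tbl_MonthlyTotals[Increase Flag] )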
Thanks and best regards,
W
I am based in India hence the time zone difference.
Data for Japie is missing for Dec 2023. How should this be handled?
Also, when there is no difference in values across consecutive months, should that be taken into account for the final count?
According to your screenshot, for SANNIE the count for year 2021 is 5. Can you elaborate on how that is calculated? I think the count should be 4 and not 5.
This is the final output that I am getting.
Yes, I am in western Canada. So midday here and you are probably having a good night's rest right now. 🙂
Question: Data for Japie is missing for Dec 2023. How should this be handled?
Answer: I have already requested the missing data so that my dataset can be complete, so this issue should disappear. Your current output seems to overcome this in any case, giving the output as I think it should be.
Question: Also, when there is no difference in values across consecutive months, should that be taken into account for the final count?
Answer: Your current output seems to address this correctly, giving the output as I think it should be. It "ignores" values where there is no difference over consecutive months and only counts where there is an increase between any two consecutive months.
Question: According to your screenshot, for SANNIE the count for year 2021 is 5. Can you elaborate on how that is calculated? I think the count should be 4 and not 5.
Answer: You are correct. December 2020 is more than January 2021, therefore the first increase to count is Feb to March 2021, building further to a total of 4 for 2021. I am sorry for this confusion. I based my initial count on a dataset excluding 2020, which I later added to the dataset I provided you with, and counted it as below. I forgot to recheck after adding the 2020 data. Comes from working too late at night! Thank you for verifying.
Comment: This is the final output that I am getting.
Answer: This looks perfect! May I ask if you used a different method to calculate this?
With thanks and my best regards,
W
Hi,
Please check the below picture and the attached pbix file.
I tried to modify the data model a little bit, and I added a new calendar table and a customer dimension table.
OFFSET function (DAX) - DAX | Microsoft Learn
WINDOW function (DAX) - DAX | Microsoft Learn
Cumulate count by condition: =
-- One row per customer and year-month over the selected period
VAR _t =
    CALCULATETABLE (
        SUMMARIZE (
            Arrears,
            Customer_dimension[Customer Name],
            Calendar_dimension[Year],
            Calendar_dimension[Year-Month],
            Calendar_dimension[Year-Month sort]
        ),
        ALLSELECTED (
            Calendar_dimension[Year-Month],
            Calendar_dimension[Year-Month sort]
        )
    )
-- Add the current month's "over 120 days" total
VAR _over120days =
    ADDCOLUMNS ( _t, "@over120days", [Over 120 days:] )
-- Add the previous month's total for the same customer, via OFFSET
VAR _previousmonth =
    ADDCOLUMNS (
        _over120days,
        "@previousmonth",
        CALCULATE (
            [Over 120 days:],
            OFFSET (
                -1,
                _over120days,
                ORDERBY ( Calendar_dimension[Year-Month sort], ASC ),
                ,
                PARTITIONBY ( Customer_dimension[Customer Name] )
            )
        )
    )
-- Flag the months where the total increased
VAR _condition =
    ADDCOLUMNS (
        _previousmonth,
        "@condition", IF ( [@over120days] > [@previousmonth], 1 )
    )
RETURN
    -- Running count of flagged increases up to the current month
    SUMX (
        WINDOW (
            0, ABS,
            0, REL,
            _condition,
            ORDERBY ( Calendar_dimension[Year-Month sort], ASC )
        ),
        [@condition]
    )
If this post helps, then please consider accepting it as the solution to help other members find it faster, and give a big thumbs up.
Thank you again, @Jihwan_Kim! I appreciate your expert help.
I have tried your ingenious solution on my real dataset, and unfortunately keep running into the below error:
It works perfectly on the smaller datasets I used to verify the accuracy of your solution, but the moment I try it on the whole dataset (6 million rows) to see year-over-year risks per person (what we hope to do), it gets stuck on this available-resources error, even if I set the "Query limit simulations" option to "No query limits".
Do you think there is a way to break the steps down into separate phases to limit the amount of final calculation at the end? Could one e.g. create an initial and separate summary table, and then apply the final calculations in a separate measure? If so, or maybe with some other ingenious method, could you please guide me again on how to do that? My own DAX experience is just too limited to allow me a way forward.
With thanks and my best regards,
W
Thank you very much, Jihwan_Kim! I really appreciate your help and the time spent on this. I will study your solution and test it on my real dataset. I will definitely accept it as the solution if all works well. Please give me a couple of days.