Hello,
I am about to go insane. I need help regarding ABC clustering of my customers. I have a classic star schema where my transactions are connected to the customer dimension via Customer_KEY. Important side note: I can't create calculated columns, since Power BI can't handle that much data; it gives me errors saying my RAM is insufficient.
I want to analyse what percentage of the total volume is contributed by my top 10 customers, and cluster the customers into A, B and C.
I really can't find the mistake in my DAX formulas. I have worked with different LLMs, but I still don't get a good solution.
I can tell you what I have done up to now. These are the names I am working with:
Dim_Cust - Customer Dimension
Dim_Cust[Key] - Primary Key
Fact_TRX - Fact Table for Transactions
Fact_TRX[TRX_Vol] - Transaction Volume
Step 1: Calculate the Total Volume
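(A minimal sketch of this step, assuming the measure is just the plain sum of the fact column, as quoted in the replies below:)

Total_Volume =
// grand total of transaction volume in the current filter context
SUM(Fact_TRX[TRX_Vol])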
Hi @Theo_Alex ,
Your current approach is solid, but there’s a sneaky issue in how you calculate cumulative transaction volume. Specifically, the problem is that the FILTER function you’re using to accumulate values is pulling in more customers than it should, which messes up the cumulative share. The result? You get cumulative shares exceeding 100%, which is frustrating as hell when you're trying to cluster your customers logically.
Let’s fix it step by step. First, your Total_Volume measure looks fine:
Total_Volume = SUM(Fact_TRX[TRX_Vol])
You’re summing up the transaction volume across all records, which is exactly what you need to benchmark the cumulative share.
Your Volume_Per_Customer measure is also solid. It calculates the total transaction volume per customer by removing all other filters except for the Dim_Cust[Key], which ensures you’re looking at each customer’s aggregated volume:
Volume_Per_Customer =
CALCULATE(
    SUM(Fact_TRX[TRX_Vol]),
    ALLEXCEPT(Dim_Cust, Dim_Cust[Key])
)
Where things start to go wrong is in your cumulative volume calculation. The issue comes from how the FILTER function is applied. You were using MAXX(ALL(Dim_Cust, Dim_Cust[Key])...) to compare ranks, but that’s introducing an incorrect filter context. Instead, you need to ensure that your cumulative calculation only considers customers with a rank up to the current customer. Here's the corrected measure:
Cumulative_Volume =
CALCULATE(
    SUM(Fact_TRX[TRX_Vol]),
    FILTER(
        ALL(Dim_Cust),
        // assumes a Customer_Rank column (or measure) ranking customers by volume
        [Customer_Rank] <= MAX(Dim_Cust[Customer_Rank])
    )
)
Notice how I’m using ALL(Dim_Cust) inside the FILTER function. This ensures that I’m considering all customers when ranking them, but I’m applying the cumulative sum only to those with a rank less than or equal to the current customer’s rank.
Next, your cumulative share should simply divide this cumulative volume by the total volume:
Cumulative_Share =
DIVIDE(
    [Cumulative_Volume],
    // remove the customer filter so the denominator is the grand total, not the current customer's volume
    CALCULATE([Total_Volume], ALL(Dim_Cust)),
    0
)
Finally, let’s categorize the customers into A, B, and C clusters based on their cumulative share. You’re right to use a SWITCH function here:
ABC_Category =
SWITCH(
    TRUE(),
    [Cumulative_Share] <= 0.10, "A",
    [Cumulative_Share] <= 0.20, "B",
    "C"
)
What was happening before is that your cumulative share was being calculated incorrectly, so your clusters were off. Now, the cumulative share will stay within the 0-100% range, and the ABC classification will work as expected. This setup will dynamically update as new transaction data comes in.
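Since the original question also asked what share of the total volume the top 10 customers contribute, here is a possible measure-only sketch reusing the measures above (the name Top10_Share is just an illustrative choice):

Top10_Share =
// the 10 customer keys with the highest per-customer volume
VAR TopCustomers =
    TOPN(10, ALL(Dim_Cust[Key]), [Volume_Per_Customer], DESC)
RETURN
    DIVIDE(
        // volume contributed by just those 10 customers
        CALCULATE([Total_Volume], TopCustomers),
        // grand total across all customers
        CALCULATE([Total_Volume], ALL(Dim_Cust)),
        0
    )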
Best regards,
Hi again,
I added a calculated column with ranks in my customer dimension and changed all the measures, but I can't visualize my data in a table: "The query exceeds the available resources".
Hi All,
Firstly, DataNinja777, thank you for your solution!
And @Theo_Alex, it looks like your problem is that you are exceeding the memory capacity by adding an extra calculated column. In that case you can do the ranking with a measure instead, so that your capacity can keep you up and running. Here's what I've done for your reference.
Customer_Rank =
// measure: rank each customer by [Customer_Transaction_Volume], highest first, dense ranking
RANKX(
    ALL(Dim_Cust),
    [Customer_Transaction_Volume],
    ,
    DESC,
    DENSE
)
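The [Customer_Transaction_Volume] measure referenced above is assumed here to be simply the sum of the transaction volume (RANKX evaluates it per customer through context transition), for example:

Customer_Transaction_Volume =
// per-customer total when evaluated in a customer row context
SUM(Fact_TRX[TRX_Vol])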
Cumulative_Transaction_Volume =
// running total of volume over all customers ranked at or above the current one
VAR CurrentRank = [Customer_Rank]
RETURN
    CALCULATE(
        SUM(Fact_TRX[TRX_Vol]),
        FILTER(
            ALL(Dim_Cust),
            [Customer_Rank] <= CurrentRank
        )
    )
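Building on these two measures, the cumulative share and the ABC classification from the earlier reply can also stay measure-only, so no calculated column is needed. A sketch (the measure names here are just illustrative, and the 10%/20% thresholds are taken from the earlier reply):

Cumulative_Share_M =
DIVIDE(
    [Cumulative_Transaction_Volume],
    // grand total over all customers
    CALCULATE(SUM(Fact_TRX[TRX_Vol]), ALL(Dim_Cust)),
    0
)

ABC_Category_M =
SWITCH(
    TRUE(),
    [Cumulative_Share_M] <= 0.10, "A",
    [Cumulative_Share_M] <= 0.20, "B",
    "C"
)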
If you have any other questions, you can check out the pbix file I uploaded. I hope it helps, and I'd be honored if I could solve your problem!
Hope it helps!
Best regards,
Community Support Team_ Tom Shen
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.
Hi!
Thanks for your effort. The problem is with calculating the Cumulative_Volume:
Cumulative_Volume =
CALCULATE(
    SUM(Fact_TRX[TRX_Vol]),
    FILTER(
        ALL(Dim_Cust),
        [Customer_Rank] <= MAX(Dim_Cust[Customer_Rank])
    )
)
Since I have a lot of customers, Power BI is not able to calculate the rank via a calculated column. That's why I can't use the MAX() formula to calculate the Cumulative_Volume. I always get the error that I don't have sufficient memory 😞