At the moment I have a table of customer cumulations grouped by Cust/Seq/Contract/Part/Date/Release/YYYYMM
| Cust | Seq | Contract | Part | Date | Release | Rev | YYYYMM | CUMULATION |
|---|---|---|---|---|---|---|---|---|
| CUST1 | 1 | 0035987 | PART1 | 01/12/2020 | 65 | 0 | 202012 | 432 |
| CUST1 | 1 | 0035987 | PART1 | 02/12/2020 | 66 | 0 | 202012 | 432 |
| CUST1 | 1 | 0035987 | PART1 | 03/12/2020 | 67 | 0 | 202012 | 424 |
| CUST1 | 1 | 0035987 | PART1 | 07/12/2020 | 68 | 0 | 202012 | 424 |
| CUST1 | 1 | 0035987 | PART1 | 08/12/2020 | 69 | 0 | 202012 | 432 |
| CUST1 | 1 | 0035987 | PART1 | 09/12/2020 | 70 | 0 | 202012 | 480 |
| CUST1 | 1 | 0035987 | PART1 | 10/12/2020 | 71 | 0 | 202012 | 480 |
| CUST1 | 1 | 0035987 | PART1 | 15/12/2020 | 72 | 0 | 202012 | 480 |
| CUST1 | 1 | 0035987 | PART1 | 16/12/2020 | 73 | 0 | 202012 | 480 |
| CUST1 | 1 | 0035987 | PART1 | 01/12/2020 | 65 | 0 | 202101 | 816 |
| CUST1 | 1 | 0035987 | PART1 | 02/12/2020 | 66 | 0 | 202101 | 816 |
| CUST1 | 1 | 0035987 | PART1 | 03/12/2020 | 67 | 0 | 202101 | 808 |
| CUST1 | 1 | 0035987 | PART1 | 07/12/2020 | 68 | 0 | 202101 | 808 |
| CUST1 | 1 | 0035987 | PART1 | 08/12/2020 | 69 | 0 | 202101 | 816 |
| CUST1 | 1 | 0035987 | PART1 | 09/12/2020 | 70 | 0 | 202101 | 888 |
| CUST1 | 1 | 0035987 | PART1 | 10/12/2020 | 71 | 0 | 202101 | 888 |
| CUST1 | 1 | 0035987 | PART1 | 15/12/2020 | 72 | 0 | 202101 | 888 |
| CUST1 | 1 | 0035987 | PART1 | 16/12/2020 | 73 | 0 | 202101 | 816 |
| CUST1 | 1 | 0035987 | PART1 | 17/12/2020 | 74 | 0 | 202101 | 816 |
| CUST1 | 1 | 0035987 | PART1 | 17/12/2020 | 75 | 0 | 202101 | 816 |
In Power BI the matrix looks like this:
I would like to calculate, in Power BI, the day-by-day variance of CUMULATION within the same group (Cust/Seq/Contract/Part/Date/Release/YYYYMM), i.e. each day's cumulation compared with the previous day's: for example, 424 / 432 - 1 ≈ -2%.
Example:
| Cust | Seq | Contract | Part | Date | Release | Rev | YYYYMM | CUMULATION | Variance |
|---|---|---|---|---|---|---|---|---|---|
| CUST1 | 1 | 0035987 | PART1 | 01/12/2020 | 65 | 0 | 202012 | 432 | |
| CUST1 | 1 | 0035987 | PART1 | 02/12/2020 | 66 | 0 | 202012 | 432 | 0% |
| CUST1 | 1 | 0035987 | PART1 | 03/12/2020 | 67 | 0 | 202012 | 424 | -2% |
| CUST1 | 1 | 0035987 | PART1 | 07/12/2020 | 68 | 0 | 202012 | 424 | 0% |
| CUST1 | 1 | 0035987 | PART1 | 08/12/2020 | 69 | 0 | 202012 | 432 | 2% |
| CUST1 | 1 | 0035987 | PART1 | 09/12/2020 | 70 | 0 | 202012 | 480 | 11% |
| CUST1 | 1 | 0035987 | PART1 | 10/12/2020 | 71 | 0 | 202012 | 480 | 0% |
| CUST1 | 1 | 0035987 | PART1 | 15/12/2020 | 72 | 0 | 202012 | 480 | 0% |
| CUST1 | 1 | 0035987 | PART1 | 16/12/2020 | 73 | 0 | 202012 | 480 | 0% |
| CUST1 | 1 | 0035987 | PART1 | 01/12/2020 | 65 | 0 | 202101 | 816 | 70% |
| CUST1 | 1 | 0035987 | PART1 | 02/12/2020 | 66 | 0 | 202101 | 816 | 0% |
| CUST1 | 1 | 0035987 | PART1 | 03/12/2020 | 67 | 0 | 202101 | 808 | -1% |
| CUST1 | 1 | 0035987 | PART1 | 07/12/2020 | 68 | 0 | 202101 | 808 | 0% |
| CUST1 | 1 | 0035987 | PART1 | 08/12/2020 | 69 | 0 | 202101 | 816 | 1% |
| CUST1 | 1 | 0035987 | PART1 | 09/12/2020 | 70 | 0 | 202101 | 888 | 9% |
| CUST1 | 1 | 0035987 | PART1 | 10/12/2020 | 71 | 0 | 202101 | 888 | 0% |
| CUST1 | 1 | 0035987 | PART1 | 15/12/2020 | 72 | 0 | 202101 | 888 | 0% |
| CUST1 | 1 | 0035987 | PART1 | 16/12/2020 | 73 | 0 | 202101 | 816 | -8% |
| CUST1 | 1 | 0035987 | PART1 | 17/12/2020 | 74 | 0 | 202101 | 816 | 0% |
| CUST1 | 1 | 0035987 | PART1 | 17/12/2020 | 75 | 0 | 202101 | 816 | 0% |
Your help will be really appreciated.
Thanks
You may try to add a calculated column like this:

```
Variance2 =
VAR _date = TableName[Date]
VAR _cumpre =
    CALCULATE (
        SUM ( TableName[CUMULATION] ),
        ALLEXCEPT ( TableName, TableName[Cust], TableName[Seq], TableName[Contract], TableName[Part], TableName[YYYYMM] ),
        TableName[Date] = _date - 1
    )
RETURN
    DIVIDE ( TableName[CUMULATION] - _cumpre, _cumpre )
```
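Note that `_date - 1` only finds a previous row when the dates are consecutive, and the sample data has gaps (e.g. 03/12 jumps to 07/12). A gap-safe sketch of the same idea, assuming your table really is called `TableName` and keeping the same grouping columns, first looks up the latest earlier date in the group:

```
Variance (gap-safe) =
VAR _date = TableName[Date]
// latest date in the same Cust/Seq/Contract/Part/YYYYMM group that is strictly earlier
VAR _prevDate =
    CALCULATE (
        MAX ( TableName[Date] ),
        ALLEXCEPT ( TableName, TableName[Cust], TableName[Seq], TableName[Contract], TableName[Part], TableName[YYYYMM] ),
        TableName[Date] < _date
    )
// cumulation on that previous date (BLANK on the first row of a group)
VAR _cumpre =
    CALCULATE (
        SUM ( TableName[CUMULATION] ),
        ALLEXCEPT ( TableName, TableName[Cust], TableName[Seq], TableName[Contract], TableName[Part], TableName[YYYYMM] ),
        TableName[Date] = _prevDate
    )
RETURN
    DIVIDE ( TableName[CUMULATION] - _cumpre, _cumpre )
```

On the first row of each group `_prevDate` is blank, so the variance returns blank, matching the blank first cell in the expected output.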
I tried it and it worked.
Hi,
This calculated column formula works:

```
=
DIVIDE (
    [CUMULATION],
    LOOKUPVALUE (
        data[CUMULATION],
        data[Date],
            CALCULATE (
                MAX ( data[Date] ),
                FILTER (
                    data,
                    data[Cust] = EARLIER ( data[Cust] )
                        && data[Seq] = EARLIER ( data[Seq] )
                        && data[Contract] = EARLIER ( data[Contract] )
                        && data[Part] = EARLIER ( data[Part] )
                        && data[Date] < EARLIER ( data[Date] )
                )
            ),
        data[Cust], data[Cust],
        data[Seq], data[Seq],
        data[Contract], data[Contract],
        data[Part], data[Part],
        data[YYYYMM], data[YYYYMM]
    )
) - 1
```
Hope this helps.
Using the same table, I would like to find the variance in a different way: the net variance from a previous release number and month(s). Here is an example of the calculation (the cell references are from the Excel sheet):
03 June 2022
Variance 202206 = (H6-H5)-P6
Variance 202207 = (I6-I5)-Q6-P6
Variance 202208 = (J6-J5)-R6-Q6-P6
Variance 202209 = (K6-K5)-S6-R6-Q6-P6
Variance 202210 = (L6-L5)-T6-S6-R6-Q6-P6
Variance 202211 = (M6-M5)-U6-T6-S6-R6-Q6-P6
Variance 202212 = (N6-N5)-V6-U6-T6-S6-R6-Q6-P6
Here is how it looks.
I have no idea what you want. I am as clear as mud.
@UKNSI-Powerbi, create a Date table and join it to your date column,
then add measures like these (two alternative definitions are shown for each):

```
This Day = CALCULATE ( SUM ( 'Table'[Qty] ) )
This Day = CALCULATE ( SUM ( 'Table'[Qty] ), FILTER ( ALL ( 'Date' ), 'Date'[Date] = MAX ( 'Date'[Date] ) ) )
Last Day = CALCULATE ( SUM ( 'Table'[Qty] ), FILTER ( ALL ( 'Date' ), 'Date'[Date] = MAX ( 'Date'[Date] ) - 1 ) )
Last Day = CALCULATE ( SUM ( 'Table'[Qty] ), PREVIOUSDAY ( 'Date'[Date] ) )
```

and take the difference.
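Taking that diff as a percentage variance can be sketched like this (a sketch assuming the `This Day` / `Last Day` measures above exist in your model):

```
Day over Day % =
VAR _this = [This Day]
VAR _last = [Last Day]
RETURN
    // blank when there is no prior day, matching the blank first row
    DIVIDE ( _this - _last, _last )
```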
Day Intelligence - Last day, last non continous day
https://medium.com/@amitchandak.1978/power-bi-day-intelligence-questions-time-intelligence-5-5-5c324...