At the moment I have a table of customer cumulations grouped by Cust/Seq/Contract/Part/Date/Release/YYYYMM:
| Cust | Seq | Contract | Part | Date | Release | Rev | YYYYMM | CUMULATION |
|---|---|---|---|---|---|---|---|---|
| CUST1 | 1 | 0035987 | PART1 | 01/12/2020 | 65 | 0 | 202012 | 432 |
| CUST1 | 1 | 0035987 | PART1 | 02/12/2020 | 66 | 0 | 202012 | 432 |
| CUST1 | 1 | 0035987 | PART1 | 03/12/2020 | 67 | 0 | 202012 | 424 |
| CUST1 | 1 | 0035987 | PART1 | 07/12/2020 | 68 | 0 | 202012 | 424 |
| CUST1 | 1 | 0035987 | PART1 | 08/12/2020 | 69 | 0 | 202012 | 432 |
| CUST1 | 1 | 0035987 | PART1 | 09/12/2020 | 70 | 0 | 202012 | 480 |
| CUST1 | 1 | 0035987 | PART1 | 10/12/2020 | 71 | 0 | 202012 | 480 |
| CUST1 | 1 | 0035987 | PART1 | 15/12/2020 | 72 | 0 | 202012 | 480 |
| CUST1 | 1 | 0035987 | PART1 | 16/12/2020 | 73 | 0 | 202012 | 480 |
| CUST1 | 1 | 0035987 | PART1 | 01/12/2020 | 65 | 0 | 202101 | 816 |
| CUST1 | 1 | 0035987 | PART1 | 02/12/2020 | 66 | 0 | 202101 | 816 |
| CUST1 | 1 | 0035987 | PART1 | 03/12/2020 | 67 | 0 | 202101 | 808 |
| CUST1 | 1 | 0035987 | PART1 | 07/12/2020 | 68 | 0 | 202101 | 808 |
| CUST1 | 1 | 0035987 | PART1 | 08/12/2020 | 69 | 0 | 202101 | 816 |
| CUST1 | 1 | 0035987 | PART1 | 09/12/2020 | 70 | 0 | 202101 | 888 |
| CUST1 | 1 | 0035987 | PART1 | 10/12/2020 | 71 | 0 | 202101 | 888 |
| CUST1 | 1 | 0035987 | PART1 | 15/12/2020 | 72 | 0 | 202101 | 888 |
| CUST1 | 1 | 0035987 | PART1 | 16/12/2020 | 73 | 0 | 202101 | 816 |
| CUST1 | 1 | 0035987 | PART1 | 17/12/2020 | 74 | 0 | 202101 | 816 |
| CUST1 | 1 | 0035987 | PART1 | 17/12/2020 | 75 | 0 | 202101 | 816 |
In Power BI the matrix looks like this.
I would like to calculate the variance in Power BI, day by day, within the same Cust/Seq/Contract/Part/Date/Release/YYYYMM group.
Example
| Cust | Seq | Contract | Part | Date | Release | Rev | YYYYMM | CUMULATION | Variance |
|---|---|---|---|---|---|---|---|---|---|
| CUST1 | 1 | 0035987 | PART1 | 01/12/2020 | 65 | 0 | 202012 | 432 | |
| CUST1 | 1 | 0035987 | PART1 | 02/12/2020 | 66 | 0 | 202012 | 432 | 0% |
| CUST1 | 1 | 0035987 | PART1 | 03/12/2020 | 67 | 0 | 202012 | 424 | -2% |
| CUST1 | 1 | 0035987 | PART1 | 07/12/2020 | 68 | 0 | 202012 | 424 | 0% |
| CUST1 | 1 | 0035987 | PART1 | 08/12/2020 | 69 | 0 | 202012 | 432 | 2% |
| CUST1 | 1 | 0035987 | PART1 | 09/12/2020 | 70 | 0 | 202012 | 480 | 11% |
| CUST1 | 1 | 0035987 | PART1 | 10/12/2020 | 71 | 0 | 202012 | 480 | 0% |
| CUST1 | 1 | 0035987 | PART1 | 15/12/2020 | 72 | 0 | 202012 | 480 | 0% |
| CUST1 | 1 | 0035987 | PART1 | 16/12/2020 | 73 | 0 | 202012 | 480 | 0% |
| CUST1 | 1 | 0035987 | PART1 | 01/12/2020 | 65 | 0 | 202101 | 816 | 70% |
| CUST1 | 1 | 0035987 | PART1 | 02/12/2020 | 66 | 0 | 202101 | 816 | 0% |
| CUST1 | 1 | 0035987 | PART1 | 03/12/2020 | 67 | 0 | 202101 | 808 | -1% |
| CUST1 | 1 | 0035987 | PART1 | 07/12/2020 | 68 | 0 | 202101 | 808 | 0% |
| CUST1 | 1 | 0035987 | PART1 | 08/12/2020 | 69 | 0 | 202101 | 816 | 1% |
| CUST1 | 1 | 0035987 | PART1 | 09/12/2020 | 70 | 0 | 202101 | 888 | 9% |
| CUST1 | 1 | 0035987 | PART1 | 10/12/2020 | 71 | 0 | 202101 | 888 | 0% |
| CUST1 | 1 | 0035987 | PART1 | 15/12/2020 | 72 | 0 | 202101 | 888 | 0% |
| CUST1 | 1 | 0035987 | PART1 | 16/12/2020 | 73 | 0 | 202101 | 816 | -8% |
| CUST1 | 1 | 0035987 | PART1 | 17/12/2020 | 74 | 0 | 202101 | 816 | 0% |
| CUST1 | 1 | 0035987 | PART1 | 17/12/2020 | 75 | 0 | 202101 | 816 | 0% |
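As an illustration only (plain Python, not Power BI), the requested column can be stated as: within each Cust/Seq/Contract/Part group, order the rows by YYYYMM then Release and take the percent change of CUMULATION against the previous row. A minimal sketch using rows from the sample above:

```python
# Illustration only: rounded % change of CUMULATION vs. the previous row,
# ordered by YYYYMM then Release, within each Cust/Seq/Contract/Part group.
from itertools import groupby

rows = [
    # (Cust, Seq, Contract, Part, YYYYMM, Release, CUMULATION)
    ("CUST1", 1, "0035987", "PART1", 202012, 65, 432),
    ("CUST1", 1, "0035987", "PART1", 202012, 66, 432),
    ("CUST1", 1, "0035987", "PART1", 202012, 67, 424),
    ("CUST1", 1, "0035987", "PART1", 202012, 68, 424),
    ("CUST1", 1, "0035987", "PART1", 202012, 69, 432),
    ("CUST1", 1, "0035987", "PART1", 202012, 70, 480),
    ("CUST1", 1, "0035987", "PART1", 202012, 71, 480),
    ("CUST1", 1, "0035987", "PART1", 202012, 72, 480),
    ("CUST1", 1, "0035987", "PART1", 202012, 73, 480),
    ("CUST1", 1, "0035987", "PART1", 202101, 65, 816),
]

def variances(rows):
    """Rounded % change of CUMULATION vs. the previous row of each group."""
    out = []
    group_key = lambda r: r[:4]  # Cust / Seq / Contract / Part
    ordered = sorted(rows, key=lambda r: (r[:4], r[4], r[5]))
    for _, grp in groupby(ordered, group_key):
        prev = None
        for r in grp:
            out.append(None if prev is None else round((r[6] - prev) / prev * 100))
            prev = r[6]
    return out

print(variances(rows))  # [None, 0, -2, 0, 2, 11, 0, 0, 0, 70]
```

These values match the Variance column in the example table, including the 70% jump where the 202101 rows pick up from the last 202012 value.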
Your help would be really appreciated.
Thanks
You may try to add a calculated column like this:

```
Variance2 =
VAR _date = [Date]
VAR _cumpre =
    // cumulation of the row dated exactly one day earlier,
    // within the same Cust/Seq/Contract/Part/YYYYMM group
    CALCULATE (
        SUM ( TableName[CUMULATION] ),
        ALLEXCEPT (
            TableName,
            TableName[Cust], TableName[Seq], TableName[Contract],
            TableName[Part], TableName[YYYYMM]
        ),
        TableName[Date] = _date - 1
    )
RETURN
    DIVIDE ( [CUMULATION] - _cumpre, _cumpre )
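One thing to note about the `TableName[Date] = _date - 1` filter: it only matches a row dated exactly one day earlier, so calendar gaps like 03/12 → 07/12 in the sample data produce a blank. A plain-Python restatement of that lookup (illustration only, with a hypothetical date-to-cumulation mapping for one group):

```python
# Illustration of the "Date - 1" lookup: cumulations keyed by date for a
# single Cust/Seq/Contract/Part/YYYYMM group (hypothetical values).
from datetime import date, timedelta

cum = {
    date(2020, 12, 1): 432,
    date(2020, 12, 2): 432,
    date(2020, 12, 3): 424,
    date(2020, 12, 7): 424,  # note the gap: no rows on 4-6 Dec
}

def variance(d):
    prev = cum.get(d - timedelta(days=1))  # TableName[Date] = _date - 1
    return None if prev is None else (cum[d] - prev) / prev  # DIVIDE(...)

print(variance(date(2020, 12, 2)))  # 0.0
print(variance(date(2020, 12, 7)))  # None -- no row dated 6 Dec
```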
I tried it and it worked like this:
Hi,

This calculated column formula works:

```
= DIVIDE (
    [CUMULATION],
    LOOKUPVALUE (
        data[CUMULATION],
        data[Date],
            CALCULATE (
                MAX ( data[Date] ),
                FILTER (
                    data,
                    data[Cust] = EARLIER ( data[Cust] )
                        && data[Seq] = EARLIER ( data[Seq] )
                        && data[Contract] = EARLIER ( data[Contract] )
                        && data[Part] = EARLIER ( data[Part] )
                        && data[Date] < EARLIER ( data[Date] )
                )
            ),
        data[Cust], data[Cust],
        data[Seq], data[Seq],
        data[Contract], data[Contract],
        data[Part], data[Part],
        data[YYYYMM], data[YYYYMM]
    )
) - 1
```

Hope this helps.
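Unlike the "Date − 1" approach, this formula first finds the latest earlier date within the same group, so it works across calendar gaps. The same idea in plain Python (illustration only, hypothetical values):

```python
# Illustration of the LOOKUPVALUE approach: take MAX(Date) among earlier
# rows in the group, then look up that date's cumulation.
from datetime import date

cum = {
    date(2020, 12, 3): 424,
    date(2020, 12, 7): 424,  # gap: no rows on 4-6 Dec
    date(2020, 12, 8): 432,
}

def prev_cum(d):
    """CALCULATE(MAX(data[Date]), ... data[Date] < EARLIER(data[Date]))."""
    earlier = [x for x in cum if x < d]
    return cum[max(earlier)] if earlier else None

def variance(d):
    prev = prev_cum(d)
    return None if prev is None else cum[d] / prev - 1  # DIVIDE(...) - 1

print(variance(date(2020, 12, 7)))            # 0.0, despite the gap
print(round(variance(date(2020, 12, 8)), 4))  # 0.0189 (432/424 - 1)
```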
Using the same table, I would like to find the variance in a different way: the net variance from a previous Release number and month(s). Here is an example of the calculation:
03 June 2022
Variance 202206 = (H6-H5)-P6
Variance 202207 = (I6-I5)-Q6-P6
Variance 202208 = (J6-J5)-R6-Q6-P6
Variance 202209 = (K6-K5)-S6-R6-Q6-P6
Variance 202210 = (L6-L5)-T6-S6-R6-Q6-P6
Variance 202211 = (M6-M5)-U6-T6-S6-R6-Q6-P6
Variance 202212 = (N6-N5)-V6-U6-T6-S6-R6-Q6-P6
Here is what it looks like.
I have no idea what you want. It is as clear as mud.
@UKNSI-Powerbi , create a date table and join it with your date column, then add measures like these:

```
This Day = CALCULATE ( SUM ( 'Table'[Qty] ) )
This Day = CALCULATE ( SUM ( 'Table'[Qty] ), FILTER ( ALL ( 'Date' ), 'Date'[Date] = MAX ( 'Date'[Date] ) ) )
Last Day = CALCULATE ( SUM ( 'Table'[Qty] ), FILTER ( ALL ( 'Date' ), 'Date'[Date] = MAX ( 'Date'[Date] ) - 1 ) )
Last Day = CALCULATE ( SUM ( 'Table'[Qty] ), PREVIOUSDAY ( 'Date'[Date] ) )
```

and take the difference between them.
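As a rough plain-Python illustration (hypothetical Qty values, not the poster's data): "This Day" sums Qty on the latest date in the current filter context, "Last Day" on the day before it, and the variance is their difference:

```python
# Illustration of the "This Day" / "Last Day" measure pair.
from datetime import date, timedelta

qty = {date(2020, 12, 1): 10, date(2020, 12, 2): 14}  # hypothetical data

def this_day(ctx_dates):
    # CALCULATE(SUM(Qty), 'Date'[Date] = MAX('Date'[Date]))
    return qty.get(max(ctx_dates), 0)

def last_day(ctx_dates):
    # CALCULATE(SUM(Qty), 'Date'[Date] = MAX('Date'[Date]) - 1)
    return qty.get(max(ctx_dates) - timedelta(days=1), 0)

ctx = [date(2020, 12, 1), date(2020, 12, 2)]
print(this_day(ctx) - last_day(ctx))  # 14 - 10 = 4
```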
Day Intelligence - Last day, last non-continuous day:
https://medium.com/@amitchandak.1978/power-bi-day-intelligence-questions-time-intelligence-5-5-5c324...