
UKNSI-Powerbi
Frequent Visitor

Row-level variance

At the moment I have a table of customer cumulations, grouped by Cust/Seq/Contract/Part/Date/Release/YYYYMM:

 

Cust  | Seq | Contract | Part  | Date       | Release Rev | YYYYMM | CUMULATION
CUST1 | 1   | 0035987  | PART1 | 01/12/2020 | 65          | 202012 | 432
CUST1 | 1   | 0035987  | PART1 | 02/12/2020 | 66          | 202012 | 432
CUST1 | 1   | 0035987  | PART1 | 03/12/2020 | 67          | 202012 | 424
CUST1 | 1   | 0035987  | PART1 | 07/12/2020 | 68          | 202012 | 424
CUST1 | 1   | 0035987  | PART1 | 08/12/2020 | 69          | 202012 | 432
CUST1 | 1   | 0035987  | PART1 | 09/12/2020 | 70          | 202012 | 480
CUST1 | 1   | 0035987  | PART1 | 10/12/2020 | 71          | 202012 | 480
CUST1 | 1   | 0035987  | PART1 | 15/12/2020 | 72          | 202012 | 480
CUST1 | 1   | 0035987  | PART1 | 16/12/2020 | 73          | 202012 | 480
CUST1 | 1   | 0035987  | PART1 | 01/12/2020 | 65          | 202101 | 816
CUST1 | 1   | 0035987  | PART1 | 02/12/2020 | 66          | 202101 | 816
CUST1 | 1   | 0035987  | PART1 | 03/12/2020 | 67          | 202101 | 808
CUST1 | 1   | 0035987  | PART1 | 07/12/2020 | 68          | 202101 | 808
CUST1 | 1   | 0035987  | PART1 | 08/12/2020 | 69          | 202101 | 816
CUST1 | 1   | 0035987  | PART1 | 09/12/2020 | 70          | 202101 | 888
CUST1 | 1   | 0035987  | PART1 | 10/12/2020 | 71          | 202101 | 888
CUST1 | 1   | 0035987  | PART1 | 15/12/2020 | 72          | 202101 | 888
CUST1 | 1   | 0035987  | PART1 | 16/12/2020 | 73          | 202101 | 816
CUST1 | 1   | 0035987  | PART1 | 17/12/2020 | 74          | 202101 | 816
CUST1 | 1   | 0035987  | PART1 | 17/12/2020 | 75          | 202101 | 816

 

In Power BI the matrix looks like this:

UKNSIPowerbi_0-1673004647060.png

 

 

 

I would like to be able to calculate the variance in Power BI, day by day, for the same Cust/Seq/Contract/Part/Date/Release/YYYYMM group.

 

Example 

Cust  | Seq | Contract | Part  | Date       | Release Rev | YYYYMM | CUMULATION | Variance
CUST1 | 1   | 0035987  | PART1 | 01/12/2020 | 65          | 202012 | 432        |
CUST1 | 1   | 0035987  | PART1 | 02/12/2020 | 66          | 202012 | 432        | 0%
CUST1 | 1   | 0035987  | PART1 | 03/12/2020 | 67          | 202012 | 424        | -2%
CUST1 | 1   | 0035987  | PART1 | 07/12/2020 | 68          | 202012 | 424        | 0%
CUST1 | 1   | 0035987  | PART1 | 08/12/2020 | 69          | 202012 | 432        | 2%
CUST1 | 1   | 0035987  | PART1 | 09/12/2020 | 70          | 202012 | 480        | 11%
CUST1 | 1   | 0035987  | PART1 | 10/12/2020 | 71          | 202012 | 480        | 0%
CUST1 | 1   | 0035987  | PART1 | 15/12/2020 | 72          | 202012 | 480        | 0%
CUST1 | 1   | 0035987  | PART1 | 16/12/2020 | 73          | 202012 | 480        | 0%
CUST1 | 1   | 0035987  | PART1 | 01/12/2020 | 65          | 202101 | 816        | 70%
CUST1 | 1   | 0035987  | PART1 | 02/12/2020 | 66          | 202101 | 816        | 0%
CUST1 | 1   | 0035987  | PART1 | 03/12/2020 | 67          | 202101 | 808        | -1%
CUST1 | 1   | 0035987  | PART1 | 07/12/2020 | 68          | 202101 | 808        | 0%
CUST1 | 1   | 0035987  | PART1 | 08/12/2020 | 69          | 202101 | 816        | 1%
CUST1 | 1   | 0035987  | PART1 | 09/12/2020 | 70          | 202101 | 888        | 9%
CUST1 | 1   | 0035987  | PART1 | 10/12/2020 | 71          | 202101 | 888        | 0%
CUST1 | 1   | 0035987  | PART1 | 15/12/2020 | 72          | 202101 | 888        | 0%
CUST1 | 1   | 0035987  | PART1 | 16/12/2020 | 73          | 202101 | 816        | -8%
CUST1 | 1   | 0035987  | PART1 | 17/12/2020 | 74          | 202101 | 816        | 0%
CUST1 | 1   | 0035987  | PART1 | 17/12/2020 | 75          | 202101 | 816        | 0%
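In other words, Variance = (current CUMULATION - previous CUMULATION) / previous CUMULATION, comparing each row with the one before it: for example, 09/12/2020 in 202012 gives (480 - 432) / 432 ≈ 11%, while the first 202101 row is compared with the last 202012 value, (816 - 480) / 480 = 70%.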

 

Your help would be really appreciated.

Thanks 

1 ACCEPTED SOLUTION
FreemanZ
Super User

hi @UKNSI-Powerbi 

you may try to add a calculated column like this:

Variance2 = 
// current row's date
VAR _date = [Date]
// CUMULATION of the previous calendar day within the same Cust/Seq/Contract/Part/YYYYMM group
VAR _cumpre =
    CALCULATE(
        SUM(TableName[CUMULATION]),
        ALLEXCEPT(TableName, TableName[Cust], TableName[Seq], TableName[Contract], TableName[Part], TableName[YYYYMM]),
        TableName[Date] = _date - 1
    )
RETURN
    // day-on-day percentage change; blank when there is no previous-day row
    DIVIDE( [CUMULATION] - _cumpre, _cumpre )

 

I tried it and it worked like this:

FreemanZ_0-1673071444818.png
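Note that the sample dates are not consecutive (03/12/2020 is followed by 07/12/2020), so Date - 1 finds no previous row across such gaps. If the comparison should instead be against the latest earlier date in the group, a variation of the same column could look like the sketch below (illustrative only, with an assumed name Variance3 and the same TableName columns as above):

Variance3 = 
VAR _date = TableName[Date]
// latest earlier Date within the same Cust/Seq/Contract/Part/YYYYMM group
VAR _prevDate =
    CALCULATE(
        MAX(TableName[Date]),
        ALLEXCEPT(TableName, TableName[Cust], TableName[Seq], TableName[Contract], TableName[Part], TableName[YYYYMM]),
        TableName[Date] < _date
    )
// CUMULATION on that previous date
VAR _cumpre =
    CALCULATE(
        SUM(TableName[CUMULATION]),
        ALLEXCEPT(TableName, TableName[Cust], TableName[Seq], TableName[Contract], TableName[Part], TableName[YYYYMM]),
        TableName[Date] = _prevDate
    )
RETURN
    // blank for the first date of each group
    DIVIDE(TableName[CUMULATION] - _cumpre, _cumpre)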


5 REPLIES
Ashish_Mathur
Super User

Hi,

This calculated column formula works

=DIVIDE(
    [CUMULATION],
    LOOKUPVALUE(
        data[CUMULATION],
        data[Date],
        CALCULATE(
            MAX(data[Date]),
            FILTER(
                data,
                data[Cust] = EARLIER(data[Cust])
                    && data[Seq] = EARLIER(data[Seq])
                    && data[Contract] = EARLIER(data[Contract])
                    && data[Part] = EARLIER(data[Part])
                    && data[Date] < EARLIER(data[Date])
            )
        ),
        data[Cust], data[Cust],
        data[Seq], data[Seq],
        data[Contract], data[Contract],
        data[Part], data[Part],
        data[YYYYMM], data[YYYYMM]
    )
) - 1
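LOOKUPVALUE here returns the CUMULATION of the latest earlier Date for the same Cust/Seq/Contract/Part (matched within the same YYYYMM), so it also works across gaps such as weekends; dividing the current CUMULATION by that value and subtracting 1 gives the day-on-day percentage change.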

Hope this helps.

Untitled.png


Regards,
Ashish Mathur
http://www.ashishmathur.com
https://www.linkedin.com/in/excelenthusiasts/

UKNSI-Powerbi
Frequent Visitor

Using the same table, I would like to find the variance in a different way: the net variance from a previous Release No. and month(s). Here is an example of the calculation:

 

03 June 2022

Variance 202206 = (H6-H5)-P6

Variance 202207 = (I6-I5)-Q6-P6

Variance 202208 = (J6-J5)-R6-Q6-P6

Variance 202209 = (K6-K5)-S6-R6-Q6-P6

Variance 202210 = (L6-L5)-T6-S6-R6-Q6-P6

Variance 202211 = (M6-M5)-U6-T6-S6-R6-Q6-P6

Variance 202212 = (N6-N5)-V6-U6-T6-S6-R6-Q6-P6

 

Here is how it looks:

 

 

UKNSIPowerbi_0-1673453201569.png

 

Ashish_Mathur
Super User

I have no idea of what you want. I am as clear as mud.


Regards,
Ashish Mathur
http://www.ashishmathur.com
https://www.linkedin.com/in/excelenthusiasts/

amitchandak
Super User

@UKNSI-Powerbi , create a date table and join it with your date column.
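For example, a minimal date table can be created as a calculated table along these lines (a sketch; 'Table'[Date] stands for your own date column) and then related to the fact table on the date:

Date = 
// one row per calendar day covering the fact table's date range
CALENDAR( MIN('Table'[Date]), MAX('Table'[Date]) )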

 

Then create measures like these:

// two variants of each measure are shown - use one; 'Table'[Qty] stands for the value column (e.g. CUMULATION)
This Day = CALCULATE(sum('Table'[Qty]))

// alternative: the sum for the latest date visible in the current context
This Day = CALCULATE(sum('Table'[Qty]), FILTER(ALL('Date'),'Date'[Date]=max('Date'[Date])))

// the sum for the day before the latest visible date
Last Day = CALCULATE(sum('Table'[Qty]), FILTER(ALL('Date'),'Date'[Date]=max('Date'[Date])-1))

// alternative using the PREVIOUSDAY time-intelligence function
Last Day = CALCULATE(sum('Table'[Qty]), previousday('Date'[Date]))

 

and take the difference between them.
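For example, the difference can be wrapped into a single measure along these lines (a sketch with an assumed name, reusing the generic 'Table'[Qty] and 'Date' names from the measures above):

Qty vs Last Day % = 
VAR _thisDay = CALCULATE(sum('Table'[Qty]), FILTER(ALL('Date'),'Date'[Date]=max('Date'[Date])))
VAR _lastDay = CALCULATE(sum('Table'[Qty]), FILTER(ALL('Date'),'Date'[Date]=max('Date'[Date])-1))
RETURN
    // blank when there is no previous-day value
    DIVIDE(_thisDay - _lastDay, _lastDay)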

Day Intelligence - Last day, last non-continuous day
https://medium.com/@amitchandak.1978/power-bi-day-intelligence-questions-time-intelligence-5-5-5c324...

