I feel like the solution should be very simple but I'm stumped. Any help please?
I have two matrices (A and B). Each matrix has its own date filter/selection. Through interactions, I've managed to assign separate date filters to each matrix. Depending on the dates selected, I want to calculate the variance between Matrix A and Matrix B.
In the diagram below, Matrix A is set to Qtr 4 2021 (this matrix will always show one date). Matrix B is set to Qtr 1 and Qtr 2 of 2022 (this matrix will show one or more dates). I am having difficulty calculating the variance in Matrix B. Because I've turned off interactions, I can't get SELECTEDVALUE to work in my measures.
Can I get some help please? Thank you 🙂
@TH_BI, you can try the measure below for the variance:
New Measure =
// Bounds of the dates currently selected for this visual
VAR _max1 = MAXX ( ALLSELECTED ( 'Date' ), 'Date'[Date] )
VAR _min1 = MINX ( ALLSELECTED ( 'Date' ), 'Date'[Date] )
// End of the second month after the latest selected date
VAR _max2 = EOMONTH ( _max1, 2 )
// First and last day of the calendar quarter containing the latest selected date
VAR _min = EOMONTH ( _max1, -1 * IF ( MOD ( MONTH ( _max1 ), 3 ) = 0, 3, MOD ( MONTH ( _max1 ), 3 ) ) ) + 1
VAR _max = EOMONTH ( _min, 2 )
RETURN
CALCULATE ( SUM ( 'Table'[Value] ), FILTER ( 'Table', 'Table'[Date] >= _min && 'Table'[Date] <= _max ) )
    - CALCULATE ( SUM ( 'Table'[Value] ), FILTER ( 'Table', 'Table'[Date] >= _min1 && 'Table'[Date] <= _max2 ) )
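The trickiest part of the measure is the `EOMONTH`/`MOD` expression that finds the quarter containing the latest selected date. As a sanity check, here is a minimal Python sketch of that same quarter-boundary arithmetic (the function name `quarter_bounds` is just for illustration, not part of the DAX above):

```python
from datetime import date
import calendar

def quarter_bounds(d: date) -> tuple[date, date]:
    """Mirror the DAX EOMONTH/MOD logic: return the first and last
    day of the calendar quarter containing d."""
    # Months to step back: 1, 2, or 3 depending on position in the quarter
    # (MOD(MONTH, 3) in DAX, with 0 mapped to 3)
    step = d.month % 3 or 3
    start_month = d.month - step + 1  # first month of the quarter
    start = date(d.year, start_month, 1)
    end_month = start_month + 2       # third month of the quarter
    end = date(d.year, end_month, calendar.monthrange(d.year, end_month)[1])
    return start, end
```

For example, any date in May 2022 should yield 2022-04-01 through 2022-06-30, matching what `_min` and `_max` evaluate to in the measure.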
Thanks Amit! I'll give it a try.