I feel like the solution should be very simple but I'm stumped. Any help please?
I have two matrices (A and B). Each matrix has its own date filter/selection. Through interactions, I've managed to assign separate date filters to each matrix. Depending on the dates selected, I want to calculate the variance between Matrix A and Matrix B.
In the diagram below, Matrix A is set for Qtr 4 2021 (this matrix will always show one date). Matrix B is selected for Qtr 1 and Qtr 2 of 2022 (this matrix will show one or more dates). I am having difficulty calculating the variance in Matrix B. Because I've turned off interactions, I can't get SELECTEDVALUE to work in my measures.
Can I get some help please? Thank you 🙂
@TH_BI , you can try a measure like the one below for the variance:
New Measure =
VAR _max1 = MAXX ( ALLSELECTED ( 'Date' ), 'Date'[Date] )
VAR _min1 = MINX ( ALLSELECTED ( 'Date' ), 'Date'[Date] )
VAR _max2 = EOMONTH ( _max1, 2 )
VAR _min =
    EOMONTH ( _max1, -1 * IF ( MOD ( MONTH ( _max1 ), 3 ) = 0, 3, MOD ( MONTH ( _max1 ), 3 ) ) ) + 1
VAR _max = EOMONTH ( _min, 2 )
RETURN
    CALCULATE ( SUM ( 'Table'[Value] ), FILTER ( 'Table', 'Table'[Date] >= _min && 'Table'[Date] <= _max ) )
        - CALCULATE ( SUM ( 'Table'[Value] ), FILTER ( 'Table', 'Table'[Date] >= _min1 && 'Table'[Date] <= _max2 ) )

Here _min and _max are the boundaries of the calendar quarter containing the latest selected date, and the measure subtracts the total over [_min1, _max2] from the total over that quarter.
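The trickiest part of the measure is the EOMONTH/MOD arithmetic that snaps a date back to the start of its calendar quarter. Here is a minimal Python sketch of that same arithmetic (the function names `eomonth` and `quarter_bounds` are illustrative, not part of any library):

```python
from datetime import date, timedelta
import calendar

def eomonth(d: date, months: int) -> date:
    """Last day of the month `months` months after d (mirrors DAX EOMONTH)."""
    total = d.month - 1 + months
    y, m = d.year + total // 12, total % 12 + 1
    return date(y, m, calendar.monthrange(y, m)[1])

def quarter_bounds(d: date) -> tuple[date, date]:
    """First and last day of the calendar quarter containing d,
    using the same MOD-based offset as the DAX measure:
    step back `offset` months, take end of month, add one day."""
    offset = 3 if d.month % 3 == 0 else d.month % 3
    q_start = eomonth(d, -offset) + timedelta(days=1)   # EOMONTH(_max1, -offset) + 1
    q_end = eomonth(q_start, 2)                         # EOMONTH(_min, 2)
    return q_start, q_end

print(quarter_bounds(date(2022, 5, 15)))  # (date(2022, 4, 1), date(2022, 6, 30))
```

The MOD branch handles the edge case where the month is the last of its quarter (March, June, September, December): there `MONTH % 3` is 0, so the offset is forced to 3 rather than 0.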
Thanks Amit! I'll give it a try.