Hello,
I am trying to conditionally format the cells of a matrix visual based on a summarization of the entire range of cells. In my case the columns of the matrix are weeks of the year, and I want to compare each cell's value against the standard deviation across the entire range of data, highlighting the cell if the value falls outside the standard deviation.
The measure I've created to calculate the standard deviation across the range is:
stddev_time_bound =
CALCULATE(
    STDEVX.S(
        SUMMARIZE(
            table1,
            table1[week],
            "Change",
                CALCULATE( SUM( table1[sum(column_values)] ), table1[column_names] = "forecast 1" )
                - CALCULATE( SUM( table1[sum(column_values)] ), table1[column_names] = "forecast2" )
        ),
        [Change]
    ),
    DATESINPERIOD( table1[week], TODAY(), 12, MONTH ),
    ALL( table1[week] )
)
I then use this in the measure below to create the color coding:
Color Code =
SWITCH(
    TRUE(),
    [Change Forecast] >= [stddev_time_bound], "#de6a73",
    "#B7e8a2"
)
The issue is that the standard deviation measure is evaluated in the visual against each individual week, which returns NaN because a single week has no variability.
Is there a way to remove the filter the visual implicitly applies?
Thanks in advance for any help.
Solved!
Read about HASONEVALUE and REMOVEFILTERS.
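A minimal sketch of how those two functions could combine here, reusing the measure and column names from the question (the wrapper measure name `stddev_all_weeks` is hypothetical):

```dax
-- Hypothetical wrapper: evaluate the std-dev measure with the visual's
-- week filter removed, so it is computed over the whole range of weeks.
stddev_all_weeks =
CALCULATE (
    [stddev_time_bound],
    REMOVEFILTERS ( table1[week] )
)

-- Only color-code cells where exactly one week is in context
-- (skips totals, where HASONEVALUE returns FALSE).
Color Code =
IF (
    HASONEVALUE ( table1[week] ),
    IF ( [Change Forecast] >= [stddev_all_weeks], "#de6a73", "#B7e8a2" )
)
```

REMOVEFILTERS(table1[week]) clears the single-week filter the matrix applies to each cell, so STDEVX.S again iterates the full set of weeks instead of one row; HASONEVALUE guards the comparison so it only runs at the week level.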
Very helpful. Thank you