Hello -

I am trying to conditionally format the cells of a matrix visual based on a summarization of the entire range of cells. In my case the columns of the matrix are weeks of the year, which I want to compare against the standard deviation across the entire range of data, highlighting a cell if its value falls outside the standard deviation.

The measure I've created to calculate the std dev across the range is:
stddev_time_bound =
CALCULATE (
    STDEVX.S (
        SUMMARIZE (
            table1,
            table1[week],
            "Change",
                CALCULATE ( SUM ( table1[column_values] ), table1[column_names] = "forecast 1" )
                    - CALCULATE ( SUM ( table1[column_values] ), table1[column_names] = "forecast2" )
        ),
        [Change]
    ),
    DATESINPERIOD ( table1[week], TODAY (), 12, MONTH ),
    ALL ( table1[week] )
)
I use this in the measure below to create the color coding:
Color Code =
SWITCH (
    TRUE (),
    [Change Forecast] >= [stddev_time_bound], "#de6a73",
    "#B7e8a2"
)
The issue is that the std dev measure is evaluated in the visual against each individual week, which returns NaN because a single week has no variability.
Is there a way to remove the inherent filter applied by the visual?
Thanks in advance for any help
Read about HASONEVALUE and REMOVEFILTERS
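Building on that hint, here is a minimal sketch, assuming the table and column names from the question (table1, table1[week], table1[column_names], table1[column_values]) and the measure names shown above: REMOVEFILTERS clears the per-week filter the matrix places on each cell, so the standard deviation is computed over all weeks, and HASONEVALUE restricts the coloring to detail cells.

```dax
-- Sketch only; names are taken from the question and may differ in your model.
stddev_all_weeks =
CALCULATE (
    STDEVX.S (
        SUMMARIZE ( table1, table1[week] ),
        -- Context transition: each week in turn filters the two forecast sums
        CALCULATE ( SUM ( table1[column_values] ), table1[column_names] = "forecast 1" )
            - CALCULATE ( SUM ( table1[column_values] ), table1[column_names] = "forecast2" )
    ),
    -- Remove the single-week filter the matrix applies to each cell
    REMOVEFILTERS ( table1[week] )
)

Color Code =
IF (
    -- Only color cells where exactly one week is in context (skip totals)
    HASONEVALUE ( table1[week] ),
    IF ( [Change Forecast] >= [stddev_all_weeks], "#de6a73", "#B7e8a2" )
)
```

Apply the Color Code measure via conditional formatting on the matrix value (Background color, Format style: Field value).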
Very helpful. Thank you