Hello -
I am trying to conditionally format values in the cells of a matrix visual based on a summarization of the entire range of cells. In my specific case the columns of the matrix visual are weeks of the year, which I want to compare against the standard deviation across the entire range of data, highlighting a cell's color when its value falls outside the std dev.
The measure I've created to calculate the std dev across the range is:
```
stddv_time_bound =
CALCULATE (
    STDEVX.S (
        SUMMARIZE (
            table1,
            table1[week],
            "Change",
                CALCULATE ( SUM ( table1[sum(column_values)] ), table1[column_names] = "forecast 1" )
                    - CALCULATE ( SUM ( table1[sum(column_values)] ), table1[column_names] = "forecast2" )
        ),
        [Change]
    ),
    DATESINPERIOD ( table1[week], TODAY (), 12, MONTH ),
    ALL ( table1[week] )
)
```
I use this in the measure below to create the color coding:
```
Color Code =
SWITCH (
    TRUE (),
    [Change Forecast] >= [stddv_time_bound], "#de6a73",
    "#B7e8a2"
)
```
The issue is that the visual applies the std dev measure against each individual week only, which returns NaN because a single week has no variability.
Is there a way to remove the inherent filter applied by the visual?
Thanks in advance for any help.
Solved!
Read about HASONEVALUE and REMOVEFILTERS
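Building on that answer, here is one hedged sketch of how those two functions could be combined. The measure names and table/column names are taken from the question, and `stddev_all_weeks` is a hypothetical helper measure; this has not been verified against the actual model:

```
-- Sketch only: REMOVEFILTERS clears the matrix's per-week filter so the
-- std dev measure is evaluated across the whole range, not a single week.
stddev_all_weeks =
CALCULATE (
    [stddv_time_bound],
    REMOVEFILTERS ( table1[week] )
)

-- HASONEVALUE guards against totals/multi-week contexts: the cell is only
-- colored when exactly one week is in context, comparing that week's change
-- against the range-wide std dev.
Color Code =
IF (
    HASONEVALUE ( table1[week] ),
    SWITCH (
        TRUE (),
        [Change Forecast] >= [stddev_all_weeks], "#de6a73",
        "#B7e8a2"
    )
)
```

The key idea is that conditional-formatting measures run in each cell's filter context, so any range-wide statistic must explicitly remove the filters the visual adds.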
Very helpful. Thank you