I'm working on a dashboard that shows the time allocated to specific applications against the average time spent on previously completed jobs.
I have a calculated table "Calculated Time Total" which summarizes the timesheet tables into the following table format:
| Application_Key | Total Time |
|---|---|
| 9 | 15.49 |
| 38 | 6.78 |
| 105 | 4.10 |
| 123 | 14.28 |
I have created a measure with the following DAX code:

Average Time = AVERAGE('Calculated Time Total'[Total Time])
I then created a further measure called "Average Flag", intended to categorise a pie chart on the visual into over- and under-average jobs, with the following DAX:
AverageFlag =
IF(
    MAX('Calculated Time Total'[Total Time]) > [Average Time],
    "Over",
    "Under"
)
Everything parses fine, but when I load the measures into a table visual, Average Time shows a value only on the first entry. No matter which application I look at, AverageFlag seems to compare each row's total against the average of its own total rather than against the column total.
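The symptom described above can be reproduced outside Power BI. This is a minimal Python sketch (illustrative names only, not part of the report) of what happens when the average is evaluated inside each row's own filter context: only that row is visible, so the "average" collapses to the row's own value and the comparison `value > value` is never true.

```python
# Sample data from the question: Application_Key -> Total Time
times = {9: 15.49, 38: 6.78, 105: 4.10, 123: 14.28}

def average_flag_per_row(total_time):
    """Simulates [AverageFlag] when [Average Time] is evaluated in the
    row's own filter context: the average of a single visible row is
    just that row's value, so MAX(...) > average is always False."""
    avg_in_row_context = total_time  # AVERAGE over one visible row
    return "Over" if total_time > avg_in_row_context else "Under"

flags = {key: average_flag_per_row(t) for key, t in times.items()}
# Every application gets the same flag, consistent with each row
# being compared against itself rather than the column average.
```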
I'm at a point where I can't figure out what to try next.
Note that not all applications will have timesheets loaded against them so there are blanks in the data set.
Any help appreciated.
Solved!
Hi @SimeonMcLean ,
Thank you for reaching out to the Microsoft Fabric community forum. Thanks also to @Jihwan_Kim for your input.
Based on your description and the visual, the [Average Time] measure is being calculated within the row context of your table visual, which is why it's returning only one value or behaving inconsistently. As a result, the [AverageFlag] measure is comparing each row against an incorrect average.
To fix this, you need to calculate the average across the full dataset (or current selection), not per row. Please update your [Average Time] measure to the following:
Average Time =
AVERAGEX(
    ALLSELECTED('Calculated Time Total'),
    'Calculated Time Total'[Total Time]
)
This ensures the average is calculated once over the entire selected data, so your [AverageFlag] logic will work correctly:
AverageFlag =
IF(
    MAX('Calculated Time Total'[Total Time]) > [Average Time],
    "Over",
    "Under"
)
This should give you the expected result in your visuals comparing each application's time against the overall average.
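As a quick sanity check, the corrected logic can be simulated outside Power BI. This is a minimal Python sketch using the sample rows posted in the question (the names are illustrative; the point is that the average is computed once over all rows, independently of the visual's row context, the way `AVERAGEX` over `ALLSELECTED` does):

```python
# Sample data from the question: Application_Key -> Total Time
times = {9: 15.49, 38: 6.78, 105: 4.10, 123: 14.28}

# One average across all (selected) rows, ignoring the visual's
# row context -- analogous to AVERAGEX over ALLSELECTED.
overall_avg = sum(times.values()) / len(times)  # 10.1625

# Each row is now compared against the same shared average.
flags = {key: ("Over" if t > overall_avg else "Under")
         for key, t in times.items()}
# -> {9: 'Over', 38: 'Under', 105: 'Under', 123: 'Over'}
```

With the shared average of 10.1625, applications 9 and 123 land above it and 38 and 105 below, which is the split the pie chart should show.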
Hope this helps. Please reach out for further assistance.
If this post helps, please consider accepting it as the solution to help other members find it more quickly. A kudos would also be appreciated.
Thank you.
Hi @SimeonMcLean ,
May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will be helpful for other community members who have similar problems to solve it faster.
Thank you.
Hi @SimeonMcLean ,
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.
Thank you.
Hi,
I am not sure what your semantic model looks like, but please try something like the below for the average calculation.
Average Time =
AVERAGEX(
    ALLSELECTED('Calculated Time Total'),
    'Calculated Time Total'[Total Time]
)