Let me first try to explain what I'm working with: I have two important columns for a Waterfall chart I need to put on the report, 'Tasks' and 'Time'. The main goal is simple: show the average time each piece of equipment spends on each task, each day (I also have a 'Date' table, but that relationship is properly set up... I think). 'Time' just records the amount of time spent on an entry each time you do it. For instance, if you have a 'Task' called 'Forward' and you move forward for 5s, you'd get a 'Time' entry of 5. If you then move forward for another 5s, you'd have two 'Forward' tasks of 5s each, amounting to a total of 10s that day, for an average of 5.
So, now that you hopefully understand the basics of what I'm working with, which should be simple enough, here's what I'm having trouble with: each 'Task' is categorized under one of three categories: Operation, Standby, or Unavailable. I have grouped the tasks into these categories, and I have a hierarchy where the tasks sit under the proper categories. My problem is: if I use 'Average of Time' as the Y-axis and the hierarchy as the category, it works when you drill down to the tasks, but if you drill up to the category groups, it shows a completely different number, and I have no idea why, or how to fix it. I've tried grouping them a few different ways, but all end with the same unwanted result of a warped number. My best guess is that at the category level it's averaging over every individual time entry in the category, while what I need is the sum of the average times of the tasks that constitute the category.
Screenshots to illustrate:
This is the one that IS working. The machines filtered on the report worked about 10h on average over the filtered time span, and that time was, on average, distributed across these tasks. If I drill up, though:
Same machines, same time span, same graph. I want it to still show the same 10h of average total time (the data didn't change, so why do the numbers change?), just grouped so that SUCH AND SUCH tasks, which fall under, say, Operation Time, show how much they amount to together, so a manager can analyze and make better decisions on time allocation. Any thoughts?
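The mismatch described above can be reproduced outside Power BI. Here is a minimal Python sketch (with made-up task/time data, not the poster's actual numbers) contrasting the two aggregations: averaging every raw entry in a category, versus summing the per-task averages within it.

```python
from collections import defaultdict

# Hypothetical raw entries: (category, task, seconds). Invented for illustration.
entries = [
    ("Operation", "Forward", 5), ("Operation", "Forward", 5),
    ("Operation", "Lift",    20),
    ("Standby",   "Idle",    60), ("Standby",   "Idle",   30),
]

# Per-task average: what the drilled-down level of the visual shows.
by_task = defaultdict(list)
for cat, task, t in entries:
    by_task[(cat, task)].append(t)
task_avg = {k: sum(v) / len(v) for k, v in by_task.items()}

# What the drilled-up visual shows for a category: the average over
# every raw entry in that category (each occurrence weighted equally).
op_rows = [t for cat, _, t in entries if cat == "Operation"]
avg_of_rows = sum(op_rows) / len(op_rows)   # (5 + 5 + 20) / 3 = 10.0

# What the poster wants: the sum of the per-task averages in the category.
sum_of_task_avgs = sum(
    v for (cat, _), v in task_avg.items() if cat == "Operation"
)                                           # Forward 5.0 + Lift 20.0 = 25.0

print(avg_of_rows, sum_of_task_avgs)        # 10.0 25.0
```

Because the two aggregations weight the occurrences differently, the category-level number is not the sum of the task bars beneath it, which matches the behavior in the screenshots.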
Hi @rdjanousek
Based on your question, here's what I assume you might want to achieve:
Here's some dummy data
"Table"
It seems that you want the result shown when you drill back up to the previous level to be the sum of the average task times under each category. Please create a measure:
average time =
VAR total_category =
    CALCULATE (
        COUNTROWS ( 'Table' ),
        FILTER (
            ALL ( 'Table' ),
            'Table'[U.Category] = MAX ( 'Table'[U.Category] )
        )
    )
VAR avg_time =
    SELECTEDVALUE ( 'Table'[Time] )
VAR task_average =
    DIVIDE ( avg_time, total_category )
RETURN
    IF (
        ISINSCOPE ( 'Table'[Task] ),
        task_average,
        AVERAGE ( 'Table'[Time] )
    )
The ISINSCOPE function is used to determine whether the visual is drilled down to the Task level. Here is the result:
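For reference, the branch logic of the measure above can be mimicked in plain Python (a rough sketch with invented data; DAX evaluation contexts are only approximated here): at the task level it divides the selected time by the number of rows in the task's category, and at the category level it falls back to a plain average of the rows in scope.

```python
# Invented rows mirroring the dummy 'Table': (category, task, time).
rows = [
    ("A", "t1", 10), ("A", "t2", 20),
    ("B", "t3", 30), ("B", "t4", 40), ("B", "t5", 50),
]

def measure(category=None, task=None):
    """Rough stand-in for the DAX measure; the category and task
    arguments play the role of the current filter context."""
    in_task_scope = task is not None          # ~ ISINSCOPE('Table'[Task])
    scoped = [r for r in rows
              if (category is None or r[0] == category)
              and (task is None or r[1] == task)]
    if in_task_scope:
        # total_category: rows in the whole table sharing this category
        total_category = sum(1 for r in rows if r[0] == category)
        # SELECTEDVALUE only returns a value when exactly one row is selected
        avg_time = scoped[0][2] if len(scoped) == 1 else None
        return None if avg_time is None else avg_time / total_category
    # Drilled up: plain AVERAGE over the rows in scope
    return sum(r[2] for r in scoped) / len(scoped)

print(measure("B", "t3"))   # 30 / 3 rows in category B = 10.0
print(measure("B"))         # (30 + 40 + 50) / 3 = 40.0
```

Note that the two branches aggregate differently, which is why the follow-up reply below observes the totals shifting between levels.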
If you're still having problems, provide some dummy data and the desired outcome. It is best presented in the form of a table.
Regards,
Nono Chen
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Hello, thanks for the response.
The first problem I had with your suggestion is that when I type the first 'Table'[U.Category], the one that is supposed to be compared to the max of itself, the editor only suggests 'Table'[Time] as valid for that operation.
The second problem is in the result images of your dummy example: the first says the total average time is 14.1, while the second says 5.3... If this is the average of the time spent on every task, and the only thing that changes is the level we're displaying, how does the total change from one picture to the other? I believe that even if I managed to get your measure to work (which I couldn't, too dumb to understand it haha), we would run into the same issue I'm already having through other means... No?
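The totals can legitimately differ whenever the two levels weight the data differently. A small arithmetic sketch (invented numbers, not the thread's 14.1/5.3 figures) of how an average of per-task averages diverges from the average over raw entries when tasks have unequal row counts:

```python
# One task done many times briefly, another done once for a long time.
times = {"Forward": [2, 2, 2, 2], "Lift": [40]}

all_rows = [t for ts in times.values() for t in ts]
avg_of_rows = sum(all_rows) / len(all_rows)          # 48 / 5 = 9.6

task_avgs = [sum(ts) / len(ts) for ts in times.values()]
avg_of_task_avgs = sum(task_avgs) / len(task_avgs)   # (2 + 40) / 2 = 21.0

# Same underlying data, two different "averages": the first weights
# every entry equally, the second weights every task equally.
print(avg_of_rows, avg_of_task_avgs)                 # 9.6 21.0
```

So a measure that switches aggregation strategy between drill levels will, in general, show different totals at each level unless the data happens to be perfectly balanced.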