Hi all, thanks a lot for your time. I have a scenario where I get data in a .csv file, and this file is loaded into Power BI.
I have only one table in Power BI (everything is in one table, meaning all my dimensions and all my metrics like time duration in mins, sum of customers impacted, etc.). The data is at the full detail level. For example:
A ServiceNow ticket can have multiple problems, and a problem can have multiple problem task numbers:
1 ServiceNow ticket - many problem numbers, or 1 problem number, or no problem number at all
1 problem number - many problem task numbers, or 1 problem task number, or no task number at all
I have data at the problem task number level, so the numeric columns/metrics repeat on each row.
What is the best approach to NOT add/sum those duplicates, so that I can show high-level views like:
Card - showing total impact duration - do not SUM because it will add all values - what is the best approach?
Bar chart per department - show all impact duration - again, don't sum the duplicates, just take the unique value.
Thanks a lot for your time.
| Date | Department | TicketNr | Problem_Num | Problem_Task_Num | Impact Start Date | Impact End Date | Duration In Mins | # of customers impacted |
|---|---|---|---|---|---|---|---|---|
| 03/12/21 | Marketing | IT101 | P101 | PTASK101 | 2021-03-10 13:47:00:000 | 2021-03-10 14:46:00:000 | 59 | 100 |
| 03/12/21 | Marketing | IT101 | P101 | PTASK101 | 2021-03-10 13:47:00:000 | 2021-03-10 14:46:00:000 | 59 | 100 |
| 03/15/21 | Marketing | IT102 | P102 | PTASK102 | 2021-03-12 10:15:00:000 | 2021-03-12 10:45:00:000 | 30 | 50 |
| 03/15/21 | Marketing | IT102 | P102 | PTASK102 | 2021-03-12 10:15:00:000 | 2021-03-12 10:45:00:000 | 30 | 50 |
| 03/15/21 | Sales | ITM201 | P201 | PTASK201 | 2021-03-13 11:10:00:000 | 2021-03-13 11:35:00:000 | 25 | 20 |
| 03/16/21 | Sales | ITM201 | P201 | PTASK201 | 2021-03-13 11:10:00:000 | 2021-03-13 11:35:00:000 | 25 | 20 |
| 03/16/21 | Sales | ITM301 | P301 | PTASK301 | 2021-03-14 12:15:00:000 | 2021-03-13 12:50:00:000 | 35 | 100 |
@Raj007
You can create a measure that sums only the distinct durations; see the example below:
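A minimal sketch of such a measure, assuming the single table is named 'Tickets' and uses the column names from the sample data above (the table name and measure name are placeholders):

```dax
Total Impact Duration =
SUMX (
    DISTINCT (
        SELECTCOLUMNS (
            'Tickets',
            "Task", 'Tickets'[Problem_Task_Num],
            "Duration", 'Tickets'[Duration In Mins]
        )
    ),
    [Duration]   -- each task's duration is counted once, even if its detail row repeats
)
```

Because the measure iterates distinct task/duration pairs rather than the raw rows, the repeated rows only contribute once, both on the card and on the department bar chart.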
Paul Zheng _ Community Support Team
If this post helps, please Accept it as the solution to help the other members find it more quickly.
Hi @V-pazhen-msft,
Thanks a lot, and sorry for the very late response. This is an awesome trick using DISTINCT and SUMX.
Really appreciate your time.
@Raj007 There are a couple of ways to approach this. One, use AVERAGE instead of SUM. Two, use DISTINCT('Table') and then do a calculation across it, like SUMX. I'm not clear on why the "duplicates" exist to begin with, and I'm sure I'm missing something important here, but this is my best guess with the information provided.
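A quick sketch of the second option, again assuming the table is called 'Tickets' (the name is a placeholder):

```dax
Impact Duration (distinct rows) =
SUMX (
    DISTINCT ( 'Tickets' ),          -- drops rows that are duplicated across every column
    'Tickets'[Duration In Mins]
)
```

Note that DISTINCT over the whole table only removes rows that are identical in every column; if two rows repeat the same duration but differ in, say, the Date column, the distinct-columns pattern in the accepted answer is the safer choice.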