RRaj_293
Helper III

DAX Help - Incorrect time difference between two values of the same datetime column

Hello,

 

I'm trying to find the time difference (single column - META_CREATE_DATE) between two process statuses for each 'GUID': 'New Email', which is the start of the process, and 'Complete', which is the end of the process.

 

RRaj_293_0-1725589633858.png

These are the measures I have created to calculate the 'processing time'.

New Email Time =
VAR newemailtime = MIN('Tbl1'[META_CREATE_DATE])
RETURN
CALCULATE(newemailtime,FILTER('Tbl1','Tbl1'[PROCESS_STATUS] = "NEW EMAIL"))
 
Complete Status Time =
VAR completetime = MAX('Tbl1'[META_CREATE_DATE])
RETURN
CALCULATE(completetime, FILTER('Tbl1','Tbl1'[PROCESS_STATUS] = "COMPLETE"))
 
Processing Time = CONCATENATE(MINUTE([Complete Status Time]-[New Email Time]) & " Min " , SECOND([Complete Status Time]-[New Email Time])& " Sec")
 
The issue is that for some of the GUIDs, both 'New Email Time' and 'Complete Status Time' pick up the same datetime value, which belongs to the 'New Email' status. For example, for the highlighted GUID the same value is picked for both status calculations, hence it shows 0 Min 0 Sec.
RRaj_293_1-1725590343620.png

 

How do I fix this? I'm not sure what's wrong with my DAX! 

 

Any advice appreciated.
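For context: a DAX variable is evaluated where it is defined, so in the measures above CALCULATE receives an already-computed value and the FILTER argument has no effect on it. Both measures therefore ignore the PROCESS_STATUS condition entirely, and what they return depends only on whatever filter context the visual happens to provide. A minimal sketch of one possible restatement, assuming the same table and column names as above (the "(v2)" suffixes are only illustrative, not the accepted approach below):

New Email Time (v2) =
// Earliest timestamp among the NEW EMAIL rows in the current filter context
CALCULATE (
    MIN ( 'Tbl1'[META_CREATE_DATE] ),
    'Tbl1'[PROCESS_STATUS] = "NEW EMAIL"
)

Complete Status Time (v2) =
// Latest timestamp among the COMPLETE rows in the current filter context
CALCULATE (
    MAX ( 'Tbl1'[META_CREATE_DATE] ),
    'Tbl1'[PROCESS_STATUS] = "COMPLETE"
)

Whether this is enough depends on what is on the visual; the accepted solution below pins the GUID/ID explicitly instead of relying on the visual's filter context.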

1 ACCEPTED SOLUTION
Anonymous
Not applicable

Hi  @RRaj_293 ,

 

Here are the steps you can follow:

1. Create the measures.

Group_Second =
// Latest NEW EMAIL timestamp for the ID in the current filter context
VAR _new =
    MAXX (
        FILTER ( ALL ( 'Table' ), 'Table'[ID] = MAX ( 'Table'[ID] ) && 'Table'[Process_Status] = "NEW EMAIL" ),
        [CREATE_DATETIME]
    )
// Latest COMPLETE timestamp for the ID in the current filter context
VAR _Com =
    MAXX (
        FILTER ( ALL ( 'Table' ), 'Table'[ID] = MAX ( 'Table'[ID] ) && 'Table'[Process_Status] = "COMPLETE" ),
        [CREATE_DATETIME]
    )
RETURN
    // Processing time for the current ID, in seconds
    DATEDIFF ( _new, _Com, SECOND )

CustomerIDtime =
// Per-ID duration formatted as hours:minutes:seconds
FORMAT ( TIME ( 0, 0, [Group_Second] ), "HH:mm:ss" )

EachCustomerIDtime =
// Average of the per-ID durations over the Customer/ID combinations in the current selection
VAR _table =
    SUMMARIZE ( ALLSELECTED ( 'Table' ), [Customer], [ID], "Group_S", [Group_Second] )
VAR _avg =
    AVERAGEX ( _table, [Group_S] )
RETURN
    FORMAT ( TIME ( 0, 0, _avg ), "HH:mm:ss" )

2. Result:

vyangliumsft_0-1726217070015.png

 

 

Best Regards,

Liu Yang

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.


7 REPLIES

RRaj_293
Helper III

Sample data below -

 

CREATE_DATETIME | ID | Customer | Process_Status | Process_sequence
2024-07-02 15:17:37.120000 Z00272fd1 NEW EMAIL10
2024-07-02 15:17:42.357000 Z00272fd1 PROCESSING20
2024-07-02 15:17:48.191000 Z00272fd1 WORKQUEUE25
2024-07-02 15:18:02.865000 Z00272fd1ABCSTART30
2024-07-02 15:18:23.588000 Z00272fd1ABCEND40
2024-07-02 15:19:19.690000 Z00272fd1ABCPROCESSING50
2024-07-02 15:22:26.991000 Z00272fd1 OUTPUT60
2024-07-02 15:22:35.944000 Z00272fd1 COMPLETE100
2024-08-13 13:01:38.375000 Z00d8d10c NEW EMAIL10
2024-08-13 13:01:42.422000 Z00d8d10c PROCESSING20
2024-08-13 13:01:47.782000 Z00d8d10c WORKQUEUE25
2024-08-13 13:02:03.955000 Z00d8d10cWX V5START30
2024-08-13 13:02:22.293000 Z00d8d10cWX V5END40
2024-08-13 13:02:35.924000 Z00d8d10cWX V5PROCESSING50
2024-08-13 13:02:48.025000 Z00d8d10cWX V5PROCESSING50
2024-08-13 13:03:05.791000 Z00d8d10c OUTPUT60
2024-08-13 13:03:13.863000 Z00d8d10c COMPLETE100
2024-08-15 10:06:35.888000 Z0111fa1b NEW EMAIL10
2024-08-15 10:06:38.952000 Z0111fa1b PROCESSING20
2024-08-15 10:06:44.455000 Z0111fa1b WORKQUEUE25
2024-08-15 10:07:04.127000 Z0111fa1bWX V5START30
2024-08-15 10:07:27.007000 Z0111fa1bWX V5END40
2024-08-15 10:08:05.357000 Z0111fa1bWX V5PROCESSING50
2024-08-15 10:10:00.557000 Z0111fa1bWX V5VALIDATING51
2024-08-15 10:10:09.781000 Z0111fa1b OUTPUT60
2024-08-15 10:10:20.409000 Z0111fa1b COMPLETE100
2024-08-30 12:05:31.434000 Z01aa69b1 NEW EMAIL10
2024-08-30 12:05:40.284000 Z01aa69b1 PROCESSING20
2024-08-30 12:05:47.470000 Z01aa69b1 WORKQUEUE25
2024-08-30 12:07:03.840000 Z01aa69b1WSSTART30
2024-08-30 12:07:28.000000 Z01aa69b1WSEND40
2024-08-30 12:07:45.144000 Z01aa69b1WSPROCESSING50
2024-08-30 12:07:58.780000 Z01aa69b1WSPROCESSING50
2024-08-30 12:09:54.803000 Z01aa69b1WSVALIDATING51
2024-08-30 12:10:06.052000 Z01aa69b1 OUTPUT60
2024-08-30 12:10:14.235000 Z01aa69b1 COMPLETE100
2024-08-23 10:06:57.087000 Z02aa89c1 NEW EMAIL10
2024-08-23 10:07:07.210000 Z02aa89c1 PROCESSING20
2024-08-23 10:07:13.948000 Z02aa89c1 WORKQUEUE25
2024-08-23 10:08:03.390000 Z02aa89c1WSSTART30
2024-08-23 10:08:25.926000 Z02aa89c1WSEND40
2024-08-23 10:08:45.646000 Z02aa89c1WSPROCESSING50
2024-08-23 10:11:18.253000 Z02aa89c1WSVALIDATING51
2024-08-23 10:11:27.212000 Z02aa89c1 OUTPUT60
2024-08-23 10:11:35.421000 Z02aa89c1 COMPLETE100
2024-09-06 09:05:40.921000 Z04aa64e2 NEW EMAIL10
2024-09-06 09:06:01.608000 Z04aa64e2 PROCESSING20
2024-09-06 09:06:17.001000 Z04aa64e2 WORKQUEUE25
2024-09-06 09:09:02.830000 Z04aa64e2WSSTART30
2024-09-06 09:09:26.110000 Z04aa64e2WSEND40
2024-09-06 09:09:42.314000 Z04aa64e2WSPROCESSING50
2024-09-06 09:11:12.188000 Z04aa64e2WSVALIDATING51
2024-09-06 09:11:23.437000 Z04aa64e2 OUTPUT60
2024-09-06 09:11:31.624000 Z04aa64e2 COMPLETE100
2024-09-09 15:24:56.542000 Z0275fde2 NEW EMAIL10
2024-09-09 15:25:00.398000 Z0275fde2 PROCESSING20
2024-09-09 15:25:05.911000 Z0275fde2 WORKQUEUE25
2024-09-09 15:26:03.916000 Z0275fde2ATOMSTART30
2024-09-09 15:26:25.727000 Z0275fde2ATOMEND40
2024-09-09 15:26:44.953000 Z0275fde2ATOMPROCESSING50
2024-09-09 15:27:02.100000 Z0275fde2ATOMVALIDATING51
2024-09-09 15:27:14.796000 Z0275fde2 OUTPUT60
2024-09-09 15:27:24.007000 Z0275fde2 COMPLETE100
2024-08-26 14:24:26.619000 Z2776702 NEW EMAIL10
2024-08-26 14:24:29.152000 Z2776702 TERMINATED1000
2024-09-09 08:38:26.777000 Z02905c89 NEW EMAIL10
2024-09-09 08:38:35.286000 Z02905c89 PROCESSING20
2024-09-09 08:38:41.509000 Z02905c89 WORKQUEUE25
2024-09-09 08:39:02.969000 Z02905c89WSSTART30
2024-09-09 08:39:24.109000 Z02905c89WSEND40
2024-09-09 08:39:56.505000 Z02905c89WSPROCESSING50
2024-09-09 08:41:43.734000 Z02905c89WSVALIDATING51
2024-09-09 08:41:54.286000 Z02905c89 OUTPUT60
2024-09-09 08:42:03.232000 Z02905c89 COMPLETE100
2024-09-03 14:41:09.604000 Z029a9332 NEW EMAIL10
2024-09-03 14:41:14.782000 Z029a9332 PROCESSING20
2024-09-03 14:41:19.817000 Z029a9332 WORKQUEUE25
2024-09-03 14:42:03.433000 Z029a9332ERSTART30
2024-09-03 14:42:24.733000 Z029a9332EREND40
2024-09-03 14:43:06.855000 Z029a9332ERPROCESSING50
2024-09-03 14:43:09.370000 Z029a9332ERVALIDATING51
2024-09-03 14:43:18.406000 Z029a9332 OUTPUT60
2024-09-03 14:43:26.459000 Z029a9332 COMPLETE100
2024-05-28 14:42:35.791000 Z02a8b18b NEW EMAIL10
2024-05-28 14:42:38.940000 Z02a8b18b PROCESSING20
2024-05-28 14:42:41.271000 Z02a8b18b WORKQUEUE25
2024-05-28 14:43:02.017000 Z02a8b18bABCSTART30
2024-05-28 14:56:02.302000 Z02a8b18bABCSTART30
2024-05-28 14:43:19.137000 Z02a8b18bABCEND40
2024-05-28 14:56:19.324000 Z02a8b18bABCEND40
2024-05-28 14:43:38.362000 Z02a8b18bABCPROCESSING50
2024-05-28 14:56:57.784000 Z02a8b18bABCPROCESSING50
2024-05-28 14:56:58.880000 Z02a8b18bABCPROCESSING50
2024-05-28 15:00:28.284000 Z02a8b18b OUTPUT60
2024-05-28 15:00:35.505000 Z02a8b18b COMPLETE100
2024-08-08 10:45:58.656000 Z02ca90b5 NEW EMAIL10
2024-08-08 10:46:02.896000 Z02ca90b5 PROCESSING20
2024-08-08 10:46:08.027000 Z02ca90b5 WORKQUEUE25
2024-08-08 10:47:02.464000 Z02ca90b5ERSTART30
2024-08-08 10:47:20.555000 Z02ca90b5EREND40
2024-08-08 10:47:34.638000 Z02ca90b5ERPROCESSING50
2024-08-08 10:47:35.185000 Z02ca90b5ERPROCESSING50
2024-08-08 10:47:55.097000 Z02ca90b5ERPROCESSING50
2024-08-08 10:48:08.301000 Z02ca90b5 OUTPUT60
2024-08-08 10:48:16.588000 Z02ca90b5 COMPLETE100
2024-08-28 11:01:33.710000 Z02d42fb9 NEW EMAIL10
2024-08-28 11:01:37.644000 Z02d42fb9 PROCESSING20
2024-08-28 11:01:44.407000 Z02d42fb9 WORKQUEUE25
2024-08-28 11:29:23.294000 Z02d42fb9 OUTPUT60
2024-08-28 11:29:31.142000 Z02d42fb9 COMPLETE100
2024-08-26 09:13:31.262000 Z02e0eefe NEW EMAIL10
2024-08-26 09:13:36.453000 Z02e0eefe PROCESSING20
2024-08-26 09:13:42.871000 Z02e0eefe WORKQUEUE25
2024-08-26 09:14:03.104000 Z02e0eefeERSTART30
2024-08-26 09:14:26.322000 Z02e0eefeEREND40
2024-08-26 09:14:55.388000 Z02e0eefeERPROCESSING50
2024-08-26 09:14:56.666000 Z02e0eefeERPROCESSING50
2024-08-26 09:16:45.300000 Z02e0eefeERPROCESSING50
2024-08-26 09:16:47.847000 Z02e0eefeERVALIDATING51
2024-08-26 09:16:56.725000 Z02e0eefe OUTPUT60
2024-08-26 09:17:04.625000 Z02e0eefe COMPLETE100
2024-09-02 16:45:58.453000 Z030931d0 NEW EMAIL10
2024-09-02 16:46:02.388000 Z030931d0 PROCESSING20
2024-09-02 16:46:09.009000 Z030931d0 WORKQUEUE25
2024-09-02 16:47:04.181000 Z030931d0ATOMSTART30
2024-09-02 16:47:27.308000 Z030931d0ATOMEND40
2024-09-02 16:47:41.229000 Z030931d0ATOMPROCESSING50
2024-09-02 16:48:01.264000 Z030931d0ATOMPROCESSING50
2024-09-02 16:52:42.832000 Z030931d0ATOMPROCESSING50
2024-09-02 16:53:04.234000 Z030931d0ATOMVALIDATING51
2024-09-02 16:53:14.260000 Z030931d0 OUTPUT60
2024-09-02 16:53:22.330000 Z030931d0 COMPLETE100
2024-09-05 11:25:50.555000 Z034515c7 NEW EMAIL10
2024-09-05 11:20:52.018000 Z034515c7 TERMINATED1000
2024-06-04 14:59:22.039000 Z035b8aa0 NEW EMAIL10
2024-06-04 14:59:31.163000 Z035b8aa0 PROCESSING20
2024-06-04 14:59:34.654000 Z035b8aa0 WORKQUEUE25
2024-06-04 15:00:06.279000 Z035b8aa0ABCSTART30
2024-06-11 08:56:45.184000 Z035b8aa0ABCSTART30
2024-06-04 15:00:23.739000 Z035b8aa0ABCEND40
2024-06-04 15:01:14.813000 Z035b8aa0ABCPROCESSING50
2024-06-04 15:01:16.248000 Z035b8aa0ABCPROCESSING50
2024-06-04 15:03:05.333000 Z035b8aa0 PROCESSING50
2024-06-04 15:03:06.662000 Z035b8aa0ABCPROCESSING50
2024-06-05 01:17:26.474000 Z035b8aa0 PROCESSING50
2024-06-05 01:17:28.463000 Z035b8aa0ABCPROCESSING50
2024-06-05 01:29:25.230000 Z035b8aa0ABCTERMINATED1000
2024-08-21 17:01:57.255000 Z03ae8a31 NEW EMAIL10
2024-08-21 17:02:01.941000 Z03ae8a31 PROCESSING20
2024-08-21 17:02:08.763000 Z03ae8a31 WORKQUEUE25
2024-08-21 17:03:03.558000 Z03ae8a31WX V5START30
2024-08-21 17:03:25.693000 Z03ae8a31WX V5END40
2024-08-21 17:03:37.821000 Z03ae8a31WX V5PROCESSING50
2024-08-21 17:03:38.863000 Z03ae8a31WX V5PROCESSING50
2024-08-21 17:03:51.419000 Z03ae8a31WX V5PROCESSING50
2024-08-21 17:03:54.878000 Z03ae8a31WX V5VALIDATING51
2024-08-21 17:04:03.155000 Z03ae8a31 OUTPUT60
2024-08-21 17:04:13.791000 Z03ae8a31 COMPLETE100
RRaj_293
Helper III

Thank you @Anonymous and @lbendlin. I think I will start fresh and give the full requirement from the start.

 

Basically, in the sample data below, I want to find the average time taken for processing at a customer level. A process is considered complete only when it starts with a 'New Email' status (Process seq = 10) and ends with a 'Complete' status (Process seq = 100). Anything else is an incomplete process, and its processing time can be treated as 0 for ease of calculation.

In the sample data below, the expected output for Customer 'WX V5', ID = '00d8d10c' is:

time taken for processing = 2024-08-13 13:03:13 (CREATE_DATETIME where Process_Status = COMPLETE) - 2024-08-13 13:01:38 (CREATE_DATETIME where Process_Status = NEW EMAIL) = 0:01:35 (hh:mm:ss). Similarly, for each ID of each customer we need to calculate the time taken for each successful cycle, and then find the average time per customer. One customer can process multiple IDs, so ID is the lowest grain.
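One rough way to express this rule in DAX, sketched only (it assumes the table name 'Table' and the column names from the sample data, matching the accepted solution's naming; the measure names are purely illustrative):

ID Processing Seconds =
// Timestamps of the NEW EMAIL (seq 10) and COMPLETE (seq 100) steps for the current ID
VAR _start =
    CALCULATE ( MIN ( 'Table'[CREATE_DATETIME] ), 'Table'[Process_Status] = "NEW EMAIL" )
VAR _end =
    CALCULATE ( MAX ( 'Table'[CREATE_DATETIME] ), 'Table'[Process_Status] = "COMPLETE" )
RETURN
    // Incomplete cycles (missing either step) count as 0, per the rule above
    IF ( ISBLANK ( _start ) || ISBLANK ( _end ), 0, DATEDIFF ( _start, _end, SECOND ) )

Customer Avg Processing Seconds =
// Average of the per-ID durations over the IDs of the current customer
AVERAGEX ( VALUES ( 'Table'[ID] ), [ID Processing Seconds] )

For the 'WX V5' / '00d8d10c' example this returns 95 seconds, i.e. 0:01:35; the per-customer result can then be formatted with the same TIME/FORMAT pattern used in the accepted solution.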

 

 

RRaj_293_1-1726037757697.png

 

Please advise. I hope the sample data gives better clarity.

 

Attaching the actual sample data in a separate post due to the character limit.

 

Anonymous
Not applicable

Thanks for the reply from lbendlin; please allow me to provide another insight:
Hi  @RRaj_293 ,

 

Here are the steps you can follow:

1. Create measure.

New Email Time =
// Earliest NEW EMAIL timestamp for the GUID in the current filter context
MINX (
    FILTER ( ALL ( 'Tbl1' ), 'Tbl1'[GUID] = MAX ( 'Tbl1'[GUID] ) && 'Tbl1'[PROCESS_STATUS] = "NEW EMAIL" ),
    [META_CREATE_DATE]
)
Complete Status Time =
// COMPLETE timestamp for the GUID in the current filter context
MINX (
    FILTER ( ALL ( 'Tbl1' ), 'Tbl1'[GUID] = MAX ( 'Tbl1'[GUID] ) && 'Tbl1'[PROCESS_STATUS] = "COMPLETE" ),
    [META_CREATE_DATE]
)
Processing Time =
// Difference between the two, shown as minutes and seconds
CONCATENATE ( MINUTE ( [Complete Status Time] - [New Email Time] ) & " Min ", SECOND ( [Complete Status Time] - [New Email Time] ) & " Sec" )

2. Result:

vyangliumsft_0-1725859566290.png

 

 

Best Regards,

Liu Yang

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Thanks @Anonymous, I did something very similar and it's now resolved.

 

However, moving on with that solution, I need to find the average processing time for each customer (total processing time / distinct count of GUIDs) and display the final average time in Hours:Min:Sec.

 

Here's what has been done so far -

For each customer's GUID that goes through the process sequence from 10 (Process status = New Email) to 100 (Process status = Complete), I have managed to find the time difference in Min:Sec format, and the ones which don't complete the cycle (i.e. Process status <> Complete as the last step) are marked as "Incomplete Process". The DAX for this is below -

 

Test_ProcessTime = IF(Min(Tbl1[PROCESS_SEQUENCE])= 10 && Max(Tbl1[PROCESS_SEQUENCE])=100 , CONCATENATE(MINUTE([Complete Status Time]-[New Email Time]) & " Min " , SECOND([Complete Status Time]-[New Email Time])& " Sec"), "Incomplete Process")

 

I now need to find the average time for each customer, ignoring the "Incomplete Process" ones. So I converted the time to seconds, and "Incomplete Process" is set to 0 for ease of calculation in the measure "Test_Totlprocessing TimeinSec".

 

Test_Totlprocessing TimeinSec = IF(Min(Tbl1[PROCESS_SEQUENCE])= 10 && Max(Tbl1[PROCESS_SEQUENCE])=100 , MINUTE([Complete Status Time]-[New Email Time])*60 + SECOND([Complete Status Time]-[New Email Time]),0)

 

Test_Avgprocessing Time = 
VAR cnt = DISTINCTCOUNT('Tbl1'[GUID])
RETURN
SUM(Test_Totlprocessing TimeinSec)/cnt ? 
 
Question - The average calculation is not working right. The seconds are not summing up correctly for each customer: the numerator SUM(Test_Totlprocessing TimeinSec) gives an incorrect result. The distinct count of GUIDs (the denominator) is correct, but the sum of the total processing time in seconds is not, and hence the average goes wrong. I'm not sure why it's incorrect. For customers which have an 'Incomplete Process', the sum shows as 0.

 

Please help! How do I find the correct average and display it in HH:MM:SS format? (One possible approach is sketched after the table below.)

 

CustomerName   GUID       Test_ProcessTime      Test_Totlprocessing TimeinSec
ABC            00272fd1   4 Min 58 Sec          298
ABC            02a8b18b   17 Min 59 Sec         1079
ABC            035b8aa0   Incomplete Process    0
ABC            075ddfa8   Incomplete Process    0
ABC            0bba233d   2 Min 33 Sec          153
XYZ            1307af71   3 Min 24 Sec          204
XYZ            148fd7bd   2 Min 8 Sec           128
XYZ            1655b305   5 Min 18 Sec          318
XYZ            1be32838   54 Min 13 Sec         3253
XYZ            1c3ab04e   5 Min 11 Sec          311
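For what it's worth, two details in the attempt above may explain the mismatch: a measure such as [Test_Totlprocessing TimeinSec] does not automatically add up across the GUIDs of a customer (and SUM only accepts a column, not a measure), so the numerator needs an iterator such as SUMX over the GUIDs; and for any cycle longer than an hour, MINUTE()*60 + SECOND() would drop the full hours. A rough sketch of one possible alternative, reusing the measures above (the new measure name is illustrative only):

Test_Avgprocessing Time (sketch) =
// Sum the per-GUID seconds explicitly, then divide by the number of GUIDs
VAR _totalSec =
    SUMX ( VALUES ( 'Tbl1'[GUID] ), [Test_Totlprocessing TimeinSec] )
VAR _cnt =
    DISTINCTCOUNT ( 'Tbl1'[GUID] )
VAR _avgSec =
    INT ( DIVIDE ( _totalSec, _cnt ) )
RETURN
    // Format as hours:minutes:seconds; computing the per-GUID seconds with
    // DATEDIFF(start, end, SECOND) instead of MINUTE()*60 + SECOND() would
    // also keep cycles longer than an hour intact
    FORMAT ( TIME ( 0, 0, _avgSec ), "hh:mm:ss" )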

Your sample data does not cover your scenario. None of the averages exceed 60 minutes.

 

lbendlin_0-1725884684121.png

 

lbendlin
Super User

Please provide sample data that covers your issue or question completely, in a usable format (not as a screenshot).

Do not include sensitive information or anything not related to the issue or question.

If you are unsure how to upload data please refer to https://community.fabric.microsoft.com/t5/Community-Blog/How-to-provide-sample-data-in-the-Power-BI-...

Please show the expected outcome based on the sample data you provided.

Want faster answers? https://community.fabric.microsoft.com/t5/Desktop/How-to-Get-Your-Question-Answered-Quickly/m-p/1447...
