joaocarlos95
Frequent Visitor

Subtract date/time in different rows with a condition

Hey guys,

 

I have a sheet of "process movements" (a free translation from Portuguese; I don't know the proper English term :-)).

So, each row of my sheet has:

num of the protocol | department (that's sending the document) | date/time | department (that's receiving the document) | date/time

 

Therefore, to find how much time each department spends on its work, I have to subtract the blue value (see image) from the red value (see image), but only when the yellow values (see image) are equal.

 

I found a solution with:

if [num_protocolo] = indexcreated[num_protocolo]{[Index]+1} then indexcreated[data_envio_origem]{[Index]+1} - [data_recebimento_destino] else 0

 

But now every time I apply my changes, it loads 20 GB of data. My file has 16k rows and is only 1.7 MB, so I don't know what's happening.

 

If you could suggest a measure instead of a new column, I would appreciate it!!!

 

Image: https://prnt.sc/kve0km

1 ACCEPTED SOLUTION

Hi @joaocarlos95,

Based on my test, you can follow the steps below:

Add a per-group index in the Query Editor:

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("hY9bCgMxCEW3EvI9EK+PTOJWhux/G00KTRnGUvDr6PHqdeXTulI+MhNaIRTSRN0FThJR0TyOwGoJ1Y2c66ZSwAnkWn3OhtZ7q87STXU2Fp27qP3MMrfukEfWsuq2cM/C8y+c6262dXdkoX37N5qo/dEQafyhY7wA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [Id = _t, #"Time(send)" = _t, #"Time(recive)" = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Id", Int64.Type}, {"Time(send)", type datetime}, {"Time(recive)", type datetime}}),
    #"Grouped Rows" = Table.Group(#"Changed Type", {"Id"}, {{"all data", each Table.AddIndexColumn(_, "Index",1,1), type table}}),
    #"Expanded all data" = Table.ExpandTableColumn(#"Grouped Rows", "all data", {"Time(send)", "Time(recive)", "Index"}, {"all data.Time(send)", "all data.Time(recive)", "all data.Index"}),
    #"Renamed Columns" = Table.RenameColumns(#"Expanded all data",{{"all data.Time(send)", "Time(send)"}, {"all data.Time(recive)", "Time(recive)"}, {"all data.Index", "Index"}})
in
    #"Renamed Columns"

Result:

1.PNG

Apply it and create two measures:

previous recive time per ID = CALCULATE(SELECTEDVALUE(Table1[Time(recive)]),FILTER(ALLEXCEPT(Table1,Table1[Id]),Table1[Index]=MAX(Table1[Index])-1))
time diff = DATEDIFF([previous recive time per ID],SELECTEDVALUE(Table1[Time(send)]),MINUTE)

Result:

 1.PNG

 

You could also download the pbix file to take a look.

 

Regards,

Daniel He

Community Support Team _ Daniel He
If this post helps, then please consider Accept it as the solution to help the other members find it more quickly.


6 REPLIES

Perhaps try it in DAX using EARLIER

 

See my article on Mean Time Before Failure (MTBF) which uses EARLIER: http://community.powerbi.com/t5/Community-Blog/Mean-Time-Between-Failure-MTBF-and-Power-BI/ba-p/3395...
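A sketch of what that EARLIER pattern could look like for this thread's data, written as a calculated column rather than a measure (EARLIER needs the row context that a calculated column provides). This assumes a per-protocol Index column like the one built in the query editor; the column name tempo_no_setor is just illustrative:

```
tempo_no_setor =
VAR PrevReceive =
    CALCULATE (
        MAX ( dProcessos[data_recebimento_destino] ),
        FILTER (
            dProcessos,
            dProcessos[num_protocolo] = EARLIER ( dProcessos[num_protocolo] )
                && dProcessos[Index] = EARLIER ( dProcessos[Index] ) - 1
        )
    )
RETURN
    IF ( ISBLANK ( PrevReceive ), 0, dProcessos[data_envio_origem] - PrevReceive )
```

Note this assigns each duration to the row where the document moves on; the original M formula put it on the row before, so shift the Index comparison by one if you want that behavior instead.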




Thank you for answering, Greg. I don't know if you're a bot :-).

 

I was trying to use EARLIER, but I'm having a difficult time working with row context. For example, I tried this:

 

Measure = SUMX(FILTER(dProcessos;dProcessos[ID]=EARLIER(dProcessos[ID]));dProcessos[data_envio_origem]-EARLIER(dProcessos[data_recebimento_destino]))

 

But I get an error on the second EARLIER saying that it references a row context that doesn't exist. I'm going to read your article right now to see if I can finally understand EARLIER.

 

BTW, since you're already here :-), do you know why my file grows to 20 GB when I use that M script? The script worked so beautifully that I wanted to use it, but it takes 4 hours to apply my changes.
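A common cause of that kind of blow-up, for what it's worth: each positional lookup like indexcreated[data_envio_origem]{[Index]+1} can force Power Query to re-evaluate the whole indexcreated step once per row. Buffering the table into memory once usually brings this back to seconds. A sketch, assuming a 0-based Index column and a previous step named indexcreated (the names Buffered, AddedDiff and tempo_setor are made up):

```
let
    // Materialize the table once so per-row lookups read from memory
    // instead of re-running all the upstream steps
    Buffered = Table.Buffer(indexcreated),
    AddedDiff = Table.AddColumn(
        Buffered,
        "tempo_setor",
        each if [Index] + 1 < Table.RowCount(Buffered)
                and [num_protocolo] = Buffered[num_protocolo]{[Index] + 1}
             then Buffered[data_envio_origem]{[Index] + 1} - [data_recebimento_destino]
             else 0
    )
in
    AddedDiff
```

The Table.RowCount guard also avoids an out-of-range error on the last row, which the original expression would hit.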

v-danhe-msft
Microsoft Employee
Microsoft Employee

Hi @joaocarlos95,

Based on my test, you can follow the steps below:

Sample data:

1.PNG

Create a measure:

Measure = var M=CALCULATE(MAX('Table1'[Time(send)]),FILTER(ALL('Table1'),'Table1'[Id]=MAX('Table1'[Id])))
          var N=CALCULATE(MIN('Table1'[Time(recive)]),FILTER(ALL('Table1'),'Table1'[Id]=MAX('Table1'[Id])))
return IF(M=MAX('Table1'[Time(send)]),DATEDIFF(M,N,DAY),0)

Result:

2.PNG

You could also download the pbix file to take a look.

 

Regards,

Daniel He

 

Community Support Team _ Daniel He
If this post helps, then please consider Accept it as the solution to help the other members find it more quickly.

Hey Daniel, thank you for answering. Unfortunately it didn't work in my case, because I need to subtract from the previous row, and there's another issue: in a lot of cases I have 30 rows with the same ID.

 

However, I really like your formula and I'm using it on another dataset :-).

