Hi there, I would like to create a column "Move time": check whether the Bench of a specific Excav changed, and if it did, calculate the move time as the current LoadStartDateTime minus the previous LoadEndDateTime. Please see the example below:
Thanks
Hi @Anonymous ,
Has your problem been resolved? If yes, could you please mark the helpful post as Answered? It will help others who run into the same problem find the solution easily. Thank you.
Otherwise, you can try to achieve it in the Power Query Editor with the code below and check whether you still get the memory error. Please see the attachment for details.
let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("ndJNCoMwEAXgq0jWQubXJtnpNcT7X6PTWmiZsXToRsgjH/NG3fdCQFixMk7EQ3kAlNmFeobrutlTCB+HDagc84tTxQkGLh/4jAhylDjSJUl7oIw5yhKp2xX6k2qgLdL+NxXKFRb8Sb9Nlbir3HJTNU5VyVBznto5Vdju+cIWpT6O3fM/okUtVxgpUOQMbZf0+g1zmBoLe/re9bgD", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [LoadStartDateTime = _t, #"LoadEndDateTime " = _t, ExcavType = _t, Excav = _t, Bench = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source, {{"LoadStartDateTime", type datetime}, {"LoadEndDateTime ", type datetime}, {"ExcavType", type text}, {"Excav", Int64.Type}, {"Bench", type text}}),
    // The source column name carries a trailing space; rename it away.
    #"Renamed Columns" = Table.RenameColumns(#"Changed Type", {{"LoadEndDateTime ", "LoadEndDateTime"}}),
    // Group by Excav: keep the alphabetically smallest bench per excavator as
    // its original bench, and give each excavator's rows a 0-based index.
    #"Grouped Rows" = Table.Group(#"Renamed Columns", {"Excav"}, {{"Mindate", each List.Min([LoadStartDateTime]), type nullable datetime}, {"OriBench", each List.Min([Bench]), type nullable text}, {"Index", each Table.AddIndexColumn(_, "Index", 0, 1), type table}}),
    #"Expanded Index" = Table.ExpandTableColumn(#"Grouped Rows", "Index", {"LoadStartDateTime", "LoadEndDateTime", "ExcavType", "Bench", "Index"}, {"LoadStartDateTime", "LoadEndDateTime", "ExcavType", "Bench", "Index"}),
    // 0 while the excavator is still on its original bench; otherwise the
    // current LoadStartDateTime minus the previous load's LoadEndDateTime
    // for the same Excav (null for an excavator's first row).
    #"Added Custom" = Table.AddColumn(#"Expanded Index", "Move time", each
        if [OriBench] = [Bench] then 0
        else [LoadStartDateTime] - (
            if [Index] = 0 then null
            else let Excav_ = [Excav] in Table.SelectRows(#"Expanded Index", each [Excav] = Excav_)[LoadEndDateTime]{[Index] - 1})),
    #"Removed Columns" = Table.RemoveColumns(#"Added Custom", {"Mindate", "OriBench", "Index"})
in
    #"Removed Columns"
Best Regards
Hi @Anonymous
Do you mean a DAX calculated column?
Move time =
VAR CurSDT = Table1[LoadStartDateTime]
VAR CurEDT = Table1[LoadEndDateTime]
VAR CurExcav = Table1[Excav]
VAR CurBench = Table1[Bench]
// All loads by the same excavator that ended before the current one
VAR T1 = FILTER ( Table1, Table1[Excav] = CurExcav && Table1[LoadEndDateTime] < CurEDT )
VAR PreEDT = MAXX ( T1, Table1[LoadEndDateTime] )
// Bench of the most recent earlier load; defaults to the current bench on the first row
VAR PreBench = COALESCE ( MAXX ( FILTER ( T1, Table1[LoadEndDateTime] = PreEDT ), Table1[Bench] ), CurBench )
RETURN
    IF ( CurBench = PreBench, 0, CurSDT - PreEDT )
I am getting a memory error.
Hi @Anonymous
Try using it as a measure, then have the column Move time call this measure. Would you consider using M as well?
moveMeasure =
VAR CurSDT = MAX ( Table1[LoadStartDateTime] )
VAR CurEDT = MAX ( Table1[LoadEndDateTime] )
VAR CurExcav = MAX ( Table1[Excav] )
VAR CurBench = MAX ( Table1[Bench] )
// All loads by the same excavator that ended before the current one
VAR T1 = FILTER ( ALL ( Table1 ), Table1[Excav] = CurExcav && Table1[LoadEndDateTime] < CurEDT )
VAR PreEDT = MAXX ( T1, Table1[LoadEndDateTime] )
// Bench of the most recent earlier load; defaults to the current bench when there is none
VAR PreBench = COALESCE ( MAXX ( FILTER ( T1, Table1[LoadEndDateTime] = PreEDT ), Table1[Bench] ), CurBench )
RETURN
    IF ( CurBench = PreBench, 0, CurSDT - PreEDT )
When I create the column "Move Time_c" by calling the measure "Move time_m", I get the same error. When I use the measure in a report table with filtered records (fewer rows), it works, so it seems the formula does not scale to large data sets. If there's an M approach you can assist with, I can also try that.
Hi
I also tried to get the previous Bench in M but I'm getting a cyclic reference error.
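A cyclic reference in M usually means a custom column is referring back to the very step that is being defined. One way around it, sketched below under assumptions (the step name #"Renamed Columns" and the column names are taken from the sample above; Sorted, Idx0, Idx1, Prev and so on are placeholders), is to sort the rows, add two index columns offset by one, and self-merge so each row picks up its predecessor for the same Excav:

    let
        // Order rows so that, within each Excav, loads are chronological.
        Sorted = Table.Sort(#"Renamed Columns", {{"Excav", Order.Ascending}, {"LoadStartDateTime", Order.Ascending}}),
        // Two indexes offset by one: joining Idx0 to Idx1 pairs each row
        // with the row directly above it.
        AddIdx0 = Table.AddIndexColumn(Sorted, "Idx0", 0, 1),
        AddIdx1 = Table.AddIndexColumn(AddIdx0, "Idx1", 1, 1),
        // Including Excav in the join keys keeps the lookup from crossing
        // from one excavator into another.
        Merged = Table.NestedJoin(AddIdx1, {"Idx0", "Excav"}, AddIdx1, {"Idx1", "Excav"}, "Prev", JoinKind.LeftOuter),
        Expanded = Table.ExpandTableColumn(Merged, "Prev", {"LoadEndDateTime", "Bench"}, {"PrevEndDateTime", "PrevBench"}),
        // No previous row (PrevBench = null) or an unchanged bench means no move.
        AddMove = Table.AddColumn(Expanded, "Move time", each
            if [PrevBench] = null or [Bench] = [PrevBench] then #duration(0, 0, 0, 0)
            else [LoadStartDateTime] - [PrevEndDateTime], type duration),
        Removed = Table.RemoveColumns(AddMove, {"Idx0", "Idx1", "PrevEndDateTime", "PrevBench"})
    in
        Removed

The key point is that the previous row is fetched through a merge on a shifted index instead of a reference to the column being built, which is what triggers the cyclic reference error.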