Hello,
I am trying to group my data into a continuous series across two columns.
Emp ID | Day | Location | Logon Hour | Logoff Hour |
---|---|---|---|---|
1 | 4/25/24 | Line 1 | 8 | 9 |
1 | 4/25/24 | Line 1 | 9 | 10 |
1 | 4/25/24 | Line 1 | 11 | 14 |
2 | 4/25/24 | Line 2 | 8 | 9 |
2 | 4/25/24 | Line 1 | 9 | 10 |
I need this to turn into
Emp ID | Day | Location | Logon Hour | Logoff Hour |
---|---|---|---|---|
1 | 4/25/24 | Line 1 | 8 | 10 |
1 | 4/25/24 | Line 1 | 11 | 14 |
2 | 4/25/24 | Line 2 | 8 | 9 |
2 | 4/25/24 | Line 1 | 9 | 10 |
So it needs to be grouped by Emp ID, Day, and Location, which is easy enough with the Group By function. The hard part is getting the hours to merge when the logoff hour of one row matches the logon hour of the next row where the ID, day, and location also match. I tried playing around with the GroupKind.Local option but had a hard time getting it to work across two columns.
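For what it's worth, here is a rough sketch of one way that merge condition could be expressed without GroupKind.Local (the session-flag idea and all step names below are only an illustration, not a tested solution from this thread): flag each row that does not continue the previous one, turn the flags into a running session number, and group on that number.
let
    // Sample data from the table above
    Source = #table(
        type table [#"Emp ID" = Int64.Type, Day = date, Location = text, #"Logon Hour" = Int64.Type, #"Logoff Hour" = Int64.Type],
        {
            {1, #date(2024, 4, 25), "Line 1", 8, 9},
            {1, #date(2024, 4, 25), "Line 1", 9, 10},
            {1, #date(2024, 4, 25), "Line 1", 11, 14},
            {2, #date(2024, 4, 25), "Line 2", 8, 9},
            {2, #date(2024, 4, 25), "Line 1", 9, 10}
        }
    ),
    Sorted = Table.Sort(Source, {{"Emp ID", Order.Ascending}, {"Day", Order.Ascending}, {"Location", Order.Ascending}, {"Logon Hour", Order.Ascending}}),
    Indexed = Table.AddIndexColumn(Sorted, "Index", 0, 1),
    // 1 when a row starts a new session, 0 when its logon equals the previous row's logoff for the same Emp ID/Day/Location
    Flagged = Table.AddColumn(Indexed, "NewSession", each
        if [Index] = 0 then 1
        else
            let prev = Indexed{[Index] - 1}
            in if prev[#"Emp ID"] = [#"Emp ID"] and prev[Day] = [Day] and prev[Location] = [Location] and prev[#"Logoff Hour"] = [#"Logon Hour"] then 0 else 1),
    // Running total of the flags gives a session number that only changes at a gap or a new key
    Sessioned = Table.AddColumn(Flagged, "Session", each List.Sum(List.FirstN(Flagged[NewSession], [Index] + 1))),
    // Collapse each session to its earliest logon and latest logoff
    Merged = Table.Group(Sessioned, {"Emp ID", "Day", "Location", "Session"}, {{"Logon Hour", each List.Min([#"Logon Hour"])}, {"Logoff Hour", each List.Max([#"Logoff Hour"])}}),
    Result = Table.RemoveColumns(Merged, {"Session"})
in
    Result
The running total makes this O(n²), so it is only meant to illustrate the continuity check, not to scale to large tables.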
Hello,
Thanks for the response 🙂
This seems to work for the most part; however, it runs into issues when the user logs in more than once in the same hour. I believe this is because it's looking for distinct numbers, which is a problem when there are duplicates like this. I do have the full timestamp of the logon/logoff (hh:mm:ss AM/PM), so I've been trying to fit that in so the values wouldn't be duplicated, but I'm having a hard time counting the values as continuous in that case, since the logoff won't match the next logon exactly, if that makes sense.
Emp ID | Day | Location | Logon Hour | Logoff Hour | Logon Real Time | Logoff Real Time |
---|---|---|---|---|---|---|
1 | 4/25/24 | Line 1 | 9 | 9 | 9:05:13 AM | 9:28:39 AM |
1 | 4/25/24 | Line 1 | 9 | 10 | 9:45:11 AM | 10:55:23 AM |
1 | 4/25/24 | Line 1 | 11 | 14 | 11:33:11 AM | 2:43:21 PM |
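For reference, here is a rough sketch of how the real timestamps could be folded in, assuming a 15-minute tolerance (an arbitrary choice, not something from this thread) and same-day time values: sort each group by logon time, then chain a row onto the previous interval whenever the gap between the previous logoff and the next logon is within the tolerance.
let
    // Sample data shaped like the table above (same-day times only)
    Source = #table(
        type table [#"Emp ID" = Int64.Type, Day = date, Location = text, #"Logon Real Time" = time, #"Logoff Real Time" = time],
        {
            {1, #date(2024, 4, 25), "Line 1", #time(9, 5, 13), #time(9, 28, 39)},
            {1, #date(2024, 4, 25), "Line 1", #time(9, 45, 11), #time(10, 55, 23)},
            {1, #date(2024, 4, 25), "Line 1", #time(11, 33, 11), #time(14, 43, 21)}
        }
    ),
    // Assumed maximum gap that still counts as one continuous session
    Tolerance = #duration(0, 0, 15, 0),
    Grouped = Table.Group(Source, {"Emp ID", "Day", "Location"}, {{"Sessions", each
        let
            rows = Table.ToRecords(Table.Sort(_, {{"Logon Real Time", Order.Ascending}})),
            merged = List.Accumulate(rows, {}, (state, current) =>
                let
                    prev = if List.IsEmpty(state) then null else List.Last(state),
                    // Chain onto the previous interval when the gap is within the tolerance
                    extend = prev <> null and current[#"Logon Real Time"] - prev[#"Logoff Real Time"] <= Tolerance
                in
                    if extend
                    then List.RemoveLastN(state, 1) & {[#"Logon Real Time" = prev[#"Logon Real Time"], #"Logoff Real Time" = current[#"Logoff Real Time"]]}
                    else state & {[#"Logon Real Time" = current[#"Logon Real Time"], #"Logoff Real Time" = current[#"Logoff Real Time"]]})
        in
            Table.FromRecords(merged)}}),
    #"Expanded Sessions" = Table.ExpandTableColumn(Grouped, "Sessions", {"Logon Real Time", "Logoff Real Time"})
in
    #"Expanded Sessions"
The tolerance is the knob to tune: #duration(0, 0, 0, 0) merges only sessions that touch exactly, while a larger value also bridges short breaks.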
I thought of another solution. This should work for all cases.
let
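// Source rebuilds the sample table from a Base64-encoded, Deflate-compressed JSON string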
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlTSUTLRNzLVNzIBsnwy81IVQEIWQGypFKuDU4ElEBsa4FNhCCZMwEqMMJUYodiCRQGaLca4VVgSUoDfCKAskDDCq8QYQsTGAgA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Emp ID" = _t, Day = _t, Location = _t, #"Logon Hour" = _t, #"Logoff Hour" = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Emp ID", Int64.Type}, {"Day", type date}, {"Location", type text}, {"Logon Hour", Int64.Type}, {"Logoff Hour", Int64.Type}}),
#"Grouped Rows" = Table.Group(#"Changed Type", {"Emp ID", "Day", "Location"}, {{"GroupData", each Table.SelectColumns(Table.AddIndexColumn(_, "Index", 0, 1), {"Logon Hour", "Logoff Hour", "Index"})}}),
Custom1 = Table.TransformColumns(#"Grouped Rows", {{"GroupData", each Table.Group(
Table.FromRecords(
List.Accumulate(
Table.ToRows(_),
{},
(state, current) =>
let
CurrentIndex = current{2},
CurrentLogon = current{0},
CurrentLogoff = current{1},
PreviousLogon = if CurrentIndex = 0 then null else List.Last(state)[Logon],
PreviousLogoff = if CurrentIndex = 0 then null else List.Last(state)[Logoff],
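// Extend the previous interval when this row starts exactly where the last one ended; otherwise start a new one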
returnRecord = if CurrentIndex = 0 then [Logon=CurrentLogon, Logoff=CurrentLogoff] else if CurrentLogon = PreviousLogoff then [Logon=PreviousLogon, Logoff=CurrentLogoff] else [Logon=CurrentLogon, Logoff=CurrentLogoff]
in
state & {returnRecord}
)
), {"Logon"}, {{"Logoff", each List.Max([Logoff])}})
}}),
#"Expanded GroupData" = Table.ExpandTableColumn(Custom1, "GroupData", {"Logon", "Logoff"}, {"Logon", "Logoff"})
in
#"Expanded GroupData"
Best Regards,
Jing
If this post helps, please Accept it as Solution to help other members find it. Appreciate your Kudos!
This looks like it's working! Thank you for your help, Jing.
This is my solution. Hope it helps. You can create a blank query and paste the code below into its Advanced Editor to see how it works.
let
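// Source decodes and decompresses the embedded sample data into a table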
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlTSUTLRNzLVNzIBsnwy81IVQEIWQGypFKuDU4ElEBsa4FNhCCZMwEqMMJUYodiCRQGyLbEA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Emp ID" = _t, Day = _t, Location = _t, #"Logon Hour" = _t, #"Logoff Hour" = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Emp ID", Int64.Type}, {"Day", type date}, {"Location", type text}, {"Logon Hour", Int64.Type}, {"Logoff Hour", Int64.Type}}),
#"Grouped Rows" = Table.Group(#"Changed Type", {"Emp ID", "Day", "Location"}, {{"GroupData", each let allHours = List.Combine(Table.ToRows(Table.SelectColumns(_,{"Logon Hour", "Logoff Hour"}))), duplicateHours = Table.SelectRows(Table.FromRows(List.Transform(List.Distinct(allHours), (x)=> {x, List.Count(List.Select(allHours, each _ = x))}), {"value", "count"}), each [count]>1)[value] in Table.FromRows(List.Split(List.RemoveItems(allHours, duplicateHours), 2), {"Logon Hour","Logoff Hour"})}}),
#"Expanded GroupData" = Table.ExpandTableColumn(#"Grouped Rows", "GroupData", {"Logon Hour", "Logoff Hour"}, {"Logon Hour", "Logoff Hour"})
in
#"Expanded GroupData"
Final result
Best Regards,
Jing
If this post helps, please Accept it as Solution to help other members find it. Appreciate your Kudos!