webbta
Regular Visitor

Requesting help to group events by time

I have about two years' worth of data with several thousand IDs. I filtered this example down to a single ID, but there are many thousands. I want to group the events in this table into two-hour intervals per ID.

I want to create a column that holds the earliest time within each two-hour window. The screenshot below shows what the column would look like if I had the logic right: until two hours have passed from the window's first time, subsequent times are grouped with that earliest time; once two hours have passed, the next event starts a new window, and the grouping also resets when the day changes.
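
For illustration only, with made-up times rather than my actual data: events at 08:00, 08:45, and 09:30 would all get 08:00 as the earliest time; an event at 10:15 is more than two hours after 08:00, so it starts a new window with 10:15 as its earliest time; and the first event of the next day starts another new window.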


I prefer to create this column in Power Query but I am open to using DAX as well. 

How do I do that? Thank you so much!!

webbta_0-1733262733086.png


12 REPLIES
dufoq3
Super User

Hi @webbta, check this:

 

Output

dufoq3_0-1733589058258.png

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("ddBLDsMgDATQq0SsI8U/HJhduq/UfZT7X6OkpUgNYf3wePC+B+aFeBESm2iFCKJP2zPMYQvH/M8JlMFpxA4zKE+ve15hjtixVM7nbqNreGVmUAR31X5caivUB+En5/JitJsRS/P05UfHXj59VhuyKshHnMDxZlobR4LUo/ZsMAXn0bR/wu3K1ljWkt/Cjzc=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Date = _t, ID = _t]),
    ChangedType = Table.TransformColumnTypes(Source,{{"Date", type datetime}, {"ID", type text}}, "en-US"),
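    // GroupKind.Local groups consecutive rows only; the comparer below starts a new group whenever the current row's Date is 2+ hours after the group's first Date, and Earliest Time is taken from the first Date of each group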
    GroupedRows = Table.Group(ChangedType, {"Date", "ID"}, {{"T", each Table.AddColumn(_, "Earliest Time", (x)=> Time.From([Date]{0}), type time), type table}}, GroupKind.Local,
        (x,y)=> Byte.From( (y[Date] - x[Date]) >= #duration(0,2,0,0) ) ),
    Combined = Table.Combine(GroupedRows[T])
in
    Combined
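
A quick note on how this works: GroupKind.Local makes Table.Group build groups from consecutive rows only, and the fifth argument is a comparer that receives the key record of the group's first row (x) and of the current row (y); a non-zero result starts a new group, so here a new window begins whenever the current Date is at least two hours after the window's first Date. A minimal sketch of the same mechanism on made-up numbers (not the sample data above):

let
    // Hypothetical values purely to illustrate GroupKind.Local with a comparer
    Source = Table.FromColumns({{1, 3, 8, 12, 14, 25}}, type table [Value = Int64.Type]),
    Grouped = Table.Group(
        Source,
        {"Value"},
        {{"Rows", each _, type table [Value = Int64.Type]}},
        GroupKind.Local,
        // x = key of the group's first row, y = key of the current row;
        // 1 starts a new group, 0 keeps the current row in the same group
        (x, y) => Number.From(y[Value] - x[Value] >= 10)
    )
in
    Grouped    // three groups: {1, 3, 8}, {12, 14}, {25}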

Note: Check this link to learn how to use my query.
Check this link if you don't know how to provide sample data.

Hello! Thank you for responding. I tried this method, but I was not able to get the rows to group properly because sometimes (using your example) the ID labeled B is mixed in with the ID labeled A. So some of B's rows fall before 2.11 and before 3.11, and the times do not show properly for the different IDs.

Hi @tw2024, that means your data are not sorted the way you showed them in your sample. Try this one and let me know:

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("ddE9DoMwDAXgq6DMSPgvJnkb3St1R9z/Gg0tpYIk8xc/O/a6BuaJeBISG2iGCKIPyzOMYQnbeOUEyuDUY4cZlIdXm2eYI1YsB+e9t9E9/GBmUARXo/24jK1Q74TvnMuLXm9GLJOnLz8q9vLpfbQuq4K8xwkcG9V6ciTIsdSaDabg3Kv2T7jd2U6WueT3whPEGvfW/84v997e", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Date = _t, ID = _t]),
    ChangedType = Table.TransformColumnTypes(Source,{{"Date", type datetime}, {"ID", type text}}, "en-US"),
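    // Group by ID first, then (inside each ID) sort by Date and apply the same local two-hour grouping before recombining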
    GroupedRows = Table.Group(ChangedType, {"ID"}, {{"All", each [
            // aDetail = GroupedRows{[ID="A"]}[All],
            aDetail = Table.Sort(_, {{"Date", Order.Ascending}}),
            aGroupedRows = Table.Group(aDetail, {"Date"}, {{"T", each Table.AddColumn(_, "Earliest Time", (x)=> Time.From([Date]{0}), type time), type table}}, 0,
                (x,y)=> Byte.From( (y[Date] - x[Date]) >= #duration(0,2,0,0) ) ),
            aCombinedT = Table.Combine(aGroupedRows[T])
          ][aCombinedT], type table}}),
    CombinedAll = Table.Combine(GroupedRows[All])

in
    CombinedAll
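
In short: grouping by ID first keeps each ID's rows together no matter how they are interleaved in the source, and the inner sort by Date makes sure the two-hour comparison walks each ID's timestamps in chronological order before the local groups are combined back together.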

Note: Check this link to learn how to use my query.
Check this link if you don't know how to provide sample data.

You were right - my data was structured in a way that did not show the full complexity of the model. Thank you so much for responding and updating the solution.

 

Your approach is streamlined and produces results quickly. Thank you again!

You're welcome. Enjoy 😃


Note: Check this link to learn how to use my query.
Check this link if you don't know how to provide sample data.

Omid_Motamedise
Super User

Hi @webbta 

This problem can be easily solved just by using the fifth argument of Table.Group.

 

Copy the code below and paste it into the Advanced Editor to see its magic.

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("ddBLDsMgDATQq0SsI8U/HJhduq/UfZT7X6OkpUgNYf3wePC+B+aFeBESm2iFCKJP2zPMYQvH/M8JlMFpxA4zKE+ve15hjtixVM7nbqNreGVmUAR31X5caivUB+En5/JitJsRS/P05UfHXj59VhuyKshHnMDxZlobR4LUo/ZsMAXn0bR/wu3K1ljWkt/Cjzc=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Date = _t, ID = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Date", type datetime}}),
    #"Grouped Rows" = Table.Group(#"Changed Type", "Date", {{"Count", each _, type table [Date=nullable text, ID=nullable text]}},0,(a,b)=>Number.From((b-a)>#duration(0,2,0,0)))
in
    #"Grouped Rows"

 

Capture.PNG
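
For reference, each row of #"Grouped Rows" represents one two-hour run: the Date key column holds the run's first timestamp, and the nested Count table holds the run's rows. A possible follow-up, sketched here under the assumption that the step names above are kept (and note it still does not separate the runs by ID), is to surface that first timestamp and expand the nested rows back out: add a comma after the #"Grouped Rows" step, then end the query with

    // sketch only: Earliest Time here is the first timestamp of each local run
    AddEarliest = Table.AddColumn(#"Grouped Rows", "Earliest Time", each Time.From([Date]), type time),
    RemovedKey = Table.RemoveColumns(AddEarliest, {"Date"}),
    Expanded = Table.ExpandTableColumn(RemovedKey, "Count", {"Date", "ID"}, {"Date", "ID"})
in
    Expanded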

This was the result for the above method - I wasn't able to see how to find the earliest time by ID and date.

webbta
Regular Visitor

I uploaded a pbix file with sample data to Google Drive.

https://drive.google.com/file/d/1VR0i82Be3bnqIohWvZzdYzQgDhHyAKJI/view?usp=sharing

webbta
Regular Visitor

Thank you so much for answering! And I think I was sloppy with my first example and I messed up a cell or two. 

An example would be a list of IDs with a timestamp. I want to create a calculated column that finds the earliest time in the previous two hours.

If there is no time in the previous two hours, then that time becomes the new earliest time.

The next photo with the Earliest Time column heading shows the expected output of the calculated column based on the logic I have in mind.

Input File.PNG
expected output.PNG



Hi @webbta ,

I created some sample data myself, including different IDs and different days:

vjunyantmsft_0-1733293353656.png

Then you can put the whole M code below into the Advanced Editor:

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("ddBLDsMgDATQq0SsI8U/HJhduq/UfZT7X6OkpUgNYf3wePC+B+aFeBESm2iFCKJP2zPMYQvH/M8JlMFpxA4zKE+ve15hjtixVM7nbqNreGVmUAR31X5caivUB+En5/JitJsRS/P05UfHXj59VhuyKshHnMDxZlobR4LUo/ZsMAXn0bR/wu3K1ljWkt/Cjzc=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Date = _t, ID = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Date", type datetime}, {"ID", type text}}),
    FillEarliestTime = (table as table) as table =>
        let
            ListOfRecords = Table.ToRecords(table),
            init = {List.First(ListOfRecords)[Date]},
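            // Within 120 minutes of the running anchor, reuse the anchor; otherwise the current row's Date becomes the new anchor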
            accFunc = (acc as list, curr as record) =>
                let
                    lastTime = List.Last(acc),
                    duration = Duration.TotalMinutes(curr[Date] - lastTime)
                in
                    if duration <= 120 then
                        acc & {lastTime}
                    else
                        acc & {curr[Date]},
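            // Fold over every row after the first; the result is one Earliest Time per row, in row order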
            ListOfEarliestTimes = List.Accumulate(List.Skip(ListOfRecords, 1), init, accFunc),
            FilledTable = Table.FromColumns(Table.ToColumns(table) & {ListOfEarliestTimes}, Table.ColumnNames(table) & {"Earliest Time"})
        in
            FilledTable,
    #"Grouped Table" = Table.Group(#"Changed Type", {"ID"}, {{"GroupedTable", each FillEarliestTime(_), type table [Date=nullable datetime, ID=nullable text, Earliest Time=nullable datetime]}}),
    #"Expanded GroupedTable" = Table.ExpandTableColumn(#"Grouped Table", "GroupedTable", {"Date", "Earliest Time"}, {"Date", "Earliest Time"})
in
    #"Expanded GroupedTable"

And the final output is as below:

vjunyantmsft_1-1733293452879.png


Best Regards,
Dino Tao
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Dino, thank you tremendously! Your solution worked for me and I was able to incorporate it into my pbix.

lbendlin
Super User

Please provide sample data that covers your issue or question completely, in a usable format (not as a screenshot).
Do not include sensitive information. Do not include anything that is unrelated to the issue or question.
Please show the expected outcome based on the sample data you provided.

Need help uploading data? https://community.fabric.microsoft.com/t5/Community-Blog/How-to-provide-sample-data-in-the-Power-BI-...
Want faster answers? https://community.fabric.microsoft.com/t5/Desktop/How-to-Get-Your-Question-Answered-Quickly/m-p/1447...
