Hello everyone,
I'm asking for your help with grouping data on a time basis. I would like to use Power Query to group data into 10-minute time frames. As you can see in the image below, I have about ten values recorded per minute. I would like to group these values into 10-minute buckets and calculate the average for each.
Here is an example of the desired result in Excel.
I know the "Groups" feature of Power BI, but I would first like to reduce my file with Power Query before manipulating it in Power BI.
Thank you in advance for your help,
Have a nice day,
Joël
Hi @Anonymous
If I understand correctly, you have too many rows and need to "compress" them. Here is one way, assuming you have entries for every 10-minute interval. Paste this into the Advanced Editor:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("ldI/D4IwEIfhr2I6ixztlX8rg3E1ukiIYdGYOPr9Y5FLwXDl6Eqe0ML7a1v1aF63vnmfPle1V2BSyFMNOttBVmfgHhVadftVpkd2yJZwIAnkCZgfrGogy7w0aE2ExQhrI2weYQvphxmCpQSRYCWdjgWdLv6rSXK9/o+3BMVYtvIvFS6gHfRWiOUseivEcnb6sDFW/4T8cj+ei9CwDZVagZpguYSB/IZibbG4XVKsbVZHWBNhmftyw0aKtQKRIFMqMFfkYgUt04ubNsqx/LSZbwoM20qxZsO2UqzZsK0Uy1l/2yFW9wU=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [item = _t, dt = _t, value = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"item", type text}, {"dt", type datetime}, {"value", type number}}),
#"Sorted Rows" = Table.Sort(#"Changed Type",{ {"item", Order.Ascending},{"dt", Order.Ascending}}),
#"Added Custom" = Table.AddColumn(#"Sorted Rows", "group", each Number.From( DateTime.Date([dt]))*1000 + Time.Hour([dt])*10 + Number.RoundDown( Time.Minute([dt])/10)), // *1000 keeps the date part from colliding with the time part (hour*10 + slot can reach 235)
#"Grouped Rows" = Table.Group(#"Added Custom", {"item", "group"}, {{"minDT", each List.Min([dt]), type nullable datetime}, {"avgValue", each List.Average([value]), type nullable number}}),
#"Removed Columns" = Table.RemoveColumns(#"Grouped Rows",{"group"})
in
#"Removed Columns"
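A note on the group key above: it packs date, hour, and 10-minute slot into a single number. An equivalent and arguably simpler approach is to truncate the datetime itself; this is just a sketch (the name bucket is illustrative), where 144 is the number of 10-minute intervals in a day:

```
// Truncate a datetime to the start of its 10-minute interval.
// Number.From(dt) yields the serial day count, so multiplying by 144,
// flooring, and dividing back snaps the value to a 10-minute boundary.
bucket = (dt as datetime) as datetime =>
    DateTime.From(Number.RoundDown(Number.From(dt) * 144) / 144)
// e.g. bucket(#datetime(2021, 3, 1, 8, 47, 12)) = #datetime(2021, 3, 1, 8, 40, 0)
```

Used as the "group" column (each bucket([dt])), this avoids the arithmetic encoding entirely.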
Hi @Anonymous ,
Just add the corresponding aggregation columns when grouping the table:
Best Regards,
Community Support Team _ Yingjie Li
If this post helps, then please consider Accept it as the solution to help the other members find it more quickly.
Hi @Anonymous ,
To group timestamps into 10-minute intervals and get the average value for each, you can try this query:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("jZZLbsMwDESvUngdIKY+ltVdeo0g979GU4Q0oEZvHCCrPAxNkRxR9/tyWy5LWtN6tWv7WtP3uj5/f/9ty+MicJbYANsLl11iM63uc5xeOBeJCwQPdZrj7OcGteMMBwsM6qIP5tiaxBlqXj04NNSxQdUCg3rz4JDapjPfdGrND1YlNuiY47rO8a4H2XGBsjjOELy7GoJ37ZIuZ83cY3le1APPhykweMxMZh4YMjdtYHOPpXlRA4OJDjwfB8s68zAwBD9RF515YGhJ0VXTFgwMl2bgNB9FCwvCOGy6YyfYPZagJU1nrh1q4TGo2v6RmlILh4K6629rA6ew4PxeS9rAgf1gP7Bi/WZCXCU2UNsw54RP1L4s3vC4YhE3iQuoR38jzhpDzcPAUNSwf5eYGlp1Q8PfcO6qMx8NjBjKsg3vljfctLrpcQgM/Y4NDEUdNzCqIXj/CNOkdtlvG5/QhCH4mdrkuW18YaN6PmuxYg1SS3JSjwUNmY9PaMJFY5iWfyuWMFgwMMz5saChLFUXVfs7ViwNk3ZoYAquLXjg+b0WOENqu27o+MImTKmFBSG1D/E8tTQ+ggnDwQ78/PbjFw==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Type = _t, dt = _t, value = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Type", type text}, {"dt", type datetime}, {"value", Int64.Type}}),
#"Added Custom" = Table.AddColumn(#"Changed Type", "Custom", each Date.From([dt]) & Time.From(Number.RoundDown(144 * Number.From(Time.From([dt]))) / 144)), // 144 = 10-minute intervals per day
#"Grouped Rows" = Table.Group(#"Added Custom", {"Type", "Custom"}, {{"avg", each List.Average([value]), type nullable number}}),
#"Renamed Columns" = Table.RenameColumns(#"Grouped Rows",{{"Custom", "dt"}}),
#"Changed Type1" = Table.TransformColumnTypes(#"Renamed Columns",{{"Type", type text}, {"dt", type datetime}, {"avg", type number}})
in
#"Changed Type1"
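The same pattern can be wrapped as a small reusable function; the names below (Group10Min, the "bucket" column, the "avg" output) are illustrative and not part of the original query:

```
// Illustrative helper: group any table by a key column plus 10-minute
// buckets of a datetime column, averaging a value column.
Group10Min = (tbl as table, keyCol as text, dtCol as text, valCol as text) as table =>
    let
        withBucket = Table.AddColumn(tbl, "bucket", each
            DateTime.From(Number.RoundDown(Number.From(Record.Field(_, dtCol)) * 144) / 144),
            type datetime),
        grouped = Table.Group(withBucket, {keyCol, "bucket"},
            {{"avg", each List.Average(Table.Column(_, valCol)), type nullable number}})
    in
        grouped
```

For example, Group10Min(Source, "item", "dt", "value") reproduces the grouping shown in the queries above.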
Hello @Anonymous and @v-yingjl, thank you very much for your help!
Do you think it is possible to do the same thing while displaying three value columns:
- The average
- The min
- The max
Thanks in advance,
Joël
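For reference, the accepted query above can return all three aggregates from the same Table.Group call. This sketch reuses the column names from the first solution; the output names (avgValue, minValue, maxValue) are illustrative:

```
#"Grouped Rows" = Table.Group(#"Added Custom", {"item", "group"}, {
    {"minDT", each List.Min([dt]), type nullable datetime},
    {"avgValue", each List.Average([value]), type nullable number},
    {"minValue", each List.Min([value]), type nullable number},
    {"maxValue", each List.Max([value]), type nullable number}})
```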
Perfect, this is exactly what I wanted to do! Thank you so much for your generosity @Anonymous and @v-yingjl !
Have a nice day,
Sincerely