Hello everyone,
I need your help please.
I have a table with several IDs and for each ID, I have several dates and times and also values.
Normally I should have hourly values for each ID, but some hours are missing.
So, for those missing hours, I need to add the missing rows in Power Query, with the values taken from the last timestamp we had.
For example: if I have a row for 6/1/2024, 1:00 AM and the next row for that ID is 6/2/2024, 3:00 PM, I need to add a row for each missing hour in that interval, with the value taken from the 6/1/2024, 1:00 AM row.
thank you
Hi @GiaD30 ,
What happens to the values of the added rows? Also, please post workable sample data (not an image).
Hi @GiaD30
Try this in the query editor
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("fdA9CoAwDAXgq0jniknTH3DzAIK7OKizev9Ni5IKbbOGj/d4mWe1bnsLgKi08h12BozVDZoeoBnG54igFl1jrIygUhYJiljZgqK/mqJyBRU+ZVn5V13H2aK0UWCsXB2lqFBHJPblCylH+cD4rOUG", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [ID = _t, #"DateTime Value" = _t, Value = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"DateTime Value", type datetime}, {"Value", Int64.Type}}, "en-US"),
#"Grouped Rows" = Table.Group(#"Changed Type", {"ID"}, {{"MinDateTime", each List.Min([DateTime Value]), type nullable datetime}, {"MaxDateTime", each List.Max([DateTime Value]), type nullable datetime}}),
#"Added Custom" = Table.AddColumn(#"Grouped Rows", "DateTime Value", each List.DateTimes([MinDateTime], Number.Round( (Number.From([MaxDateTime])-Number.From([MinDateTime])) * 24, 0) + 1, #duration(0,1,0,0))),
#"Expanded DateTime Value" = Table.ExpandListColumn(#"Added Custom", "DateTime Value"),
#"Removed Columns" = Table.RemoveColumns(#"Expanded DateTime Value",{"MinDateTime", "MaxDateTime"}),
#"Changed Type1" = Table.TransformColumnTypes(#"Removed Columns",{{"DateTime Value", type datetime}}),
#"Combined Original and Expanded Dates" = Table.Combine({#"Changed Type", #"Changed Type1"}),
#"Sorted Rows" = Table.Sort(#"Combined Original and Expanded Dates",{{"ID", Order.Ascending}, {"DateTime Value", Order.Ascending}, {"Value", Order.Descending}}),
#"Applied Table.Buffer" = Table.Buffer(#"Sorted Rows"),
#"Removed Duplicates" = Table.Distinct(#"Applied Table.Buffer", {"ID","DateTime Value"})
in
#"Removed Duplicates"
@danextian I checked each step now and it works perfectly up to the Removed Columns step; until then I see all my missing rows, hour by hour, but after that I lose them.
@danextian Your code adds a duplicate of each line I have. For example, if I have 3 lines for the same ID, with 6/1/2024, 1:00 AM, 6/2/2024, 2:00 AM and 6/2/2024, 7:00 AM, the code just duplicates those 3 lines. I need it to return all the missing rows in between: 6/1/2024, 2:00 AM, 6/1/2024, 3:00 AM, 6/1/2024, 4:00 AM, etc.
I checked my code and it isn't supposed to return duplicates, because I removed them in the process. You can see below that after 3 AM, 4 AM comes next, and so on.
@danextian You were right, it worked 🙂 thank you very much. Now, the only thing I'd still need help with, please: instead of null in the values for the rows I add, I'd like to fill in the value from the latest row I have in my initial table.
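As a minimal sketch of why that works (made-up sample data, not your actual query): Table.Distinct keeps the first row it finds for each ID and DateTime Value combination, and Table.Buffer fixes the sorted order beforehand so the original row is always the one that survives.
let
    // Two rows for the same ID and hour: the original (with a value) sorted before the generated one (null)
    Sorted = Table.FromRecords({
        [ID = "A", #"DateTime Value" = #datetime(2024, 6, 1, 1, 0, 0), Value = 10],
        [ID = "A", #"DateTime Value" = #datetime(2024, 6, 1, 1, 0, 0), Value = null],
        [ID = "A", #"DateTime Value" = #datetime(2024, 6, 1, 2, 0, 0), Value = null]
    }),
    // Buffer so Table.Distinct sees the rows in exactly this order
    Buffered = Table.Buffer(Sorted),
    // Keeps the first row per ID + DateTime Value, i.e. the original 1:00 AM row with Value 10
    Deduplicated = Table.Distinct(Buffered, {"ID", "DateTime Value"})
in
    Deduplicated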
I did ask what to do with the values for the added rows. If you want to copy the value from the latest non-blank row before the added row, you can right-click the column and select Fill Down. It will fill every blank with the value of the immediately preceding non-blank row.
@danextian Thank you very much, it works perfectly :). But I realized I missed sharing one thing: my original table, before adding the missing rows, also has rows where the value is null (blank). Now, when I add my missing rows, the Values column also needs to take those nulls into consideration. For example, if I have 4 rows:
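If you prefer to do it in code rather than through the UI, the step that gets generated is just Table.FillDown. A minimal sketch on made-up data:
let
    Source = Table.FromRecords({
        [#"DateTime Value" = #datetime(2024, 6, 1, 1, 0, 0), Value = 10],
        [#"DateTime Value" = #datetime(2024, 6, 1, 2, 0, 0), Value = null],
        [#"DateTime Value" = #datetime(2024, 6, 1, 3, 0, 0), Value = null],
        [#"DateTime Value" = #datetime(2024, 6, 1, 4, 0, 0), Value = 20]
    }),
    // Each null takes the value of the closest non-null row above it: 2 AM and 3 AM become 10
    #"Filled Down" = Table.FillDown(Source, {"Value"})
in
    #"Filled Down"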
Date Values
6/1/2024, 1:00 AM 10
6/1/2024, 2:00 AM null
6/1/2024, 4:00 AM 20
6/1/2024, 6:00 AM 30
After adding the missing time rows, I need to see it like this:
Date Values
6/1/2024, 1:00 AM 10
6/1/2024, 2:00 AM null
6/1/2024, 3:00 AM null
6/1/2024, 4:00 AM 20
6/1/2024, 5:00 AM 20
6/1/2024, 6:00 AM 30
Basically, the values for the missing rows I add should be copied from the previous row in my original table, no matter whether it is null or has a value.
Is there a way of doing this?
Thank you again for all your help :)
Prior to doing anything else, I would convert the data type of that value column to text and then replace null with something like "this is null". After filling down, rows with "this is null" should still be preserved. I would just revert that back to null using the same replace method and change the data type back to number.
@danextian Thank you again. I tried it, but unfortunately, after all the steps, I still see null. Right at the beginning I changed the type from number to text and replaced null with something else. Then I have all the steps that add the missing rows, and I do get my missing rows, but I only see null, even for those rows where I had replaced null with something else.
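A minimal sketch of just that sequence on made-up sample data (not your actual columns), in case it helps to see it in isolation before the full query below:
let
    // Original rows only: 2 AM is a genuine null
    Original = Table.FromRecords({
        [#"DateTime Value" = #datetime(2024, 6, 1, 1, 0, 0), Value = "10"],
        [#"DateTime Value" = #datetime(2024, 6, 1, 2, 0, 0), Value = null],
        [#"DateTime Value" = #datetime(2024, 6, 1, 4, 0, 0), Value = "20"]
    }),
    // 1) Replace the genuine nulls with a placeholder before the extra rows are generated
    Placeholder = Table.ReplaceValue(Original, null, "this is null", Replacer.ReplaceValue, {"Value"}),
    // 2) Append the generated hourly row (3 AM), which arrives with a null Value
    WithAddedRow = Table.Combine({Placeholder, Table.FromRecords({[#"DateTime Value" = #datetime(2024, 6, 1, 3, 0, 0), Value = null]})}),
    Sorted = Table.Sort(WithAddedRow, {{"DateTime Value", Order.Ascending}}),
    // 3) Fill down: 3 AM now inherits "this is null" from 2 AM instead of staying blank
    FilledDown = Table.FillDown(Sorted, {"Value"}),
    // 4) Revert the placeholder to a real null and set the type back to number
    Reverted = Table.ReplaceValue(FilledDown, "this is null", null, Replacer.ReplaceValue, {"Value"}),
    Typed = Table.TransformColumnTypes(Reverted, {{"Value", Int64.Type}})
in
    Typed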
Filling down should come after replacing null with something else so the replacement is included in the filldown as well.
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("fdBNCoAgEAXgq4hrwxnHH2jXAYL24qLaWq26f0Vhgel2+HiPN97zcZobAEQuuJUoFSgtGKoWgHX9eUTgQZRYUuseY9m9aVRJo6T0j6KvGi5lfpR7lE7K3mpb1gZrKyssKVNGb5QrI6r25QspR/nA61nhAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [ID = _t, #"DateTime Value" = _t, Value = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"DateTime Value", type datetime}, {"Value", Int64.Type}}, "en-US"),
#"Replaced Value" = Table.ReplaceValue(#"Changed Type",null,"this is null",Replacer.ReplaceValue,{"Value"}),
#"Grouped Rows" = Table.Group(#"Replaced Value", {"ID"}, {{"MinDateTime", each List.Min([DateTime Value]), type nullable datetime}, {"MaxDateTime", each List.Max([DateTime Value]), type nullable datetime}}),
#"Added Custom" = Table.AddColumn(#"Grouped Rows", "DateTime Value", each List.DateTimes([MinDateTime], Number.Round( (Number.From([MaxDateTime])-Number.From([MinDateTime])) * 24, 0) + 1, #duration(0,1,0,0))),
#"Expanded DateTime Value" = Table.ExpandListColumn(#"Added Custom", "DateTime Value"),
#"Removed Columns" = Table.RemoveColumns(#"Expanded DateTime Value",{"MinDateTime", "MaxDateTime"}),
#"Changed Type1" = Table.TransformColumnTypes(#"Removed Columns",{{"DateTime Value", type datetime}}),
#"Combined Original and Expanded Dates" = Table.Combine({#"Replaced Value", #"Changed Type1"}),
#"Changed Type2" = Table.TransformColumnTypes(#"Combined Original and Expanded Dates",{{"Value", type text}}),
#"Sorted Rows" = Table.Sort(#"Changed Type2",{{"ID", Order.Ascending}, {"DateTime Value", Order.Ascending}, {"Value", Order.Descending}}),
#"Applied Table.Buffer" = Table.Buffer(#"Sorted Rows"),
#"Removed Duplicates" = Table.Distinct(#"Applied Table.Buffer", {"ID","DateTime Value"}),
#"Filled Down" = Table.FillDown(#"Removed Duplicates",{"Value"}),
#"Replaced Value1" = Table.ReplaceValue(#"Filled Down","this is null",null,Replacer.ReplaceValue,{"Value"}),
#"Changed Type3" = Table.TransformColumnTypes(#"Replaced Value1",{{"Value", Int64.Type}})
in
#"Changed Type3"
Glad it did. Could you please accept my post as the solution?
@danextian Thank you again, I will re-check.
One other note: in the values for the missing rows that I add, I don't want to see null; I need to see the value from the last timestamp. For example: if I have a row with 6/1/2024, 3:00 AM, value = 20, and the next row is 6/2/2024, 1:00 AM, value = 10, then when I add my missing rows between these two dates, their value should be 20, taken from the last row where I had a timestamp.
@danextian Thank you for your reply. The values will be used to calculate some other metrics; for me, the important thing right now is knowing how to add the missing rows.
I see I cannot attach an Excel file. I am new to this community. Should I create a table here? I see I have that option.
Thank you for your help 🙂