mrbartuss
Helper I

Add integer hours to date in Power Query or with DAX

My data looks like this:

Name    Date        Task    Hours
John    18/07/2024  BBB     3
John    18/07/2024  AAA     5
John    19/07/2024  CCC     8
John    20/07/2024  BBB     3
John    20/07/2024  CCC     3
John    20/07/2024  AAA     2
Karin   18/07/2024  CCC     8
Karin   19/07/2024  AAA     4
Karin   19/07/2024  BBB     4


I want to calculate StartDateTime and EndDateTime for each Name-Date-Task combination. It would be easy if there were only one row per date, but there can be several. Assumption: the first task of each date always starts at 9 AM, and each subsequent task starts when the previous one ends.

Here is what I want to achieve:

Name    Date        Task    Start DateTime        End DateTime
John    18/07/2024  BBB     18/07/2024 9:00 AM    18/07/2024 12:00 PM
John    18/07/2024  AAA     18/07/2024 12:00 PM   18/07/2024 5:00 PM
John    19/07/2024  CCC     19/07/2024 9:00 AM    19/07/2024 5:00 PM
John    20/07/2024  BBB     20/07/2024 9:00 AM    20/07/2024 12:00 PM
John    20/07/2024  CCC     20/07/2024 12:00 PM   20/07/2024 3:00 PM
John    20/07/2024  AAA     20/07/2024 3:00 PM    20/07/2024 5:00 PM
Karin   18/07/2024  CCC     18/07/2024 9:00 AM    18/07/2024 5:00 PM
Karin   19/07/2024  AAA     19/07/2024 9:00 AM    19/07/2024 1:00 PM
Karin   19/07/2024  BBB     19/07/2024 1:00 PM    19/07/2024 5:00 PM


I think it would be best to do this in Power Query, since calculated columns in DAX are generally not recommended.

4 ACCEPTED SOLUTIONS
Ahmedx
Super User

Based on your description, I created data to reproduce your scenario. The pbix file is attached at the end.

let
    // sample data embedded as compressed JSON (the pattern "Enter Data" generates)
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45W8srPyFPSUTK00DMw1zMyMDIBcpycnICksVKsDnZ5R0dHIGmKIm+JJO/s7AwkLZDljQzwm48iD9GPWx5ivxFY3juxKBPDgcgOgCuwxDDBBLcCiBOBCmIB", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Name = _t, Date = _t, Task = _t, Hours = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Name", type text}, {"Date", type date}, {"Task", type text}, {"Hours", Int64.Type}}),
    // For each Name-Date group, walk the rows with List.Generate, carrying a
    // running end time that starts at 9 AM plus the first task's hours
    #"Grouped Rows" = Table.Group(#"Changed Type", {"Name", "Date"}, {{"Count",
        (x) => [
            a = Table.ToRecords(x),
            b = List.Generate(
                // seed: the first row ends at 9 AM plus its own hours
                () => [i = 0, result = a{i}[Date] & #time(9 + a{i}[Hours], 0, 0)],
                // continue while rows remain in the group
                (x) => x[i] < List.Count(a),
                // each next row ends at the previous end plus its own hours
                (x) => [i = x[i] + 1, result = x[result] + #duration(0, a{i}[Hours], 0, 0)],
                // emit the row record with its End DateTime attached
                (x) => Record.AddField(a{x[i]}, "End DateTime", x[result])
            ),
            // Start DateTime is simply End DateTime minus the row's own hours
            c = List.Transform(b, (x) => Record.AddField(x, "Start DateTime", x[End DateTime] - #duration(0, x[Hours], 0, 0))),
            d = Table.FromRecords(c)
        ][d]}}),
    #"Removed Other Columns" = Table.Combine(Table.SelectColumns(#"Grouped Rows", {"Count"})[Count]),
    #"Changed Type1" = Table.TransformColumnTypes(#"Removed Other Columns", {{"End DateTime", type datetime}, {"Start DateTime", type datetime}}),
    #"Reordered Columns" = Table.ReorderColumns(#"Changed Type1", {"Name", "Date", "Task", "Hours", "Start DateTime", "End DateTime"}),
    #"Removed Columns" = Table.RemoveColumns(#"Reordered Columns", {"Date"})
in
    #"Removed Columns"


dufoq3
Super User

Hi @mrbartuss, another solution (List.Accumulate):

Result: (screenshot of the resulting table omitted)

let
    // Per-group helper: returns the group's rows with Start/End DateTime columns
    fnDateTimes = 
        (myTable as table)=>
        [
            // _Detail = GroupedRows{2}[All], // debugging shortcut for testing one group
            _Detail = myTable,
            _BufferedHours = List.Buffer(_Detail[Hours]),
            // Seed: the first task runs from 9 AM for Hours{0}; each step appends a
            // {start, end} pair whose start is the previous pair's end
            _AccumulatedDateTimes = List.Accumulate(
                {1..List.Count(_BufferedHours)-1},
                {{ _Detail{0}[Date] & #time(9,0,0), _Detail{0}[Date] & #time(9,0,0) + #duration(0,_BufferedHours{0},0,0) }},
                (s,c)=> s & {{ List.Last(s){1}, List.Last(s){1} + #duration(0,_BufferedHours{c},0,0) }}
            ),
            _RemovedColumns = Table.RemoveColumns(_Detail, {"Hours"}),
            // Reassemble: the original columns plus the two new, typed datetime columns
            _ToTable = Table.FromColumns(
                Table.ToColumns(_RemovedColumns) & Table.ToColumns(Table.FromRows(_AccumulatedDateTimes)),
                Value.Type(_RemovedColumns & #table(type table[Start DateTime=datetime, End DateTime=datetime], {}))
            )
        ][_ToTable],

    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45W8srPyFPSUTK00Dcw1zcyMDIBcpycnICksVKsDnZ5R0dHIGmKIm+JJO/s7AwkLZDljQzwm48iD9GPWx5ivxFY3juxKBPDgcgOgCuwxDDBBLcCiBOBCmIB", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Name = _t, Date = _t, Task = _t, Hours = _t]),
    // the "sk-SK" locale parses the dd/MM/yyyy dates correctly
    ChangedType = Table.TransformColumnTypes(Source,{{"Hours", type number}, {"Date", type date}}, "sk-SK"),
    GroupedRows = Table.Group(ChangedType, {"Name", "Date"}, {{"fn", fnDateTimes, type table}}),
    Combined = Table.Combine(GroupedRows[fn])
in
    Combined
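
The core of fnDateTimes, reduced to a standalone sketch with made-up values (one group, tasks of 3 and 5 hours): the seed is the first task's {start, end} pair, and every step appends a pair whose start is the previous pair's end.

let
    hours = {3, 5},
    firstStart = #date(2024, 7, 18) & #time(9, 0, 0),
    pairs = List.Accumulate(
        {1 .. List.Count(hours) - 1},
        // seed: the first task runs from 9:00 for hours{0}
        {{ firstStart, firstStart + #duration(0, hours{0}, 0, 0) }},
        // step: start at the previous pair's end, run for hours{c}
        (s, c) => s & {{ List.Last(s){1}, List.Last(s){1} + #duration(0, hours{c}, 0, 0) }}
    )
in
    pairs // {{18/07 09:00, 18/07 12:00}, {18/07 12:00, 18/07 17:00}}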



mrbartuss
Helper I

@Ahmedx @dufoq3 Thank you both very much!
Below is my solution. How would you rate all three solutions based on performance?

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45W8srPyFPSUTK00DMw1zMyMDIBcpycnICksVKsDnZ5R0dHIGmKIm+JJO/s7AwkLZDljQzwm48iD9GPWx5ivxFY3juxKBPDgcgOgCuwxDDBBLcCiBOBCmIB", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Name = _t, Date = _t, Task = _t, Hours = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Name", type text}, {"Date", type datetime}, {"Task", type text}, {"Hours", Int64.Type}}),
    Group = Table.Group(
        #"Changed Type",
        {"Name", "Date"},
        {{"All", each Table.AddIndexColumn(_, "Index", 1, 1, Int64.Type), type table}}
    ),
    Result = Table.AddColumn(Group, "Result", each let
        BuffValues = List.Buffer([All][Hours]),
        RunningTotal = List.Generate(
 
          // Start value: Set RT to 1st value in BuffValues and set RowIndex to 0
          () => [RT = BuffValues{0}, RowIndex = 0],
          // Condition: Generate values as long as RowIndex is < than number of list items
          each [RowIndex] < List.Count(BuffValues),
          // Record with logic for next values:
          each [
 
            // Calculate running total by adding current RT value and the next value
            RT = List.Sum({[RT], BuffValues{[RowIndex] + 1}}),
            // Increment the RowIndex by 1 for the next iteration
            RowIndex = [RowIndex] + 1
          ],
          // Only return the running total (RT) value for each row
          each [RT]
        ),
        CombinedTable = Table.FromColumns(
          Table.ToColumns([All]) & {Value.ReplaceType(RunningTotal, type {Int64.Type})},
          Table.ColumnNames([All]) & {"Running Total"}
        ),
        StartDateStage = Table.AddColumn(
          CombinedTable,
          "Start Date Stage",
          each [Date] + #duration(0, 9, 0, 0),
          type datetime
        ),
        EndDate = Table.AddColumn(
          StartDateStage,
          "End Date",
          each [Start Date Stage] + #duration(0, [#"Running Total"], 0, 0),
          type datetime
        ),
        StartDate = Table.AddColumn(
          EndDate,
          "Start Date",
          each if [Index] = 1 then [Start Date Stage] else EndDate[End Date]{[Index] - 2},
          type datetime
        ),
        Result = Table.SelectColumns(
          StartDate,
          {
            "Name",
            "Date",
            "Hours",
            "Task",
            "Start Date",
            "End Date"
          }
        )
      in
        Result),
    FinalResult = Table.Combine(Table.SelectColumns(Result, {"Result"})[Result])
in
    FinalResult
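
A possible simplification of the query above (a sketch, not part of the original post): the positional lookup EndDate[End Date]{[Index] - 2} walks back to the previous row, but each task's start is just its own end minus its own hours, so the StartDate step could be written without any lookup:

        // hypothetical replacement for the StartDate step: no previous-row lookup
        StartDate = Table.AddColumn(
          EndDate,
          "Start Date",
          each [End Date] - #duration(0, [Hours], 0, 0),
          type datetime
        ),

With this version the Index column is no longer referenced, though keeping it does no harm.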

 


Works a little slow. Here is the difference in time:

(timing comparison screenshot omitted)

6 REPLIES
mrbartuss
Helper I

Luckily I have a small dataset 😄 Thanks again!


Great, but List.Accumulate is a bit slow.
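
A plausible reason, not stated in the thread: each step of that List.Accumulate appends with s & {{ ... }}, which builds a new list from everything accumulated so far, so the work grows roughly quadratically with group size, while List.Generate emits one item per step. A minimal sketch contrasting the two shapes (hypothetical, just producing the numbers 1..n):

let
    n = 10000,
    // appending with & rebuilds the accumulated list on every step: roughly O(n^2)
    viaAccumulate = List.Accumulate({1 .. n}, {}, (s, c) => s & {c}),
    // emitting one item per step: roughly O(n)
    viaGenerate = List.Generate(() => 1, each _ <= n, each _ + 1)
in
    [accumulateCount = List.Count(viaAccumulate), generateCount = List.Count(viaGenerate)]

For the tiny per-Name-per-Date groups in this thread the difference is small, so this only matters at scale.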

