My data looks like this:
| Name | Date | Task | Hours |
|---|---|---|---|
| John | 18/07/2024 | BBB | 3 |
| John | 18/07/2024 | AAA | 5 |
| John | 19/07/2024 | CCC | 8 |
| John | 20/07/2024 | BBB | 3 |
| John | 20/07/2024 | CCC | 3 |
| John | 20/07/2024 | AAA | 2 |
| Karin | 18/07/2024 | CCC | 8 |
| Karin | 19/07/2024 | AAA | 4 |
| Karin | 19/07/2024 | BBB | 4 |
I want to calculate a StartDateTime and EndDateTime for each Name-Date-Task row. It would be easy if there were only one row per person per date, but there can be several. Assumption: the first task of each date always starts at 9 AM, and each subsequent task starts when the previous one ends (so within each Name/Date group, End = 9 AM + running total of Hours, and Start = End minus the task's own Hours).
Here is what I want to achieve:
| Name | Date | Task | Start DateTime | End DateTime |
|---|---|---|---|---|
| John | 18/07/2024 | BBB | 18/07/2024 9:00 AM | 18/07/2024 12:00 PM |
| John | 18/07/2024 | AAA | 18/07/2024 12:00 PM | 18/07/2024 5:00 PM |
| John | 19/07/2024 | CCC | 19/07/2024 9:00 AM | 19/07/2024 5:00 PM |
| John | 20/07/2024 | BBB | 20/07/2024 9:00 AM | 20/07/2024 12:00 PM |
| John | 20/07/2024 | CCC | 20/07/2024 12:00 PM | 20/07/2024 3:00 PM |
| John | 20/07/2024 | AAA | 20/07/2024 3:00 PM | 20/07/2024 5:00 PM |
| Karin | 18/07/2024 | CCC | 18/07/2024 9:00 AM | 18/07/2024 5:00 PM |
| Karin | 19/07/2024 | AAA | 19/07/2024 9:00 AM | 19/07/2024 1:00 PM |
| Karin | 19/07/2024 | BBB | 19/07/2024 1:00 PM | 19/07/2024 5:00 PM |
I think it would be best to do this in Power Query, since calculated columns in DAX are generally not recommended for this kind of row-by-row transformation.
Based on your description, I created data to reproduce your scenario. The pbix file is attached at the end.
let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45W8srPyFPSUTK00DMw1zMyMDIBcpycnICksVKsDnZ5R0dHIGmKIm+JJO/s7AwkLZDljQzwm48iD9GPWx5ivxFY3juxKBPDgcgOgCuwxDDBBLcCiBOBCmIB", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Name = _t, Date = _t, Task = _t, Hours = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Name", type text}, {"Date", type date}, {"Task", type text}, {"Hours", Int64.Type}}),
    // per Name/Date group, walk the rows with List.Generate, threading a running end time that starts from 9:00 AM
    #"Grouped Rows" = Table.Group(#"Changed Type", {"Name", "Date"}, {{"Count",
        (x) => [
            a = Table.ToRecords(x),
            b = List.Generate(
                // seed: the first task ends at 9:00 AM plus its own hours
                () => [ i = 0, result = a{i}[Date] & #time(9 + a{i}[Hours], 0, 0) ],
                ( x ) => x[i] < List.Count ( a ),
                // next state: bump the index and add that row's hours to the running end time
                ( x ) => [ i = x[i] + 1, result = x[result] + #duration(0, a{i}[Hours], 0, 0) ],
                // emit the source record extended with its End DateTime
                ( x ) => Record.AddField ( a{x[i]}, "End DateTime", x[result] )
            ),
            // a task's start is its end minus its own duration
            c = List.Transform(b, (x) => Record.AddField ( x, "Start DateTime", x[End DateTime] - #duration(0, x[Hours], 0, 0) )),
            d = Table.FromRecords ( c )
        ][d]}}),
    #"Removed Other Columns" = Table.Combine( Table.SelectColumns(#"Grouped Rows",{"Count"})[Count]),
    #"Changed Type1" = Table.TransformColumnTypes(#"Removed Other Columns",{{"End DateTime", type datetime}, {"Start DateTime", type datetime}}),
    #"Reordered Columns" = Table.ReorderColumns(#"Changed Type1",{"Name", "Date", "Task", "Hours", "Start DateTime", "End DateTime"}),
    #"Removed Columns" = Table.RemoveColumns(#"Reordered Columns",{"Date"})
in
    #"Removed Columns"
Hi @mrbartuss, another solution (List.Accumulate):
Result:
let
    fnDateTimes = (myTable as table) =>
        [
            // _Detail = GroupedRows{2}[All],  // debugging leftover: evaluate one group in isolation
            _Detail = myTable,
            _BufferedHours = List.Buffer(_Detail[Hours]),
            // build a list of {Start, End} pairs; each new pair starts where the previous one ended
            _AccumulatedDateTimes = List.Accumulate(
                {1..List.Count(_BufferedHours)-1},
                // seed: the first task runs from 9:00 AM for its own hours
                {{ _Detail{0}[Date] & #time(9,0,0), _Detail{0}[Date] & #time(9,0,0) + #duration(0,_BufferedHours{0},0,0) }},
                (s,c)=> s & {{ List.Last(s){1}, List.Last(s){1} + #duration(0,_BufferedHours{c},0,0) }}
            ),
            _RemovedColumns = Table.RemoveColumns(_Detail, {"Hours"}),
            // reattach the original columns plus the generated Start/End columns, with proper types
            _ToTable = Table.FromColumns(Table.ToColumns(_RemovedColumns) & Table.ToColumns(Table.FromRows(_AccumulatedDateTimes)), Value.Type(_RemovedColumns & #table(type table[Start DateTime=datetime, End DateTime=datetime], {})))
        ][_ToTable],
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45W8srPyFPSUTK00Dcw1zcyMDIBcpycnICksVKsDnZ5R0dHIGmKIm+JJO/s7AwkLZDljQzwm48iD9GPWx5ivxFY3juxKBPDgcgOgCuwxDDBBLcCiBOBCmIB", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Name = _t, Date = _t, Task = _t, Hours = _t]),
    ChangedType = Table.TransformColumnTypes(Source,{{"Hours", type number}, {"Date", type date}}, "sk-SK"),
    GroupedRows = Table.Group(ChangedType, {"Name", "Date"}, {{"fn", fnDateTimes, type table}}),
    Combined = Table.Combine(GroupedRows[fn])
in
    Combined
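The design choice worth noting here: List.Accumulate carries the whole list of {Start, End} pairs forward, and each new pair starts at the previous pair's end (List.Last(s){1}). A stripped-down sketch of just that accumulation, with hypothetical hard-coded inputs (Karin's two tasks on 19/07/2024):

let
    Hours = {4, 4},                                    // hypothetical task hours
    FirstStart = #date(2024, 7, 19) & #time(9, 0, 0),  // first task starts at 9:00 AM
    Pairs = List.Accumulate(
        {1 .. List.Count(Hours) - 1},                  // empty range for single-task days
        // seed: one {Start, End} pair for the first task
        {{ FirstStart, FirstStart + #duration(0, Hours{0}, 0, 0) }},
        // append a pair that starts where the last one ended
        (s, c) => s & {{ List.Last(s){1}, List.Last(s){1} + #duration(0, Hours{c}, 0, 0) }}
    )
in
    Pairs  // {{19/07 9:00 AM, 19/07 1:00 PM}, {19/07 1:00 PM, 19/07 5:00 PM}}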
@Ahmedx @dufoq3 Thank you both very much!
Below is my solution. How would you rate all three solutions based on performance?
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45W8srPyFPSUTK00DMw1zMyMDIBcpycnICksVKsDnZ5R0dHIGmKIm+JJO/s7AwkLZDljQzwm48iD9GPWx5ivxFY3juxKBPDgcgOgCuwxDDBBLcCiBOBCmIB", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Name = _t, Date = _t, Task = _t, Hours = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Name", type text}, {"Date", type datetime}, {"Task", type text}, {"Hours", Int64.Type}}),
Group = Table.Group(
#"Changed Type",
{"Name", "Date"},
{{"All", each Table.AddIndexColumn(_, "Index", 1, 1, Int64.Type), type table}}
),
Result = Table.AddColumn(Group, "Result", each let
BuffValues = List.Buffer([All][Hours]),
RunningTotal = List.Generate(
// Start value: Set RT to 1st value in BuffValues and set RowIndex to 0
() => [RT = BuffValues{0}, RowIndex = 0],
// Condition: Generate values as long as RowIndex is < than number of list items
each [RowIndex] < List.Count(BuffValues),
// Record with logic for next values:
each [
// Calculate running total by adding current RT value and the next value
RT = List.Sum({[RT], BuffValues{[RowIndex] + 1}}),
// Increment the RowIndex by 1 for the next iteration
RowIndex = [RowIndex] + 1
],
// Only return the running total (RT) value for each row
each [RT]
),
CombinedTable = Table.FromColumns(
Table.ToColumns([All]) & {Value.ReplaceType(RunningTotal, type {Int64.Type})},
Table.ColumnNames([All]) & {"Running Total"}
),
StartDateStage = Table.AddColumn(
CombinedTable,
"Start Date Stage",
each [Date] + #duration(0, 9, 0, 0),
type datetime
),
EndDate = Table.AddColumn(
StartDateStage,
"End Date",
each [Start Date Stage] + #duration(0, [#"Running Total"], 0, 0),
type datetime
),
StartDate = Table.AddColumn(
EndDate,
"Start Date",
each if [Index] = 1 then [Start Date Stage] else EndDate[End Date]{[Index] - 2},
type datetime
),
Result = Table.SelectColumns(
StartDate,
{
"Name",
"Date",
"Hours",
"Task",
"Start Date",
"End Date"
}
)
in
Result),
FinalResult = Table.Combine(Table.SelectColumns(Result, {"Result"})[Result])
in
FinalResult
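One performance note on my own query: the StartDate step does a positional lookup into the EndDate step for every row (EndDate[End Date]{[Index] - 2}). On bigger groups it might help to buffer that column once so the lookups read from memory rather than re-evaluating the step; a hypothetical, untested tweak of that one step:

    // buffer the End Date column once before the row-wise positional lookups
    BufferedEnds = List.Buffer(EndDate[End Date]),
    StartDate = Table.AddColumn(
        EndDate,
        "Start Date",
        each if [Index] = 1 then [Start Date Stage] else BufferedEnds{[Index] - 2},
        type datetime
    ),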
It works a little slow. Here is the difference in time:
Luckily I have a small dataset 😄 Thanks again!
Great, but List.Accumulate is a bit slow.
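A plausible reason: each s & {{ ... }} step builds a brand-new list, so the accumulation re-copies the growing result on every row, which is roughly quadratic per group. List.Generate avoids that by threading only the previous pair forward. A sketch of the same {Start, End} pairs built that way (hypothetical inputs, not benchmarked):

let
    Hours = List.Buffer({3, 3, 2}),  // hypothetical: John's task hours on 20/07/2024
    Day = #date(2024, 7, 20),
    Pairs = List.Generate(
        // seed: the first task runs from 9:00 AM for its own hours
        () => [ i = 0, start = Day & #time(9, 0, 0), stop = start + #duration(0, Hours{i}, 0, 0) ],
        (s) => s[i] < List.Count(Hours),
        // the next pair begins at the previous pair's end; no list copying
        (s) => [ i = s[i] + 1, start = s[stop], stop = start + #duration(0, Hours{i}, 0, 0) ],
        (s) => { s[start], s[stop] }
    )
in
    Pairs  // {{9:00 AM, 12:00 PM}, {12:00 PM, 3:00 PM}, {3:00 PM, 5:00 PM}} on 20/07/2024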