Hi Power Query gurus - does anyone know if it is possible to get from a table that looks like this:
Category | Start Date | End Date |
A | 01/01/2024 | 31/01/2024 |
A | 01/03/2024 | 31/03/2024 |
A | 01/04/2024 | 28/04/2024 |
A | 29/04/2024 | 01/06/2024 |
A | 01/08/2024 | |
B | 01/01/2024 | 31/01/2024 |
B | 01/06/2024 | 21/06/2024 |
to a table that looks like this:
Category | Start Date | End Date | Grouping |
A | 01/01/2024 | 31/01/2024 | 1 |
A | 01/03/2024 | 31/03/2024 | 2 |
A | 01/04/2024 | 28/04/2024 | 2 |
A | 29/04/2024 | 01/06/2024 | 2 |
A | 01/08/2024 | | 3 |
B | 01/01/2024 | 31/01/2024 | 1 |
B | 01/06/2024 | 21/06/2024 | 2 |
I am looking to add a Grouping field that groups, within each Category, the rows whose date ranges are continuous, i.e. where the start date is 1 day after the previous end date. If the next start date is more than one day after the previous end date, it should start a new Grouping.
Any help greatly appreciated! Thanks.
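For reference, the rule described above (a new group whenever a row's start date is not exactly one day after the previous row's end date within the same Category) can be sketched outside Power Query - here in Python, using the sample data from the question:

```python
from datetime import date, timedelta

# Sample rows from the question: (category, start, end); None = blank End Date.
rows = [
    ("A", date(2024, 1, 1),  date(2024, 1, 31)),
    ("A", date(2024, 3, 1),  date(2024, 3, 31)),
    ("A", date(2024, 4, 1),  date(2024, 4, 28)),
    ("A", date(2024, 4, 29), date(2024, 6, 1)),
    ("A", date(2024, 8, 1),  None),
    ("B", date(2024, 1, 1),  date(2024, 1, 31)),
    ("B", date(2024, 6, 1),  date(2024, 6, 21)),
]

def add_grouping(rows):
    """Assign a 1-based group id per Category; a row continues the current
    group only when its start is exactly one day after the previous end."""
    out = []
    prev_cat, prev_end, grp = None, None, 0
    for cat, start, end in rows:
        if cat != prev_cat:
            grp = 1  # new Category restarts the numbering
        elif prev_end is None or start != prev_end + timedelta(days=1):
            grp += 1  # gap in the dates -> new group
        out.append((cat, start, end, grp))
        prev_cat, prev_end = cat, end
    return out
```

Running this over the sample rows reproduces the Grouping column 1, 2, 2, 2, 3 for Category A and 1, 2 for Category B.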
@johnfa it was an interesting question. You can solve it by copying the code below and pasting it into the Advanced Editor:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WclTSUTIw1AciIwMjEyDHGMGJ1UHIGyPLG2PKm8DkjSzgHJi8kSWSPEixGaZ+C5i8AljYiYCznFBNAtqBZGwsAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Category = _t, #"Start Date" = _t, #"End Date" = _t]),
#"Changed Type1" = Table.TransformColumnTypes(Source,{{"Category", type text}, {"Start Date", type date}, {"End Date", type date}}),
// For each Category: list the (Start, End) pairs, accumulate a group id per
// row (same id when Start = previous End + 1 day, otherwise id + 1), then
// join the ids back onto the rows via a 0-based index column.
#"Grouped Rows" = Table.Group(#"Changed Type1", {"Category"}, {{"Count", each
    [
        a = Table.ToRows(_[[Start Date],[End Date]]),
        b = List.Accumulate(List.Skip(List.Positions(a)), {1}, (x, y) =>
                if a{y}{0} = a{y-1}{1} + Duration.From(1)
                then x & {List.Last(x)}
                else x & {List.Last(x) + 1}),
        c = Table.AddIndexColumn(_, "in", 0, 1),
        d = Table.AddColumn(c, "Grouping", (m) => b{m[in]})
    ][d]}}),
#"Expanded Count" = Table.ExpandTableColumn(#"Grouped Rows", "Count", {"Category", "Start Date", "End Date", "in", "Grouping"}, {"Category.1", "Start Date", "End Date", "in", "Grouping"})
in
#"Expanded Count"
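The heart of the step above is the List.Accumulate call: it starts from the list {1} and, for each later row position, appends either the same group id (dates contiguous) or that id plus one. The same fold, sketched in Python with the Category A rows as an assumed input:

```python
from datetime import date, timedelta
from functools import reduce

# (Start, End) pairs for Category A, as produced by Table.ToRows in the M code.
a = [
    (date(2024, 1, 1),  date(2024, 1, 31)),
    (date(2024, 3, 1),  date(2024, 3, 31)),
    (date(2024, 4, 1),  date(2024, 4, 28)),
    (date(2024, 4, 29), date(2024, 6, 1)),
    (date(2024, 8, 1),  None),
]

# Mirror of List.Accumulate: start from [1]; for each later position, append
# the same id when Start = previous End + 1 day, else the last id + 1.
b = reduce(
    lambda acc, y: acc + [acc[-1]]
    if a[y - 1][1] is not None and a[y][0] == a[y - 1][1] + timedelta(days=1)
    else acc + [acc[-1] + 1],
    range(1, len(a)),
    [1],
)
```

The resulting list of ids is then zipped back onto the rows by position, which is what the index column "in" does in the M code.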
Solution 1 - concise, but it requires enough proficiency in PQ:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WclTSUTIw1AciIwMjEyDHGMGJ1UHIGyPLG2PKm8DkjSzgHJi8kSWSPEixGaZ+C5g8WNSJgKucUA0CWoFkaiwA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Category = _t, #"Start Date" = _t, #"End Date" = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Category", type text}, {"Start Date", type date}, {"End Date", type date}}, "fr"),
Rows = Table.ToRecords(#"Changed Type"),
#"Grouped inner" = List.Transform(List.Accumulate(Rows, {}, (s,c) => if (List.Last(List.Last(s,{}))[Category]? = c[Category]) and (List.Last(List.Last(s))[End Date] + #duration(1,0,0,0) = c[Start Date]) then List.RemoveLastN(s) & {List.Last(s) & {c}} else s & {{c}}), Table.FromRecords),
#"Extracted Category" = List.Transform(#"Grouped inner", each List.Distinct([Category]){0}),
#"Grouped outer" = Table.Group(Table.FromColumns({#"Extracted Category", #"Grouped inner"}, {"Category", "grp"}), "Category", {"grp", each Table.AddIndexColumn(_, "Category Index", 1, 1, Int64.Type)}),
#"Expanded outer" = Table.ExpandTableColumn(#"Grouped outer", "grp", {"grp", "Category Index"}),
#"Expanded inner" = Table.ExpandTableColumn(#"Expanded outer", "grp", {"Start Date", "End Date"})
in
#"Expanded inner"
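Solution 1's List.Accumulate builds a list of "runs": each record either extends the last run (same Category and contiguous dates) or opens a new one, and each run later becomes one Grouping. A rough Python equivalent of that accumulation, using the sample rows as an assumption:

```python
from datetime import date, timedelta

# Rows as records, like Table.ToRecords in the M code (None = blank End Date).
records = [
    {"Category": "A", "Start": date(2024, 1, 1),  "End": date(2024, 1, 31)},
    {"Category": "A", "Start": date(2024, 3, 1),  "End": date(2024, 3, 31)},
    {"Category": "A", "Start": date(2024, 4, 1),  "End": date(2024, 4, 28)},
    {"Category": "A", "Start": date(2024, 4, 29), "End": date(2024, 6, 1)},
    {"Category": "A", "Start": date(2024, 8, 1),  "End": None},
    {"Category": "B", "Start": date(2024, 1, 1),  "End": date(2024, 1, 31)},
    {"Category": "B", "Start": date(2024, 6, 1),  "End": date(2024, 6, 21)},
]

# Each record either extends the last run (same Category, contiguous dates)
# or opens a new run - the role of List.Accumulate in Solution 1.
runs = []
for r in records:
    last = runs[-1][-1] if runs else None
    if (last is not None
            and last["Category"] == r["Category"]
            and last["End"] is not None
            and last["End"] + timedelta(days=1) == r["Start"]):
        runs[-1].append(r)
    else:
        runs.append([r])
```

On the sample data this yields five runs of lengths 1, 3, 1, 1, 1; numbering the runs per Category then gives the Grouping column.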
Solution 2 - a bit less demanding:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WclTSUTIw1AciIwMjEyDHGMGJ1UHIGyPLG2PKm8DkjSzgHJi8kSWSPEixGaZ+C5g8WNSJgKucUA0CWoFkaiwA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Category = _t, #"Start Date" = _t, #"End Date" = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Category", type text}, {"Start Date", type date}, {"End Date", type date}}, "fr"),
#"Added Index" = Table.AddIndexColumn(#"Changed Type", "Index", 0, 1, Int64.Type),
#"Added Previous End Date" = let end = #"Added Index"[End Date] in Table.AddColumn(#"Added Index", "Previous End Date", each try end{[Index]-1} otherwise null),
#"Grouped inner" = Table.Group(#"Added Previous End Date", {"Category","Start Date","Previous End Date"}, {"grp", each _}, 0, (x,y) => Byte.From(x[Category]<>y[Category] or (y[Start Date]<>y[Previous End Date] + #duration(1,0,0,0)))),
#"Grouped outer" = Table.Group(Table.RemoveColumns(#"Grouped inner",{"Start Date", "Previous End Date"}), "Category", {"grp", each Table.AddIndexColumn(_, "Category Index", 1, 1, Int64.Type)}),
#"Expanded outer" = Table.ExpandTableColumn(#"Grouped outer", "grp", {"grp", "Category Index"}),
#"Expanded inner" = Table.ExpandTableColumn(#"Expanded outer", "grp", {"Start Date", "End Date"})
in
#"Expanded inner"
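Solution 2's "Previous End Date" step is a positional shift: it reads the End Date column at Index - 1, with try ... otherwise null supplying null for the first row. The same shift-by-one lookup, sketched in Python over the sample End Date column:

```python
from datetime import date

# End Date column from the sample data (None = blank End Date).
ends = [
    date(2024, 1, 31), date(2024, 3, 31), date(2024, 4, 28),
    date(2024, 6, 1), None, date(2024, 1, 31), date(2024, 6, 21),
]

# end{[Index]-1} with try ... otherwise null is simply a shift by one row:
# each row sees the previous row's End Date, and the first row sees null.
prev_end = [None] + ends[:-1]
```

With that column in place, a row starts a new group whenever its Start Date is not exactly one day after its Previous End Date, which is what the comparer in the Table.Group step tests.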
Amazing, huge thanks @ThxAlot - they work perfectly, though I can't pretend to understand exactly what they are doing.
Try adding the final parameter to your group step, GroupKind.Local.
--Nate
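For context on that tip: with GroupKind.Local, Table.Group starts a new group every time the grouping key changes between adjacent rows, instead of merging all equal keys across the whole table. Python's itertools.groupby behaves the same way, which makes the difference easy to see:

```python
from itertools import groupby

# Two separate "A" runs: local grouping keeps them apart, whereas a
# table-wide (global) grouping would merge them into a single "A" group.
keys = ["A", "A", "B", "A"]

local = [(k, len(list(g))) for k, g in groupby(keys)]
```

Here local grouping yields three groups - ("A", 2), ("B", 1), ("A", 1) - so rows only stay together while the key holds between neighbours, which is exactly what the gap-and-islands pattern needs.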