Hello,
This is the general structure of my data, where each row currently represents a single transaction:
ClientID | Transaction_Date | Amount
aa       | 01/01/20         | 10
aa       | 01/02/20         | 15
aa       | 01/05/20         | 13
bb       | 02/06/20         | 11
bb       | 02/02/20         | 4
What I would like to do is group transactions per client by consecutive dates, so client "aa" would show 2 transactions -- one for 25 and one for 13 -- because the first two transactions fall on consecutive dates and are therefore counted as a single transaction, and client "bb" would show 2 transactions, for 11 and 4.
I don't need to preserve the date data; what I'm trying to show is average transaction amounts by client, where transactions on consecutive dates count as a single transaction.
Any help would be really appreciated, thanks!
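For reference, the rule being asked for can be sketched outside Power Query; a minimal Python version of it (using the sample rows above, with dates already parsed) might look like:

```python
from datetime import date

# Sample rows from the post: (ClientID, Transaction_Date, Amount)
rows = [
    ("aa", date(2020, 1, 1), 10),
    ("aa", date(2020, 1, 2), 15),
    ("aa", date(2020, 1, 5), 13),
    ("bb", date(2020, 2, 6), 11),
    ("bb", date(2020, 2, 2), 4),
]

def group_consecutive(rows):
    """Sum amounts over runs of consecutive dates within each client."""
    by_client = {}
    for client, d, amount in rows:
        by_client.setdefault(client, []).append((d, amount))
    grouped = {}
    for client, txns in by_client.items():
        txns.sort()  # order by date so consecutive days are adjacent
        sums = []
        for i, (d, amount) in enumerate(txns):
            # Extend the current run if this date is exactly one day
            # after the previous one; otherwise start a new run
            if i > 0 and (d - txns[i - 1][0]).days == 1:
                sums[-1] += amount
            else:
                sums.append(amount)
        grouped[client] = sums
    return grouped

print(group_consecutive(rows))
# {'aa': [25, 13], 'bb': [4, 11]}  (bb's amounts appear in date order)
```

The per-client average is then just the mean of each client's list of grouped sums.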
Hi @Anonymous ,
You could try the M code below to see whether it works:
let
    // Sample data from the original post, decompressed inline
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WSkxU0lEyMNQHIiMDINPQQClWB0nYCCpsiipsChU2BgsnJYGEjfQNzKDChqjCUENMlGJjAQ==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [ClientID = _t, Transaction_Date = _t, Amount = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"ClientID", type text}, {"Transaction_Date", type date}, {"Amount", Int64.Type}}),
    // Sort so consecutive dates sit on adjacent rows
    #"Sorted Rows" = Table.Sort(#"Changed Type",{{"Transaction_Date", Order.Ascending}}),
    // Nest each client's rows, then add a 0-based index inside each nested table
    #"Grouped Rows" = Table.Group(#"Sorted Rows", {"ClientID"}, {{"ALL", each _, type table [ClientID=text, Transaction_Date=date, Amount=number]}}),
    #"Added Custom" = Table.AddColumn(#"Grouped Rows", "GROUP INDEX", each Table.AddIndexColumn([ALL], "INDEX", 0,1)),
    #"Removed Columns" = Table.RemoveColumns(#"Added Custom",{"ALL"}),
    #"Expanded GROUP INDEX" = Table.ExpandTableColumn(#"Removed Columns", "GROUP INDEX", {"Transaction_Date", "Amount", "INDEX"}, {"Transaction_Date", "Amount", "INDEX"}),
    // True when the next row's date is exactly one day later; null past the final row
    #"Added Custom1" = Table.AddColumn(#"Expanded GROUP INDEX", "Custom", each try Date.AddDays([Transaction_Date],1)=#"Expanded GROUP INDEX"[Transaction_Date]{[INDEX]+1} otherwise null),
    // Where a run ends (false/null), keep the row's own date as the group key
    Custom1 = Table.ReplaceValue(#"Added Custom1", each [Custom], each if [Custom] = false or Value.Is([Custom], type date) then [Transaction_Date] else null , Replacer.ReplaceValue, {"Custom"}),
    // Fill the key upward so every row of a run shares the run's end date
    #"Filled Up" = Table.FillUp(Custom1,{"Custom"}),
    // Sum amounts per client per run of consecutive dates
    #"Grouped Rows1" = Table.Group(#"Filled Up", {"ClientID", "Custom"}, {{"sum", each List.Sum([Amount]), type number}, {"all", each _, type table [ClientID=text, Transaction_Date=date, Amount=number, INDEX=number, Custom=date]}})
in
    #"Grouped Rows1"
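If it helps to see the idea outside M: the query keys the last row of each consecutive run with that row's own date, then fills the key upward (Table.FillUp) so every row in the run shares it, and finally groups by ClientID and the key. A minimal Python sketch of that back-fill pattern, assuming a sorted list of dates for one client:

```python
from datetime import date

def fill_up_keys(dates):
    """Mimic the M pattern: key the last row of each consecutive-date run
    with its own date, then fill keys upward so the whole run shares it."""
    n = len(dates)
    keys = [None] * n
    for i, d in enumerate(dates):
        # A row ends a run when the next row's date is not exactly one day later
        if i == n - 1 or (dates[i + 1] - d).days != 1:
            keys[i] = d
    # Fill up: propagate each key backwards over the gaps (like Table.FillUp)
    for i in range(n - 2, -1, -1):
        if keys[i] is None:
            keys[i] = keys[i + 1]
    return keys

dates = [date(2020, 1, 1), date(2020, 1, 2), date(2020, 1, 5)]
keys = fill_up_keys(dates)
# keys are Jan 2, Jan 2, Jan 5 -- grouping by key sums 10+15=25 and 13
```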
You could refer to extract-start-and-end-dates-with-power-query for details.
Best Regards,
Zoe Zhi
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Hi @Anonymous ,
Try creating this calculated column:
If you consider it a solution, please mark it as such and give kudos.
Ricardo
Hey,
I'm pretty new to DAX. When I tried creating that column and replacing the table and field names, it gave me a whole bunch of errors at and after the semicolons. What am I missing?
thanks,
Hey,
So it did correctly identify consecutive-date transactions by client, but what I ultimately want is an output that shows the average amount per transaction. How would I then group the transactions this formula flags as "True" into single transactions?
Edit:
there can be multiple "True" runs that should be counted distinctly -- if one client had 2 or more separate sets of consecutive transactions, I'd want each set to count as a distinct transaction. So it's not just that all "True" rows per client need to be summed.
thanks,
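One way to handle multiple separate runs per client is to assign a run ID that increments whenever the gap to the previous date exceeds one day; grouping by (client, run ID) then treats each run as its own transaction. A hypothetical Python sketch of that idea (`tag_runs` is an illustrative name, not from the thread):

```python
from datetime import date

def tag_runs(txns):
    """Given (date, amount) pairs sorted by date for one client, label each
    row with a run ID. The ID increments whenever the gap to the previous
    date is more than one day, so every separate block of consecutive
    dates gets its own ID."""
    run_id, tagged = 0, []
    for i, (d, amount) in enumerate(txns):
        if i > 0 and (d - txns[i - 1][0]).days > 1:
            run_id += 1
        tagged.append((run_id, d, amount))
    return tagged

# Two separate consecutive runs for the same client
txns = [(date(2020, 1, 1), 10), (date(2020, 1, 2), 15),
        (date(2020, 1, 5), 13), (date(2020, 1, 6), 2)]
tagged = tag_runs(txns)
# run IDs come out as 0, 0, 1, 1 -- summing per ID gives 25 and 15
```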
Hi @Anonymous ,
I've created this file as an example: Download PBIX
It has a group column; with that you can count how many grouped transactions you have.
If you consider it a solution, please mark it as such and give kudos.
Ricardo
Thank you very much. You helped me!