Hi,
I am trying to achieve the following results (DesiredResults) through Power Query, using the calculation logic shown.
I have used List.Sum(Table.SelectRows(...)) since I am not trying to sum the entire column, only the last 2-3 years. However, I can't seem to achieve the desired results in PQ. I suspect this calls for some form of looping (i.e., List.Generate or List.Accumulate), but all the examples I have seen produce a cumulative sum rather than a sum over just the last 2-3 years. Is there a way to achieve this?
Below are my code and sample table. DesiredResults is the desired result; CurrentPQResults is what I have obtained in PQ with the code below:
let
    Source = Excel.CurrentWorkbook(){[Name="Test2"]}[Content],
    ChangedType = Table.TransformColumnTypes(Source, {{"Date", Int64.Type}, {"Device", Int64.Type}, {"Plan", Int64.Type}, {"DesiredResults", Int64.Type}, {"Calculation Logic", type text}}),
    Add = Table.AddColumn(ChangedType, "CurrentPQResults",
        (x) =>
            let
                xDev = x[Device],
                xPlan = x[Plan],
                xDate = x[Date],
                rDev = List.Sum(Table.SelectRows(ChangedType, each [Date] <= xDate and [Date] >= xDate - 2)[Device]),
                TempAdd1 =
                    if xDev <> null then xDev
                    else try (if xPlan - rDev < 0 then 0 else xPlan - rDev) otherwise null
            in
                TempAdd1, Int64.Type)
in
    Add
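To see why this query cannot produce the desired column: the rDev step sums the raw Device column over a date window, while the target value for each row depends on the previously *calculated* values, which the source table never contains. The contrast, sketched in Python with made-up numbers (not the thread's actual sample table):

```python
# Hypothetical values for illustration only.
plan = [10, 10, 10, 10, 10]
device = [3, None, None, None, None]

# What the posted query computes: a sliding-window sum over the raw
# Device column for the current and two preceding dates.
def window_sum(values, i, n=2):
    lo = max(0, i - n)
    return sum(v for v in values[lo:i + 1] if v is not None)

# What the desired result requires: each row subtracts the calculated
# values of the previous two rows of the new column itself.
calc = []
for p in plan:
    calc.append(max(0, p - sum(calc[-2:])))
```

Because the recurrence reads its own earlier outputs, a plain row-wise Table.AddColumn over the source columns cannot express it; some fold (List.Accumulate or List.Generate) is needed.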
Solved! Go to Solution.
You are absolutely right!
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMjIwtFTSUTI0QCJidUDiRiCeJUQIyoRKGAJ5YFlDMGkElzAC8oDIHCQKFzSGCJqCBI3hoiYQURNUK02R1IJEYwE=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [date = _t, device = _t, plan = _t, res = _t]),
q = Table.TransformColumnTypes(Source,{{"date", Int64.Type}, {"device", Int64.Type}, {"plan", Int64.Type}, {"res", Int64.Type}}),
calcres=List.Accumulate({0..Table.RowCount(q)-1},{}, (s,c)=> s& {List.Max({0,q[plan]{c} - (List.Sum(List.LastN(s,2))??0)})}),
ttc=Table.ToColumns(q),
nm=Table.ColumnNames(q),
tfc=Table.FromColumns(ttc&{calcres}, nm&{"calcres"})
in
tfc
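For readers less familiar with M folds, the List.Accumulate call above can be sketched in Python (the plan values here are hypothetical; the real ones come from the compressed sample table): the seed is an empty list, and each step appends max(0, plan - sum of the last two accumulated values).

```python
from functools import reduce

# Hypothetical plan column.
plan = [5, 5, 5, 5]

# Mirrors List.Accumulate({0..n-1}, {}, (s, c) =>
#     s & {List.Max({0, plan{c} - (List.Sum(List.LastN(s, 2)) ?? 0)})})
res = reduce(
    lambda s, c: s + [max(0, plan[c] - sum(s[-2:]))],
    range(len(plan)),
    [],  # the seed, i.e. the "s" that starts empty
)
```

Note that sum([]) is 0 in Python, which plays the role of the ?? 0 null-coalescing in the M version.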
I resolved my issue above by creating a function and then invoking it after a Group By step. I have pasted both pieces of code below as a reference for the community, but I would still be interested in a solution done directly in M, without having to create a separate function, if possible, to deepen my M knowledge.
Here is the code that works:
The function (named fnListAccumulate), which is essentially the code provided by @Rocco_sprmnt21 in the accepted solution, with minor changes to convert it to a function:
(GrpTbl as table) =>
let
    calcres = List.Accumulate({0..Table.RowCount(GrpTbl)-1}, {}, (s, c) => s & {List.Max({0, GrpTbl[plan]{c} - (List.Sum(List.LastN(s, 2)) ?? 0)})}),
    ttc = Table.ToColumns(GrpTbl),
    nm = Table.ColumnNames(GrpTbl),
    tfc = Table.FromColumns(ttc & {calcres}, nm & {"calcres"})
in
    tfc
Here is the original table invoking the above function after grouping by the desired fields (Scenario/Product):
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("jZI7DsMgDIbvwpwBTJyEW1TqGGVq90q9/1AewTJgo0rEYfhC/gfnacC6YBbzfLk4H9/PO72dZeNaEgV2oEIBbNnemBuw/KXLEwiDAYtrTwwhXkIwIZ6YVWLWVjiq5xDTRZDEIT2if6j/WbHsRPtQY0RsJMFAxbW1sr3E5IRQsQ/MvuIemPvOPPzVP6cm/XNs0j/H6KwugY45bImqSaBjdukCdMw2iWDSP0f0/jml98+pqhoU97w4UNzfTGjvCErMURO6fg==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [date = _t, scenario = _t, product = _t, device = _t, plan = _t, res = _t]),
q = Table.TransformColumnTypes(Source,{{"date", Int64.Type}, {"device", Int64.Type}, {"plan", Int64.Type}, {"res", Int64.Type}, {"scenario", type text}, {"product", type text}}),
#"Grouped Rows" = Table.Group(q, {"scenario", "product"}, {{"Allrows", each _, type table [date=nullable number, scenario=nullable text, product=nullable text, device=nullable number, plan=nullable number, res=nullable number]}}),
#"Invoked Custom Function" = Table.AddColumn(#"Grouped Rows", "ListAccumulate", each fnListAccumulate([Allrows])),
#"Removed Other Columns" = Table.SelectColumns(#"Invoked Custom Function",{"ListAccumulate"}),
#"Expanded ListAccumulate" = Table.ExpandTableColumn(#"Removed Other Columns", "ListAccumulate", {"date", "scenario", "product", "device", "plan", "res", "calcres"}, {"date", "scenario", "product", "device", "plan", "res", "calcres"})
in
#"Expanded ListAccumulate"
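The group-then-accumulate pattern above can be sketched in Python (column names and values are hypothetical): the fold restarts with an empty seed for each Scenario/Product group, which is exactly what invoking the function once per Allrows table achieves.

```python
from itertools import groupby

# Hypothetical rows: (scenario, product, plan).  groupby assumes rows
# for the same group are contiguous, just as Table.Group collects them.
rows = [
    ("s1", "p1", 5), ("s1", "p1", 5), ("s1", "p1", 5),
    ("s2", "p2", 7), ("s2", "p2", 7),
]

def accumulate(plans):
    # Same fold as fnListAccumulate; the seed restarts at [] per call.
    out = []
    for p in plans:
        out.append(max(0, p - sum(out[-2:])))
    return out

calcres = []
for _, grp in groupby(rows, key=lambda r: (r[0], r[1])):
    calcres += accumulate([r[2] for r in grp])
```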
@Anonymous
As I am bringing this sample model into my actual model, I realized I missed some complexity... I need to do exactly as described above, but by group; i.e., in the example below I would like to accumulate by group (Group 1 = Scenario 1 / Product 1, Group 2 = Scenario 2 / Product 2, etc.), restarting the "s" (the seed of the List.Accumulate function) between each group.
I am quasi-certain that the best (simplest) way to achieve this is to use the Group By function with All Rows and pass the "Allrows" table as the table in the List.Accumulate call...
I used your code above (updating the source to include the new columns/content), added the Group By step, and updated the List.Accumulate call. However, with that code I get the following error (which is likely due to row context): Expression.Error: We cannot convert a value of type List to type Table. I need help updating the code below so it works! Thanks.
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("jdLBDoMgDAbgd+HsoVSq8hZLdjSetvuSvf9hIkhKoc2SiR6+Kf9f9t0h+Ogm93z5c318P+9098CWY0oKoVMxA8iPhfmOXf/014qVYcfO35pMJfOIUCJzNWFkQrtxUt9TjaggbY7KpRWA94cCGfnx7pHIiI95T4sRH1lDpMRHFl9Jjyy9CI9/zZ8rY/6cGfPnrL5LNCDMBrmppgFh1tEBEGYxKrDmz40+f670+XOlHH9BhsdfmNg2RCOzlU8dPw==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [date = _t, scenario = _t, product = _t, device = _t, plan = _t, res = _t]),
q = Table.TransformColumnTypes(Source,{{"date", Int64.Type}, {"device", Int64.Type}, {"plan", Int64.Type}, {"res", Int64.Type}, {"scenario", type text}, {"product", type text}}),
GrpTbl = Table.Group(q, {"scenario", "product"}, {{"Allrows", each _, type table [date=nullable number, scenario=nullable text, product=nullable text, device=nullable number, plan=nullable number, res=nullable number]}}),
calcres=List.Accumulate({0..Table.RowCount(GrpTbl[Allrows])-1},{}, (s,c)=> s& {List.Max({0,GrpTbl[Allrows][plan]{c} - (List.Sum(List.LastN(s,2))??0)})}),
ttc=Table.ToColumns(GrpTbl),
nm=Table.ColumnNames(GrpTbl),
tfc=Table.FromColumns(ttc&{calcres}, nm&{"calcres"})
in
tfc
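The error comes from GrpTbl[Allrows]: after the Group By step, that expression is a *list of tables*, one per group, so passing it where a single table is expected fails with the "cannot convert a value of type List to type Table" message. A Python sketch of the shape mismatch and the per-group fix (values are hypothetical):

```python
# After grouping, the "Allrows" column is a list of per-group row sets,
# not one table.  Here each group is reduced to just its plan values.
groups = [[5, 5, 5], [7, 7]]

def accumulate(plans):
    # The fold must run inside each group, restarting its seed.
    out = []
    for p in plans:
        out.append(max(0, p - sum(out[-2:])))
    return out

# Apply the fold per element of the list, not to the list as a whole.
per_group = [accumulate(g) for g in groups]
```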
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMjIwtFTSUTI0QCJidUDiRiCeJUQIyoRKGAJ5YFlDMGkElzAC8oDIHCQKFzSGCJqCBI3hoiYQURNUK02R1IJEYwE=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [date = _t, device = _t, plan = _t, res = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"date", Int64.Type}, {"device", Int64.Type}, {"plan", Int64.Type}, {"res", Int64.Type}}),
ai = Table.AddIndexColumn(#"Changed Type", "Index", 0, 1, Int64.Type),
#"Added Custom" = Table.AddColumn(ai, "rescalc", each List.Max({[plan]-((try ai[res]{[Index]-1} otherwise 0) + (try ai[res]{[Index]-2} otherwise 0)),0}))
in
#"Added Custom"
Thanks.
If I understand correctly, you used the DesiredResults column to generate the Rescalc column.
If so, I may not have been clear in my original message: the DesiredResults column was created manually, purely to show what I am trying to achieve, and should not be used to generate the Rescalc column.
Essentially, from the table above, only the Date, Device, and Plan columns would be part of the initial table; the calculated column is what I am trying to produce entirely within PQ. That is, the current row's value in the calculated column depends on the calculated values of the two previous rows of that same column.
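In other words, the target is the recurrence calc[i] = max(0, plan[i] - calc[i-1] - calc[i-2]), with missing terms treated as 0 and the Device value overriding the calculation when present. A Python sketch with hypothetical numbers (assuming, per one reading of the "if xDev <> null" branch in the first query, that an overriding Device value also feeds later rows):

```python
# Hypothetical data for illustration only.
plan = [10, 10, 10, 10]
device = [4, None, None, None]

calc = []
for p, d in zip(plan, device):
    if d is not None:
        calc.append(d)  # Device value wins when supplied (assumption)
    else:
        # Subtract the last two *calculated* values, floored at zero.
        calc.append(max(0, p - sum(calc[-2:])))
```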
Wow, thank you. Such an elegant, simple solution. I don't think I would have gotten there on my own, but I understand what was done (so I should be able to replicate it for other problems).
I used the earlier version, which worked perfectly, but noticed you updated the code to make it more dynamic (i.e., on the List.Accumulate and seed). I sincerely appreciate this!
Side note - the updated code is missing a comma between the calcres Step and the ttc Step.
Consider that most of the difficulty (80%?) in many of the requests for assistance made on the forum is understanding the context and the desired result of the Original Poster 😁.
In your case, given that you were clear and precise in your request, you earned an extra effort toward providing the most correct and reusable answer possible 😀.
The fact that you were able to promptly report the misunderstanding made everything easier.
Thank you for the kind words!
Cheers!