samahiji
Frequent Visitor

Running total breaks at each date

I've created a function that returns the running total across all dates. I want the calculation to restart for each date, and then pick the row that corresponds to the first Cumm G above 100, similar to the output table below:

 

(Attached screenshot RT.JPG: the desired output table.)

 

let
    Source = (RT_Name1 as text, RT_Name2 as text, MyTable as table, RT_ColumnName1 as text, RT_ColumnName2 as text) =>
    let
        Source2 = MyTable,
        // buffer both columns so List.Generate does not re-evaluate them on every step
        BufferedValues1 = List.Buffer(Table.Column(MyTable, RT_ColumnName1)),
        BufferedValues2 = List.Buffer(Table.Column(MyTable, RT_ColumnName2)),
        // running total of the first column across the whole table
        RunningTotalList1 = List.Generate(
            () => [RunningTotal1 = BufferedValues1{0}, Counter = 0],
            each [Counter] < List.Count(BufferedValues1),
            each [RunningTotal1 = [RunningTotal1] + BufferedValues1{[Counter] + 1}, Counter = [Counter] + 1],
            each [RunningTotal1]
        ),
        // running total of the second column across the whole table
        RunningTotalList2 = List.Generate(
            () => [RunningTotal2 = BufferedValues2{0}, Counter = 0],
            each [Counter] < List.Count(BufferedValues2),
            each [RunningTotal2 = [RunningTotal2] + BufferedValues2{[Counter] + 1}, Counter = [Counter] + 1],
            each [RunningTotal2]
        ),
        // append both running-total lists to the original columns under the given names
        Consolidation = Table.FromColumns(
            Table.ToColumns(Source2) & {Value.ReplaceType(RunningTotalList1, type {Int64.Type})} & {Value.ReplaceType(RunningTotalList2, type {Int64.Type})},
            Table.ColumnNames(Source2) & {RT_Name1} & {RT_Name2}
        )
    in
        Consolidation
in
    Source
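
For context, the function would be called from another query roughly like this (a sketch only; the names fnRunningTotal and MyData are placeholders for however the function and source queries are actually named):

let
    // MyData is a placeholder for the query that holds the source table
    Source = MyData,
    // adds two new columns, "Cumm G" and "Cumm O", with the running totals of G and O
    AddedTotals = fnRunningTotal("Cumm G", "Cumm O", Source, "G", "O")
in
    AddedTotals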

 

 


4 REPLIES
Omid_Motamedise
Super User

You can filter for values greater than or equal to 100 in column G, then use Group By on Run_Date and pick "All Rows" as the operation. Then, in the resulting formula, replace each _ in the third argument with each Table.FirstN(_, 1) so that it returns the desired table.
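
In code, that suggestion looks roughly like this (a sketch only; #"Changed Type" stands for whichever step holds the typed table, and the column names are taken from the thread):

let
    // keep only rows where G is at least 100
    Filtered = Table.SelectRows(#"Changed Type", each [G] >= 100),
    // "All Rows" normally generates "each _" as the aggregation;
    // replacing it with Table.FirstN(_, 1) keeps only the first row of each date
    Grouped = Table.Group(Filtered, {"Run_Date"}, {{"First Row", each Table.FirstN(_, 1), type table}}),
    Expanded = Table.ExpandTableColumn(Grouped, "First Row", {"Type", "G", "O", "Sort F"})
in
    Expanded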


If you face a problem, please provide your data as a table here.


Omid,

I can't filter the values >= 100 before I run the function. I need a modified function that calculates the running total (Cumm G) first and then filters >= 100 within each date.

this is the table:

 

Type   Run_Date     G     O    Sort F
AB     27/10/2024   45    0    0
AC     27/10/2024   20    17   3
AB     27/10/2024   50    26   5
AX     27/10/2024   30    22   6
AB     26/10/2024   33    47   12
AA     26/10/2024   15    63   13
AN     26/10/2024   20    68   17
AC     26/10/2024   437   24   17
AF     26/10/2024   588        18

Just copy and paste the code below into the Advanced Editor.

It is also more efficient in terms of the time it takes to calculate the running total.

let
    // sample data embedded as compressed JSON
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("Zc9BCsQgDAXQqxTXBU200W1bmOWsC8X7X6P5pjCSWRgUH9/vfYf9CGvgGilFTlz0UDYdaay+Kjg9YFxS1ZFN/EVsECzYmbi8yEOwDpkyZBYZXfAKsZHdE0JTgaO3ydeT0VWaFf59ZyYFAdUCX/PxBg81pCzUQu8P", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Type = _t, Run_Date = _t, G = _t, O = _t, #"Sort F" = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Type", type text}, {"Run_Date", type date}, {"G", Int64.Type}, {"O", Int64.Type}, {"Sort F", Int64.Type}}),
    // group by date and keep each date's rows, with a 1-based index, as a nested table
    #"Grouped Rows" = Table.Group(#"Changed Type", {"Run_Date"}, {{"Added Index", each Table.AddIndexColumn(_, "in", 1)}}),
    // for each date, find the 1-based position of the first row whose running total of G reaches 100
    #"Added Custom" = Table.AddColumn(#"Grouped Rows", "Selected row", each
        [a = List.Buffer([Added Index][G]),
         b = List.Last(List.Generate(() => 1, each List.Sum(List.FirstN(a, _)) < 100, each _ + 1)) + 1][b]),
    // attach the cumulative G and O up to that position, then keep a single row from each group
    #"Added Custom1" = Table.AddColumn(#"Added Custom", "Custom", each
        Table.Range(
            Table.AddColumn(
                Table.AddColumn([Added Index], "Cumm G", (x) => List.Sum(List.FirstN(_[Added Index][G], [Selected row]))),
                "Cumm O", (x) => List.Sum(List.FirstN(_[Added Index][O], [Selected row]))),
            [Selected row], 1)),
    #"Removed Columns" = Table.RemoveColumns(#"Added Custom1", {"Added Index", "Selected row", "Run_Date"}),
    #"Expanded Custom" = Table.ExpandTableColumn(#"Removed Columns", "Custom", {"Type", "Run_Date", "G", "O", "Sort F", "in", "Cumm G", "Cumm O"}, {"Type", "Run_Date", "G", "O", "Sort F", "in", "Cumm G", "Cumm O"})
in
    #"Expanded Custom"
If my answer helped solve your issue, please consider marking it as the accepted solution. It helps others in the community find answers faster—and keeps the community growing stronger!
You can also check out my YouTube channel for tutorials, tips, and real-world solutions in Power Query with the following link
https://youtube.com/@omidbi?si=96Bo-ZsSwOx0Z36h

Omid, thank you for your support. The code works fine, but only for a small set of data.

I have nearly 300K+ rows of data and it is taking a very long time to run.

Is there a way to speed up the running-total step, i.e., by using buffering?
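
For reference, a sketch of one buffering approach (not from this thread): it combines the grouping above with the List.Buffer/List.Generate pattern from the original function. It assumes G is never negative, that every date's running total eventually reaches 100, and it would replace the steps after #"Changed Type" in the query above.

let
    // group once per date, keeping each date's rows as a nested table
    Grouped = Table.Group(#"Changed Type", {"Run_Date"}, {{"Rows", each _}}),
    Picked = Table.AddColumn(Grouped, "Custom", each
        [
            g    = List.Buffer([Rows][G]),
            o    = List.Buffer([Rows][O]),
            // running total of G built in a single pass over the buffered list
            cumG = List.Generate(
                       () => [t = g{0}, i = 0],
                       each [i] < List.Count(g),
                       each [t = [t] + g{[i] + 1}, i = [i] + 1],
                       each [t]),
            // 0-based position of the first running total that reaches 100
            // (valid because the running total never decreases when G >= 0)
            pos  = List.Count(List.Select(cumG, each _ < 100)),
            // the selected row plus its cumulative values
            row  = Record.Combine({
                       Table.ToRecords([Rows]){pos},
                       [#"Cumm G" = cumG{pos}, #"Cumm O" = List.Sum(List.FirstN(o, pos + 1))]})
        ][row]),
    Result = Table.FromRecords(Picked[Custom])
in
    Result

Building each date's cumulative list once from a buffered list avoids re-evaluating List.Sum(List.FirstN(...)) for every candidate row, which is likely the main cost on a 300K-row table.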
