
AGo
Post Patron

Conditional efficient rolling sum function

Hi!

I combined a lot of CSVs, and now I have to calculate (in M, because I'll have to pivot a column next) a new column that increases by 1 every time the [Stop] value is 1. The Calc1 column below shows the expected result. I tried = Table.AddColumn(Source, "Event Index", each List.Sum(List.Range(Source[Stop], 0, [Index]))), but since the table is a combination of CSVs it takes too much time and crashes. The function seems to operate like a cross join: it re-scans the column for every single row. So I need an alternative, maybe a function that checks only the immediately previous row and adds 1 only when there is a 1 in the Stop column.

Note that there isn't a fixed number of rows between one Stop = 1 and the previous one.

 

Thank you very much

 

Stop  Calc1
0     0
0     0
0     0
0     0
0     0
0     0
0     0
0     0
0     0
0     0
1     0
0     1
0     1
0     1
0     1
0     1
0     1
0     1
0     1
0     1
0     1
0     1
1     1
0     2
0     2
0     2
0     2
0     2
0     2
0     2
0     2
0     2
0     2
0     2
1 ACCEPTED SOLUTION

Hi @AGo 

please check out this solution (paste the code into the advanced editor and follow the steps):

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlCK1SGXNKRAL9XNiQUA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [Stop = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Stop", Int64.Type}}),
    Custom1 = List.Buffer( #"Changed Type"[Stop] ),
    Calc1 = List.Generate( ()=>
        [Level = 0, Counter = 0],
        each [Counter] < List.Count(Custom1) ,
        each [
            Level = if Custom1{[Counter]} = 1 then [Level] + 1 else [Level],
            Counter = [Counter] + 1
        ],
        each [Level] ),
    AutomaticConversionWithFullTable = Table.FromColumns( Table.ToColumns(#"Changed Type") & {#"Calc1"}, Table.ColumnNames(#"Changed Type") & {"Calc1"} )
in
    AutomaticConversionWithFullTable
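
To apply the same pattern to a real query rather than the inline demo table, only the buffered list and the List.Generate step are needed. A minimal sketch, assuming the preceding step is named #"Changed Type" and [Stop] is already typed as a whole number (the step names here are illustrative, not part of the original solution):

    // Buffer the Stop column once so the generator reads from memory
    // instead of re-evaluating the combined-CSV source on every row.
    BufferedStop = List.Buffer( #"Changed Type"[Stop] ),

    // Single pass over the list: the running Level increases by 1
    // each time the current Stop value is 1.
    EventIndex = List.Generate(
        () => [Level = 0, Counter = 0],
        each [Counter] < List.Count(BufferedStop),
        each [
            Level   = if BufferedStop{[Counter]} = 1 then [Level] + 1 else [Level],
            Counter = [Counter] + 1
        ],
        each [Level] ),

    // Re-attach the generated list to the table as a new column.
    Result = Table.FromColumns(
        Table.ToColumns(#"Changed Type") & {EventIndex},
        Table.ColumnNames(#"Changed Type") & {"Calc1"} )

This is a single linear pass, whereas the original List.Range approach re-sums the column for every row, which is quadratic in the row count.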

 

Imke Feldmann (The BIccountant)

If you liked my solution, please give it a thumbs up. And if I did answer your question, please mark this post as a solution. Thanks!

How to integrate M-code into your solution -- How to get your questions answered quickly -- How to provide sample data -- Check out more PBI learning resources here -- Performance Tips for M-queries


3 REPLIES
amitchandak
Super User

@ImkeF, please check if you can help.


Hi @ImkeF ,

at first your solution worked, but it caused the error "Expression.Error: Evaluation resulted in a stack overflow and cannot continue." right after I used the pivot function on another column of the same table (in "don't aggregate" mode).

Then I used your function in the sample file's transformation step, before combining the multiple files, and concatenated your result with the file-name column, and that worked.

I don't know why it returned that error; maybe this function is too heavy for a massive calculation after the files are combined.
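
For reference, the per-file workaround described above could be sketched as a reusable query function invoked in the sample file's transformation step, so each CSV is indexed on its own before the combine step. This is a hedged sketch of that idea; the function name fxAddEventIndex and the column names are illustrative assumptions:

    // fxAddEventIndex: adds a running event index to one file's table.
    // The counter restarts at 0 for every file, which is why the result
    // is later concatenated with the file-name column to keep events
    // distinct across files.
    (t as table) as table =>
    let
        Stops = List.Buffer( t[Stop] ),
        Index = List.Generate(
            () => [Level = 0, Counter = 0],
            each [Counter] < List.Count(Stops),
            each [
                Level   = if Stops{[Counter]} = 1 then [Level] + 1 else [Level],
                Counter = [Counter] + 1
            ],
            each [Level] )
    in
        Table.FromColumns(
            Table.ToColumns(t) & {Index},
            Table.ColumnNames(t) & {"Calc1"} )

Running the pass per file keeps each buffered list small, which may be why it avoided the stack overflow seen on the combined table.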

 

Thanks
