Hi!
I combined a lot of CSVs, and now I have to calculate (in M, because I'll need to pivot a column next) a new column that increases by 1 after every row where the [Stop] value is 1. The Calc1 column below shows the expected result. I tried
= Table.AddColumn(Source, "Event Index", each List.Sum(List.Range(Source[Stop], 0, [Index])))
but since the table is a combination of CSVs it takes too much time and crashes. That function seems to operate like a cross join: it re-evaluates over every row for every file combination. So I need an alternative, ideally a function that checks only the immediately preceding row and adds 1 only when the Stop column was 1.
Note that there isn't a fixed number of rows between one Stop = 1 and the previous one.
Thank you very much
| Stop | Calc1 |
|---|---|
| 0 | 0 |
| 0 | 0 |
| 0 | 0 |
| 0 | 0 |
| 0 | 0 |
| 0 | 0 |
| 0 | 0 |
| 0 | 0 |
| 0 | 0 |
| 0 | 0 |
| 1 | 0 |
| 0 | 1 |
| 0 | 1 |
| 0 | 1 |
| 0 | 1 |
| 0 | 1 |
| 0 | 1 |
| 0 | 1 |
| 0 | 1 |
| 0 | 1 |
| 0 | 1 |
| 0 | 1 |
| 1 | 1 |
| 0 | 2 |
| 0 | 2 |
| 0 | 2 |
| 0 | 2 |
| 0 | 2 |
| 0 | 2 |
| 0 | 2 |
| 0 | 2 |
| 0 | 2 |
| 0 | 2 |
| 0 | 2 |
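To see why the List.Sum(List.Range(...)) attempt is so slow: for each row it re-reads all preceding rows, so the total work grows quadratically with the row count. A minimal Python sketch (not M, purely an illustration of that cost pattern; the function name is hypothetical):

```python
# Hypothetical illustration of the original M approach: out[i] is the
# sum of stops[0..i-1], recomputed from scratch for every row, which
# makes the total work O(n^2) over n rows.
def event_index_quadratic(stops):
    return [sum(stops[:i]) for i in range(len(stops))]

print(event_index_quadratic([0, 0, 1, 0, 1, 0]))  # [0, 0, 0, 1, 1, 2]
```

A single pass that carries a running counter does the same job in linear time, which is what the accepted solution below achieves with List.Generate.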
Hi @AGo
please check out this solution (paste the code into the advanced editor and follow the steps):
let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlCK1SGXNKRAL9XNiQUA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [Stop = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source, {{"Stop", Int64.Type}}),
    // Buffer the column so that positional access by index is fast
    Custom1 = List.Buffer( #"Changed Type"[Stop] ),
    // Single pass over the rows: carry a running Level and increment it
    // after each row whose Stop value is 1
    Calc1 = List.Generate(
        () => [Level = 0, Counter = 0],
        each [Counter] < List.Count(Custom1),
        each [
            Level = if Custom1{[Counter]} = 1 then [Level] + 1 else [Level],
            Counter = [Counter] + 1
        ],
        each [Level]
    ),
    // Append the generated list as a new "Calc1" column
    AutomaticConversionWithFullTable = Table.FromColumns( Table.ToColumns(#"Changed Type") & {Calc1}, Table.ColumnNames(#"Changed Type") & {"Calc1"} )
in
    AutomaticConversionWithFullTable
Imke Feldmann (The BIccountant)
If you liked my solution, please give it a thumbs up. And if I did answer your question, please mark this post as a solution. Thanks!
How to integrate M-code into your solution -- How to get your questions answered quickly -- How to provide sample data -- Check out more PBI learning resources here -- Performance tips for M-queries
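For readers less familiar with M, the List.Generate solution above is a plain running count: emit the current level for each row, then bump the level whenever that row's Stop is 1. A minimal Python sketch of the same logic (illustrative only; the function name is hypothetical):

```python
# Running "event index": out[i] equals the number of Stop = 1 rows seen
# before row i, computed in a single O(n) pass instead of re-summing
# all previous rows for each row.
def event_index(stops):
    out, level = [], 0
    for s in stops:
        out.append(level)  # current level BEFORE processing this row
        if s == 1:
            level += 1     # subsequent rows belong to the next event
    return out

print(event_index([0, 0, 1, 0, 0, 1, 0]))  # [0, 0, 0, 1, 1, 1, 2]
```

Note that the row containing Stop = 1 still keeps the old level, and the increment takes effect from the next row on, which matches the expected Calc1 column in the question.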
Hi @ImkeF ,
At first your solution worked, but it caused the error "Expression.Error: Evaluation resulted in a stack overflow and cannot continue." right after I used the pivot function on another column of the same table (in "Don't Aggregate" mode).
Then I moved your function into the example-file transformation step, before combining the multiple files, and concatenated your result with the file-name column, and that worked.
I don't know why it returned that error; maybe this function is too heavy for massive calculations after the files are combined.
Thanks