Hi!
I combined a lot of CSVs, and now I need to calculate (in M, because I'll have to pivot a column afterwards) a new column that increases by 1 after every row where the [Stop] value is 1. The Calc1 column below shows the expected result. I tried
= Table.AddColumn(Source, "Event Index", each List.Sum(List.Range(Source[Stop],0,[Index])))
but since the table is a combination of CSVs it takes too long and crashes. The function seems to behave like a cross join, re-evaluating the whole file combination for every single row. So I need an alternative, perhaps a function that checks only the immediately preceding row and adds 1 only if its Stop column contains 1.
Note that there isn't a fixed number of rows between one Stop = 1 and the previous one.
Thank you very much
| Stop | Calc1 |
|---|---|
| 0 | 0 |
| 0 | 0 |
| 0 | 0 |
| 0 | 0 |
| 0 | 0 |
| 0 | 0 |
| 0 | 0 |
| 0 | 0 |
| 0 | 0 |
| 0 | 0 |
| 1 | 0 |
| 0 | 1 |
| 0 | 1 |
| 0 | 1 |
| 0 | 1 |
| 0 | 1 |
| 0 | 1 |
| 0 | 1 |
| 0 | 1 |
| 0 | 1 |
| 0 | 1 |
| 0 | 1 |
| 1 | 1 |
| 0 | 2 |
| 0 | 2 |
| 0 | 2 |
| 0 | 2 |
| 0 | 2 |
| 0 | 2 |
| 0 | 2 |
| 0 | 2 |
| 0 | 2 |
| 0 | 2 |
| 0 | 2 |
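(Editor's aside: the slowdown in the attempted formula comes from `Source[Stop]` being re-evaluated for every row. A minimal sketch of a buffered variant of that same approach — assuming the combined query is named `Source`, already has a 0-based `Index` column, and `Stop` is typed as a whole number — reads the column into memory once:)

```
let
    // Read the Stop column into memory once, so each row's running sum
    // does not re-evaluate the whole csv combination
    BufferedStop = List.Buffer(Source[Stop]),
    // Calc1 = count of 1s in all *previous* rows (exclusive running sum);
    // List.Sum of an empty list is null, hence the explicit 0 for row 0
    AddCalc1 = Table.AddColumn(Source, "Calc1",
        each if [Index] = 0 then 0
             else List.Sum(List.FirstN(BufferedStop, [Index])),
        Int64.Type)
in
    AddCalc1
```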
Hi @AGo
please check out this solution (paste the code into the advanced editor and follow the steps):
let
    // Sample data (the Stop column from the question), pasted as compressed JSON
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlCK1SGXNKRAL9XNiQUA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [Stop = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source, {{"Stop", Int64.Type}}),
    // Buffer the Stop column so it is read into memory once
    Custom1 = List.Buffer( #"Changed Type"[Stop] ),
    // Walk the buffered list in a single pass: Level carries the running
    // count, Counter the row position; Level increases on the row *after*
    // each Stop = 1, matching the expected Calc1 column
    Calc1 = List.Generate( () =>
        [Level = 0, Counter = 0],
        each [Counter] < List.Count(Custom1),
        each [
            Level = if Custom1{[Counter]} = 1 then [Level] + 1 else [Level],
            Counter = [Counter] + 1
        ],
        each [Level] ),
    // Append the generated list to the original table as column "Calc1"
    AutomaticConversionWithFullTable = Table.FromColumns(
        Table.ToColumns(#"Changed Type") & {Calc1},
        Table.ColumnNames(#"Changed Type") & {"Calc1"} )
in
    AutomaticConversionWithFullTable
Imke Feldmann (The BIccountant)
If you liked my solution, please give it a thumbs up. And if I did answer your question, please mark this post as a solution. Thanks!
How to integrate M-code into your solution -- How to get your questions answered quickly -- How to provide sample data -- Check out more PBI learning resources here -- Performance Tips for M-queries
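(Editor's note: to apply the accepted pattern to a real combined-files query instead of the pasted sample, only the `Source` step changes. A sketch, where `CombinedCsvs` is a placeholder for your actual combined query with an integer `Stop` column:)

```
let
    Source = CombinedCsvs,  // your combined-files query (placeholder name)
    Stops  = List.Buffer(Source[Stop]),
    // Same single-pass running count as in the accepted answer
    Calc1  = List.Generate(
        () => [Level = 0, Counter = 0],
        each [Counter] < List.Count(Stops),
        each [
            Level   = if Stops{[Counter]} = 1 then [Level] + 1 else [Level],
            Counter = [Counter] + 1
        ],
        each [Level]
    ),
    Result = Table.FromColumns(
        Table.ToColumns(Source) & {Calc1},
        Table.ColumnNames(Source) & {"Calc1"}
    )
in
    Result
```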
Hi @ImkeF ,
At first your solution worked, but it caused the error "Expression.Error: Evaluation resulted in a stack overflow and cannot continue." right after I used the pivot function on another column of the same table (in "don't aggregate" mode).
I then moved your function into the sample-file transformation step, before combining the multiple files, and concatenated your result with the file-name column; that way it worked.
I don't know why it returned that error; maybe the function is too heavy for a massive calculation after the files are combined.
Thanks
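(Editor's note: one possible mitigation for the stack overflow — an assumption on my part, not confirmed in the thread — is to materialize the table with `Table.Buffer` before pivoting, so the lazily evaluated `List.Generate` result is computed once rather than per pivoted cell. `SomeColumn` and `SomeValue` below are placeholders for the columns actually being pivoted:)

```
    // Hypothetical mitigation: buffer the result of the accepted answer's
    // final step before the pivot, so Calc1 is evaluated a single time
    Buffered = Table.Buffer(AutomaticConversionWithFullTable),
    Pivoted  = Table.Pivot(Buffered,
        List.Distinct(Buffered[SomeColumn]),  // placeholder pivot column
        "SomeColumn", "SomeValue")            // placeholder value column
```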