Anonymous
Not applicable

Remove initial zero values in a column


I have a table whose values start at zero before reaching a non-zero value, and I want to remove only those initial zero rows.

I have tried several approaches with List.PositionOf, but it keeps returning -1.

 

1 ACCEPTED SOLUTION
ppm1
Solution Sage

Here's one way to do it in the query editor with GroupKind.Local. To see how it works, create a blank query, open the Advanced Editor, and replace the text there with the M code below. The Table.Skip step removes the first local group, which is safe here because the data is known to start with zeros.

 

let
    // Sample data (compressed); replace with your own source
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlDSUUpUitXBzjKEs4yArCQwyxjOMsDBSlaKjQUA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Column1 = _t, Column2 = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Column1", Int64.Type}}),
    // Group consecutive rows that share the same Column1 value
    #"Grouped Rows" = Table.Group(#"Changed Type", {"Column1"}, {{"AllRows", each _, type table}}, GroupKind.Local),
    // Drop the first local group, i.e. the leading run of zeros
    #"Removed Top Rows" = Table.Skip(#"Grouped Rows",1),
    #"Expanded AllRows" = Table.ExpandTableColumn(#"Removed Top Rows", "AllRows", {"Column2"}, {"Column2"}),
    #"Changed Type1" = Table.TransformColumnTypes(#"Expanded AllRows",{{"Column2", type text}})
in
    #"Changed Type1"
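For intuition on GroupKind.Local: unlike the default global grouping, it only groups consecutive rows with equal key values, so the leading run of zeros becomes exactly one group. A minimal sketch on a hard-coded column (the sample values here are hypothetical, not from the original post):

let
    Sample = Table.FromColumns({{0, 0, 0, 5, 0, 7}}, {"Column1"}),
    // Local grouping produces one group per run: {0,0,0}, {5}, {0}, {7}
    Grouped = Table.Group(Sample, {"Column1"}, {{"AllRows", each _, type table}}, GroupKind.Local),
    // Skipping the first group drops only the leading zeros
    Result = Table.Skip(Grouped, 1)
in
    Result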

 

 

Or here is how to do it with List.PositionOf:

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlDSUUpUitUhxDICspLALGM4ywAHK1kpNhYA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Column1 = _t, Column2 = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Column1", Int64.Type}}),
    // Find the index of the first non-zero value. The second argument (1) is a
    // placeholder; the comparer ignores it and just tests each element against 0.
    Custom1 = List.PositionOf(#"Changed Type"[Column1], 1, Occurrence.First, (x, y) => x <> 0),
    Custom2 = Table.Skip(#"Changed Type", Custom1)
in
    Custom2
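One caveat: List.PositionOf returns -1 when no element satisfies the comparer (for example, an all-zero or empty column), and Table.Skip will not behave well with a negative count. A defensive variant of the last two steps, assuming the same step names as above:

    Custom1 = List.PositionOf(#"Changed Type"[Column1], 1, Occurrence.First, (x, y) => x <> 0),
    // List.PositionOf returns -1 when the column is empty or all zeros;
    // guard so Table.Skip never receives a negative count.
    Custom2 = if Custom1 < 0 then #"Changed Type" else Table.Skip(#"Changed Type", Custom1)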

 

 

If you try both and they both work, please report back which is more performant.

 

Pat

Microsoft Employee


4 REPLIES
Anonymous
Not applicable

Thank you very much, @ppm1. I used the second option for my requirements and it works very well.

Glad to hear it. Thanks for letting me know. Please mark it as the solution, @Anonymous.

 

Pat

Microsoft Employee
Anonymous
Not applicable

Hi ppm1,

Regarding the second solution you proposed: if the input file is empty, it throws an error.

My source is a folder containing multiple CSV files, and the query uses your script inside a transform query. Can you tell me how to enhance this logic to skip a file when it is empty or has no rows?
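One possible guard, sketched under the assumption that every file yields the same columns; the fileContents parameter name and the Csv.Document call are illustrative stand-ins for whatever the folder query actually passes to the transform:

(fileContents as binary) as table =>
let
    Source = Csv.Document(fileContents),
    #"Changed Type" = Table.TransformColumnTypes(Source, {{"Column1", Int64.Type}}),
    FirstNonZero = List.PositionOf(#"Changed Type"[Column1], 1, Occurrence.First, (x, y) => x <> 0),
    // An empty file, or a file with no non-zero values, makes List.PositionOf
    // return -1, so return the table unchanged in those cases instead of
    // passing a negative count to Table.Skip.
    Result = if Table.IsEmpty(#"Changed Type") or FirstNonZero < 0
             then #"Changed Type"
             else Table.Skip(#"Changed Type", FirstNonZero)
in
    Result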
