EaglesTony
Post Prodigy

How can I group like rows?

I have a table with the following:

 

NAME       KEY
Added      DATA-123
Removed    DATA-123
Committed  DATA-456
Added      DATA-789

 

What I need is: if a KEY has a "Removed" row, filter out all rows for that KEY, so the end result I want is:

 

NAME       KEY
Committed  DATA-456
Added      DATA-789

 

I was thinking I could create a new table and merge it in somehow with left joins, but is there a way to do this within the same table using some sort of group by function?
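
For illustration, here is a minimal sketch of a group-by approach in Power Query M. The inline sample table and step names are stand-ins for the real source, assuming only the NAME and KEY columns shown above:

let
    // inline sample standing in for the real source table (assumption)
    Source = Table.FromRecords({
        [NAME = "Added",     KEY = "DATA-123"],
        [NAME = "Removed",   KEY = "DATA-123"],
        [NAME = "Committed", KEY = "DATA-456"],
        [NAME = "Added",     KEY = "DATA-789"]
    }),
    // group all rows per KEY and flag any group that contains a "Removed" row
    Grouped = Table.Group(Source, {"KEY"}, {
        {"Rows", each _, type table},
        {"HasRemoved", each List.Contains([NAME], "Removed"), type logical}
    }),
    // keep only the groups without a "Removed" row and re-expand them
    Kept = Table.SelectRows(Grouped, each not [HasRemoved]),
    Result = Table.Combine(Kept[Rows])
in
    Result

With the sample rows this returns Committed/DATA-456 and Added/DATA-789, which matches the desired output.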

 

 

Thanks

 

26 REPLIES
dufoq3
Super User

Hi @EaglesTony, you have many solutions here...
Another one:

 

Output:

[screenshot: filtered output table]

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WckxJSU1R0lFycQxx1DU0MlaK1YlWCkrNzS/DFHbOz83NLClBSJiYmoElUAwxt7BUio0FAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [NAME = _t, KEY = _t]),
    FilteredName = Table.RemoveColumns(
        Table.Join(
            Source, "KEY",
            Table.RenameColumns(Table.SelectRows(Source, each [NAME] = "Removed")[[KEY]], {{"KEY", "KEY2"}}), "KEY2",
            JoinKind.LeftAnti),
        {"KEY2"})
in
    FilteredName

Note: Check this link to learn how to use my query.
Check this link if you don't know how to provide sample data.

sevenhills
Super User

 

Assuming the source:

[screenshot: source table]



Power Query:

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WckxJSU1R0lFycQxx1DU0MlaK1YlWCkrNzS/DFHbOz83NLClBSJiYmoElUAwxt7BUio0FAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Name = _t, Key = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Name", type text}, {"Key", type text}}),
    #"Trimmed Text" = Table.TransformColumns(#"Changed Type",{{"Name", Text.Trim, type text}, {"Key", Text.Trim, type text}}),
    #"Filtered Rows" = Table.SelectRows(#"Trimmed Text", each ([Name] = "Removed")),
    #"Merged Queries" = Table.NestedJoin(#"Trimmed Text", {"Key"}, #"Filtered Rows", {"Key"}, "Filtered Rows", JoinKind.LeftAnti),
    #"Removed Columns" = Table.RemoveColumns(#"Merged Queries",{"Filtered Rows"})
in
    #"Removed Columns"

 

 

or

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WckxJSU1R0lFycQxx1DU0MlaK1YlWCkrNzS/DFHbOz83NLClBSJiYmoElUAwxt7BUio0FAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Name = _t, Key = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Name", type text}, {"Key", type text}}),
    #"Trimmed Text" = Table.TransformColumns(#"Changed Type",{{"Name", Text.Trim, type text}, {"Key", Text.Trim, type text}}),
    #"Added Custom" = Table.AddColumn(#"Trimmed Text", "To Be Removed", each Table.MatchesAnyRows(Source, (R) => R[Key] = [Key] and Text.Lower(R[Name]) = "removed")),
    #"Filtered Rows" = Table.SelectRows(#"Added Custom", each ([To Be Removed] = false)),
    #"Removed Columns" = Table.RemoveColumns(#"Filtered Rows",{"To Be Removed"})
in
    #"Removed Columns"

 

Output:

[screenshot: output table]

 

ronrsnfld
Super User

And another approach, which seems to execute quite rapidly:

 

Original

[screenshot: original table]

 

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WckxJSU1R0lFycQxx1DU0MlaK1YlWCkrNzS/DFHbOz83NLClBSJiYmoElUAwxt7DEFDQ2MsRmMlg4FgA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [NAME = _t, KEY = _t]),
    
    filter = Table.SelectRows(Source, (r)=>
        let 
            allKeys = Table.SelectRows(Source, each [KEY] = r[KEY])
        in 
            not List.Contains(allKeys[NAME],"Removed",Comparer.OrdinalIgnoreCase))
in
    filter

 

 

Result

[screenshot: result table]

 

 

Not sure how to code this, as my last line is (I was trying to set some sort of flag and then filter out):

 

#"Added Custom4" = Table.AddColumn(#"Filtered Rows1", "FeatureHasRemovalOnIt", each if Text.Contains([NAME], "-Removed", Comparer.OrdinalIgnoreCase) then "Y" else "N")

With the code I provided there is no need for that step. The code will remove (filter out) the proper rows all in that single step. Perhaps if you pasted the original code into the Advanced Editor (replacing all that is there), you would understand better.

 

Also, eliminating the "create a new column" step increases the efficiency/speed of the process.

 

Perhaps:

  1. Duplicate your existing query
  2. Delete all the lines below where your full table appears.
  3. Then follow the instructions regarding inserting the line I mentioned.

Can you explain how to do this manually (i.e. selecting the Name column and then using Group By) instead of through the Advanced Editor view?

The code does not use GroupBy (and executes considerably faster). If you want to do it from the UI:

 1. When your table is showing, select the down arrow in the NAME column and de-select "Removed".

  

[screenshot: NAME column filter dropdown with "Removed" unchecked]

 2. In the formula bar you will see something similar to the screenshot below. "Source" will be the same as your previous step.

       

[screenshot: formula bar showing the generated filter step]

 

 3. Replace what you see after the first comma with:

 

(r)=>
        let 
            allKeys = Table.SelectRows(Source, each [KEY] = r[KEY])
        in 
            not List.Contains(allKeys[NAME],"Removed",Comparer.OrdinalIgnoreCase))

 

resulting in:

[screenshot: formula bar with the edited filter step]
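
For reference, after that edit the whole step would read roughly as follows. This is a sketch: the step name #"Filtered Rows" and the previous step name Source are the defaults Power Query generates and may differ in your query.

#"Filtered Rows" = Table.SelectRows(Source, (r)=>
    let
        // all rows that share this row's KEY
        allKeys = Table.SelectRows(Source, each [KEY] = r[KEY])
    in
        // keep the row only if none of those rows is "Removed"
        not List.Contains(allKeys[NAME], "Removed", Comparer.OrdinalIgnoreCase))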


When I filtered out "Removed", I have this line:

 

#"Filtered Rows2" = Table.SelectRows(#"Added Custom4", each ([NAME] = "Added"))

 

However, I still see "DATA-123" as Added in the table. The point is that if DATA-123 (or any other key) has both "Added" and "Removed", both rows should be removed.


@EaglesTony wrote:

When I filtered out "Removed", I have this line:

 

#"Filtered Rows2" = Table.SelectRows(#"Added Custom4", each ([NAME] = "Added"))

 

However, I still see "DATA-123" as Added in the table. The point is that if DATA-123 (or any other key) has both "Added" and "Removed", both rows should be removed.


That is because you did not follow my instructions precisely.

It seems you have filtered something before you even got to entering my line.

 

It's really hard to tell where you went wrong without knowing what you have done.

 

I suggest:

  1. Make a copy of your query as it exists.
  2. Navigate to the step where you see the table as you show in your question.
  3. Delete all of the steps below that particular step.
  4. Then add the Filtered rows step as I showed you and edit in the formula bar as I described.

Here is what I got:

 

let
    Source = OData.Feed(source from another external system),
    Custom1 = Source{[Name="Labels",Signature="table"]}[Data],
    #"Filtered Rows" = Table.SelectRows(Custom1, each [NAME] = "Added" or [NAME] = "Committed"),
    #"Filtered Rows1" = Table.SelectRows(#"Filtered Rows", each [ISSUE_KEY] = "DATA-123")
in
    #"Filtered Rows1"

 

I added that last Filtered Rows step to see if DATA-123 was being excluded, but it isn't.

Assuming the Custom1 step shows a table similar to the one in your original question, had you done what I suggested, you should have something like:

 

let
    Source = OData.Feed(your data source),
    Custom1 = Source{[Name="Labels",Signature="table"]}[Data],
    #"Filtered Rows" = Table.SelectRows(Custom1, (r)=>
        let 
            allKeys = Table.SelectRows(Custom1, each [KEY] = r[KEY])
        in 
            not List.Contains(allKeys[NAME],"Removed",Comparer.OrdinalIgnoreCase))
in
    #"Filtered Rows"

 

which would show your stated desired result.

I have this, but it seems to take quite a while to refresh:

 

= Table.SelectRows(Custom1, (r)=>
    let
        allKeys = Table.SelectRows(Custom1, each [KEY] = r[KEY])
    in
        not List.Contains(allKeys[NAME],"Removed",Comparer.OrdinalIgnoreCase))

Seems similar to what you've posted before, and what I've responded to.

If the source of the slowdown is not clarified by my previous response, I don't think I'll be able to assist further without access to a sample database that illustrates the problem.

 

It could be your data source, the code in your own system, the size of the database, any number of things. But there is no way to troubleshoot without more information.

That seems almost identical to what you wrote previously. I had given you a list of possibilities for slowness. I am afraid that without access to a data set example that can reproduce the problem, I don't know what to add about the slowness.

It is taking quite a while to get records back.

 

Also, Name has values other than Committed, Removed, and Added. Some are just junk values.

 

 

I don't know what you mean, nor how it relates to your question. You don't provide sufficient information for me to comment.

 

"It is taking quite awhile to get records back"

  • From what step?
  • How long?
  • What are the errors, if any?
  • If your data source is slow, that would certainly be a cause.

 "Also, Name has other than Committed, Removed, Added. Some are just junk values".

  • How does that relate to your original question?
  • The code I provided does what you stated in your original question, which was to remove a KEY entirely if any row with that KEY has NAME = "Removed".

What I did was make a duplicate of the table and filter it down to the records I don't want.

 

Then I merged the first table (all records) with the second table (all "bad" records, those with Removed) using a left join.

 

Then, from that merge step, I filtered to the keys that are null (thus not in the second table), and it now has the records I want. Not the most efficient, but it works.
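
For illustration, a minimal sketch of that duplicate-and-merge approach in M; the step name Custom1 and the inline sample rows are assumptions standing in for the real table:

let
    // inline sample standing in for the Custom1 step (assumption)
    Custom1 = Table.FromRecords({
        [NAME = "Added",     KEY = "DATA-123"],
        [NAME = "Removed",   KEY = "DATA-123"],
        [NAME = "Committed", KEY = "DATA-456"],
        [NAME = "Added",     KEY = "DATA-789"]
    }),
    // the "bad" records: every row whose KEY should be excluded
    BadKeys = Table.SelectRows(Custom1, each [NAME] = "Removed"),
    // left outer join the full table against the bad keys
    Merged = Table.NestedJoin(Custom1, {"KEY"}, BadKeys, {"KEY"}, "Bad", JoinKind.LeftOuter),
    Expanded = Table.ExpandTableColumn(Merged, "Bad", {"KEY"}, {"Bad.KEY"}),
    // null means the row found no match in the bad list, so keep it
    Kept = Table.SelectRows(Expanded, each [Bad.KEY] = null),
    Result = Table.RemoveColumns(Kept, {"Bad.KEY"})
in
    Result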

From the process you describe, it sounds as if it would be much more efficient to just filter out (remove) the records you don't want, which is what my code would do. If there are other records you don't want, they can be added to that filter step.

 

That is not simple to do in the UI. You would need to go into the Advanced Editor (or the formula bar), and edit the M-code directly.

 

Just as an example of the outcomes of the different approaches provided in this thread: there was a 50-fold difference in execution time between the most efficient (mine) and the least efficient, using the data set you provided.

OK, I will put it in again, but it is taking time to refresh this table, maybe because of the number of rows in it.

 

= Table.SelectRows(Custom1, (r)=>
    let
        allKeys = Table.SelectRows(Custom1, each [KEY] = r[KEY])
    in
        not List.Contains(allKeys[NAME],"Removed",Comparer.OrdinalIgnoreCase))

 

Is there a way to also remove items that have other junk values besides "Removed" (i.e. some have "Test", "Demo", etc.), or maybe I can filter these out ahead of time, before the above step?

 

It depends on exactly what you have and what you want to do.

 

If you are going to remove just the singular entries, then you could change the last line in that Filter function to something like:

 

 

not List.Contains(allKeys[NAME],"Removed",Comparer.OrdinalIgnoreCase)
            and r[NAME] <> "Demo" and r[NAME] <> "Test"

 

 

 

If you want to remove all the KEYS if any NAME is junk, or if your logic works better by selecting which you want to retain, the line would be different.

 

And there is other logic that can be used depending on the specifics of your problem.
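
For instance, here is a sketch of the variant that drops an entire KEY if any of its NAMEs is junk. The junk values "Test" and "Demo", the step name Custom1, and the inline sample rows are assumptions:

let
    // inline sample standing in for your table step (assumption)
    Custom1 = Table.FromRecords({
        [NAME = "Added",     KEY = "DATA-123"],
        [NAME = "Removed",   KEY = "DATA-123"],
        [NAME = "Committed", KEY = "DATA-456"],
        [NAME = "Test",      KEY = "DATA-456"],
        [NAME = "Added",     KEY = "DATA-789"]
    }),
    // NAME values that disqualify an entire KEY (assumption: adjust to your data)
    JunkNames = {"Removed", "Test", "Demo"},
    Filtered = Table.SelectRows(Custom1, (r)=>
        let
            allKeys = Table.SelectRows(Custom1, each [KEY] = r[KEY])
        in
            // keep the row only if none of its KEY's rows has a junk NAME
            not List.ContainsAny(allKeys[NAME], JunkNames, Comparer.OrdinalIgnoreCase))
in
    Filtered

With the sample rows above this keeps only Added/DATA-789, because DATA-123 is disqualified by its "Removed" row and DATA-456 by its "Test" row.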
