Anonymous
Not applicable

How to remove duplicate rows based on condition

Hi all,

 

I have an employee history table and I noticed some duplicated rows. How can I remove a duplicated row based on these conditions:

1. The same Employee Number

2. Chg Rsn = "901"? (901 means newly hired employee; a new hire must be unique)

 

Thank you in advance!

Bei

 

 

babyjb_0-1684958533782.png

 

1 ACCEPTED SOLUTION
ValtteriN
Super User

Hi,

Here is one way to do this:

Example (we will remove one of the rows in yellow):

ValtteriN_0-1684987800423.png

Here is the PQ used:

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMjQyVtJRsjQwVIrVweSZgHlGWHmmWHhAfbEA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Empid = _t, code = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Empid", Int64.Type}, {"code", Int64.Type}}),
    #"Filtered Rows1" = Table.SelectRows(#"Changed Type", each [code] <> 901), //this table contains all non-901 rows
    #"Filtered Rows" = Table.SelectRows(#"Changed Type", each [code] = 901), //this table contains all 901 (new-hire) rows
    #"Removed Duplicates" = Table.Distinct(#"Filtered Rows", {"Empid"}), //this keeps only one 901 row per Empid
    #"Appended Query" = Table.Combine({#"Removed Duplicates", #"Filtered Rows1"}) //here we combine the two to get the desired result
in
    #"Appended Query"

End result:

ValtteriN_1-1684989220792.png

 

As we can see, the undesired row has now been removed.
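For readers outside Power Query, the same filter → dedupe → append pattern can be sketched in plain Python. The sample rows below are made up for illustration; they are not the poster's actual table:

```python
# Hypothetical (Empid, code) rows; two 901 rows for employee 1 simulate the duplicate.
rows = [
    (1, 901),
    (1, 901),  # duplicate new-hire row for employee 1
    (1, 100),
    (2, 901),
    (2, 200),
]

# Step 1: keep every non-901 row untouched (like #"Filtered Rows1").
non_901 = [r for r in rows if r[1] != 901]

# Step 2: keep only the first 901 row per employee (like Table.Distinct on Empid).
seen = set()
unique_901 = []
for emp, code in rows:
    if code == 901 and emp not in seen:
        seen.add(emp)
        unique_901.append((emp, code))

# Step 3: combine the two partitions (like Table.Combine).
result = unique_901 + non_901
print(result)  # [(1, 901), (2, 901), (1, 100), (2, 200)]
```

The key idea is the same as in the M query: partition the table into "rows that can never be duplicates" and "rows that must be unique per employee", deduplicate only the second partition, then append.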

I hope this post helps to solve your issue, and if it does, consider accepting it as a solution and giving the post a thumbs up!

My LinkedIn: https://www.linkedin.com/in/n%C3%A4ttiahov-00001/





Did I answer your question? Mark my post as a solution!

Proud to be a Super User!





3 REPLIES
Anonymous
Not applicable

This solution is genius! How did you come up with it?

Hi,

@Anonymous 
I created a flowchart with the required categories. In the end, the problem is that we are trying to eliminate rows that meet certain conditions. So, conversely, if we include all but the rows we want to eliminate, we get the desired outcome.








