Hello,
I have duplicates in my data that are similar but not exact dupes (i.e., John_Smith and John_Smi). How can I consolidate these two into one name, while keeping the rows that are being changed as their own distinct rows, just under the new name?
Hi @kirabray ,
I created a sample to test this. I suggest you try a fuzzy merge and then do some transformations in Power Query Editor.
My Sample:
M Query:
let
    // Sample data: a single "Name" column with similar but not identical values
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45W8srPyIsPzs0syVCK1UFwwZygzOT8+KiM/FIUqRJkOSS2UmwsAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Name = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Name", type text}}),
    // Fuzzy self-join: match every Name against all similar Names in the same table
    #"Merged Queries" = Table.FuzzyNestedJoin(#"Changed Type", {"Name"}, #"Changed Type", {"Name"}, "Changed Type", JoinKind.LeftOuter, [IgnoreCase=true, IgnoreSpace=true, Threshold=0.1]),
    #"Expanded Changed Type" = Table.ExpandTableColumn(#"Merged Queries", "Changed Type", {"Name"}, {"Changed Type.Name"}),
    // Length of each fuzzy-matched name
    #"Added Custom" = Table.AddColumn(#"Expanded Changed Type", "Custom", each Text.Length([Changed Type.Name])),
    // Longest matched-name length for the current Name
    #"Added Custom1" = Table.AddColumn(#"Added Custom", "Custom.1", each List.Max(
        let _Name = [Name] in
        Table.SelectRows(#"Added Custom", each _Name = [Name])[Custom])),
    #"Removed Duplicates" = Table.Distinct(#"Added Custom1", {"Custom", "Custom.1", "Changed Type.Name"}),
    // Keep only the rows whose match is the longest variant, then drop the helper columns
    #"Filtered Rows" = Table.SelectRows(#"Removed Duplicates", each ([Custom] = [Custom.1])),
    #"Removed Columns" = Table.RemoveColumns(#"Filtered Rows",{"Changed Type.Name", "Custom", "Custom.1"})
in
    #"Removed Columns"
Result: the similar names are consolidated, with one row kept per group under the longest variant of the name.
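If you only need one representative value per group of similar names, a simpler alternative is Table.FuzzyGroup. The sketch below is a minimal, hedged example (the column name Name and the sample values are assumptions, not your data); note that FuzzyGroup keeps one representative per group, which is not necessarily the longest variant, so results can differ from the query above.
let
    // Hypothetical sample: adjust to your own table and column names
    Source = Table.FromRecords({
        [Name = "John_Smith"],
        [Name = "John_Smi"],
        [Name = "Jane_Doe"]
    }),
    // Group fuzzily similar Names and count how many raw rows fold into each group
    #"Grouped Names" = Table.FuzzyGroup(
        Source,
        {"Name"},
        {{"RowCount", each Table.RowCount(_), Int64.Type}},
        [IgnoreCase = true, IgnoreSpace = true, Threshold = 0.5]
    )
in
    #"Grouped Names"
You can raise or lower Threshold to make the grouping stricter or looser.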
Best Regards,
Rico Zhou
If this post helps, please consider accepting it as the solution to help other members find it more quickly.
Thank you for this solution! I will try it out.
Hi @kirabray ,
Could you tell me whether your problem has been solved? If it has, kindly accept the reply above as the solution so that more people can benefit from it. If you are still unsure, please provide more details about your table and your issue, or share your pbix file without sensitive data.
Best Regards,
Rico Zhou