Anonymous
Not applicable

Standardize (harmonize) field values (e.g., names) to avoid unnecessary row duplications

Hello, I would like to standardize some additional information that I derive from a crossjoin in Power Query. As an example, take NameMerge in the following table:

LookupKey | KeyMerge   | NameMerge                    | Quantity
E2345     | E2345F_100 | Shampoo Delicate Skin 500 ml | 1 000
E2345     | E2345F_102 | Shampoo Delicate Skin 500ml  | 500
E2456     | E2456T     | ...                          | ...

My model is quite complex and I cannot rely on a lookup Product table with standardized unique names. I came up with a workable solution by adding a calculated column with the following formula:

 

CALCULATE (
    FIRSTNONBLANK ( crossjoin[NameMerge], 0 ),
    FILTER ( ALL ( crossjoin ), EARLIER ( crossjoin[LookupKey] ) = crossjoin[LookupKey] )
)

 

It works: I now have a uniform name for every position, but I am asking myself whether this is the recommended way. Maybe I should find a way to perform the standardization in Power Query instead? I don't think creating a measure is possible here, as I would like to have the names in the rows of the pivot table (I tried, and it gives me a cartesian product of names).

Maybe someone can come up with another solution?

 

IMD

 

1 ACCEPTED SOLUTION
AlB
Community Champion

Hi @Anonymous 

It would be better to do it in Power Query, so that the table is already clean when it comes into the model. You could do something simple like this. Place the following M code in a blank query to see the steps of an example:

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WcjUyNjFV0oHQbvGGBgZATnBGYm5Bfr6CS2pOZnJiSapCcHZmnoKpgYFCbg5QGqjIQClWB4tmI3yawXpNcWk1waXVCKteM6heM6BeYySt2B2JrBrJjYSNRgoNY7hiI1Og4lgA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [LookupKey = _t, KeyMerge = _t, NameMerge = _t, Quantity = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"LookupKey", type text}, {"KeyMerge", type text}, {"NameMerge", type text}, {"Quantity", Int64.Type}}),

    #"Grouped Rows" = Table.Group(#"Changed Type", {"LookupKey"}, {{"Res", each Table.Min(_, "NameMerge")}}),
    #"Expanded Res" = Table.ExpandRecordColumn(#"Grouped Rows", "Res", {"KeyMerge", "NameMerge", "Quantity"}, {"KeyMerge", "NameMerge", "Quantity"})
in
    #"Expanded Res"

 

Please mark the question solved when done and consider giving a thumbs up if posts are helpful.

Contact me privately for support with any larger-scale BI needs, tutoring, etc.

Cheers 

 



2 REPLIES
Anonymous
Not applicable

Thanks very much for your prompt response, I accept your solution. I didn't mention that I also need to preserve the KeyMerge information, so I just added some "loopback" merging:

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WcjUyNjFV0oHQbvGGBgZATnBGYm5Bfr6CS2pOZnJiSapCcHZmnoKpgYFCbg5QGqjIQClWB4tmI3yawXpNcWk1waXVCKteM6heM6BeYySt2B2JrBrJjYSNRgoNY7hiI1Og4lgA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [LookupKey = _t, KeyMerge = _t, NameMerge = _t, Quantity = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"LookupKey", type text}, {"KeyMerge", type text}, {"NameMerge", type text}, {"Quantity", Int64.Type}}),

    #"Grouped Rows" = Table.Group(#"Changed Type", {"LookupKey"}, {{"Res", each Table.Min(_, "NameMerge")}}),
    #"Expanded Res" = Table.ExpandRecordColumn(#"Grouped Rows", "Res", {"KeyMerge","NameMerge", "Quantity"}, {"KeyMerge","NameMerge", "Quantity"}),
    #"Merged Queries" = Table.NestedJoin(#"Expanded Res", {"LookupKey"}, #"Changed Type", {"LookupKey"}, "Expanded Res", JoinKind.LeftOuter),
    #"Expanded Expanded Res" = Table.ExpandTableColumn(#"Merged Queries", "Expanded Res", {"Quantity"}, {"Expanded Res.Quantity"}),
    #"Removed Columns" = Table.RemoveColumns(#"Expanded Expanded Res",{"Quantity"})
in
    #"Removed Columns"

AlB, I will for sure keep your contact in mind in case of any difficulties with setting up models.

 

AIB

