
Anonymous
Not applicable

Standardize (harmonize) field values (e.g. names) to avoid unnecessary row duplication

Hello, I would like to standardize some additional information that I bring in from a crossjoin in Power Query. As an example, take NameMerge in the following table:

LookupKey | KeyMerge   | NameMerge                    | Quantity
E2345     | E2345F_100 | Shampoo Delicate Skin 500 ml | 1 000
E2345     | E2345F_102 | Shampoo Delicate Skin 500ml  | 500
E2456     | E2456T     | ...                          | ...

My model is quite complex and I cannot benefit from a lookup Product table with standardized unique names. I came up with a solution by adding a calculated column with the following formula:

 

CALCULATE ( FIRSTNONBLANK ( crossjoin[NameMerge], 0 ),
    FILTER ( ALL ( crossjoin ), EARLIER ( crossjoin[LookupKey] ) = crossjoin[LookupKey] ) )

 

It's working: I now have a uniform name for every position, but I am asking myself whether it's the recommended way. Maybe I should find a way to perform the standardization in Power Query instead? I don't think that creating a measure is possible here, as I would like to have the names in the rows of the pivot table (I tried and it gives me a cartesian product of names).
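
For reference, the same calculated column can also be written with a variable instead of EARLIER (just a sketch; "Standard Name" is only a placeholder column name, the table and column references are those from the table above, and it should return the same standardized name):

Standard Name =
-- "Standard Name" is a placeholder name for the calculated column
VAR CurrentKey = crossjoin[LookupKey]
RETURN
    CALCULATE (
        FIRSTNONBLANK ( crossjoin[NameMerge], 0 ),
        FILTER ( ALL ( crossjoin ), crossjoin[LookupKey] = CurrentKey )
    )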

Maybe someone can come up with another solution?

 

IMD

 

1 ACCEPTED SOLUTION
AlB
Community Champion

Hi @Anonymous 

It would be better to do it in Power Query (PQ) and have that table clean already when it comes into the model. You could do something simple like this. Place the following M code in a blank query to see the steps of an example:

let
    // Sample data for the example, embedded as compressed JSON
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WcjUyNjFV0oHQbvGGBgZATnBGYm5Bfr6CS2pOZnJiSapCcHZmnoKpgYFCbg5QGqjIQClWB4tmI3yawXpNcWk1waXVCKteM6heM6BeYySt2B2JrBrJjYSNRgoNY7hiI1Og4lgA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [LookupKey = _t, KeyMerge = _t, NameMerge = _t, Quantity = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"LookupKey", type text}, {"KeyMerge", type text}, {"NameMerge", type text}, {"Quantity", Int64.Type}}),
    // For each LookupKey, Table.Min returns the full record of the row with the smallest NameMerge
    #"Grouped Rows" = Table.Group(#"Changed Type", {"LookupKey"}, {{"Res", each Table.Min(_, "NameMerge")}}),
    #"Expanded Res" = Table.ExpandRecordColumn(#"Grouped Rows", "Res", {"KeyMerge", "NameMerge", "Quantity"}, {"KeyMerge", "NameMerge", "Quantity"})
in
    #"Expanded Res"

 

Please mark the question solved when done and consider giving a thumbs up if posts are helpful.

Contact me privately for support with any larger-scale BI needs, tutoring, etc.

Cheers 

 



2 REPLIES 2
Anonymous
Not applicable

Thanks very much for your prompt response, I accept your solution. I didn't specify that I also need to preserve the KeyMerge information, so I just added some "loopback" merging:

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WcjUyNjFV0oHQbvGGBgZATnBGYm5Bfr6CS2pOZnJiSapCcHZmnoKpgYFCbg5QGqjIQClWB4tmI3yawXpNcWk1waXVCKteM6heM6BeYySt2B2JrBrJjYSNRgoNY7hiI1Og4lgA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [LookupKey = _t, KeyMerge = _t, NameMerge = _t, Quantity = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"LookupKey", type text}, {"KeyMerge", type text}, {"NameMerge", type text}, {"Quantity", Int64.Type}}),

    #"Grouped Rows" = Table.Group(#"Changed Type", {"LookupKey"}, {{"Res", each Table.Min(_, "NameMerge")}}),
    #"Expanded Res" = Table.ExpandRecordColumn(#"Grouped Rows", "Res", {"KeyMerge","NameMerge", "Quantity"}, {"KeyMerge","NameMerge", "Quantity"}),
    #"Merged Queries" = Table.NestedJoin(#"Expanded Res", {"LookupKey"}, #"Changed Type", {"LookupKey"}, "Expanded Res", JoinKind.LeftOuter),
    #"Expanded Expanded Res" = Table.ExpandTableColumn(#"Merged Queries", "Expanded Res", {"Quantity"}, {"Expanded Res.Quantity"}),
    #"Removed Columns" = Table.RemoveColumns(#"Expanded Expanded Res",{"Quantity"})
in
    #"Removed Columns"

AlB, I will for sure keep your contact in mind in case of any difficulties with setting up models.

 


