Hello everyone!
I have two tables.
In the first, I have a series of dates, event codes, and names that repeat from period to period.
For example:
01/01/2021 - Code A - Name A
01/01/2021 - Code A - Name B
01/01/2021 - Code A - Name C
01/02/2021 - Code B - Name A
01/02/2021 - Code B - Name B
01/02/2021 - Code B - Name C
In the second table, I have new entries with dates, codes, and names. Here, the same name can appear multiple times for the same code, but with different dates.
For example:
01/03/2021 - Code A - Name A
01/04/2021 - Code A - Name A
01/03/2021 - Code A - Name B
01/04/2021 - Code A - Name C
01/04/2021 - Code B - Name A
01/05/2021 - Code B - Name B
01/07/2021 - Code B - Name B
What do I need? 🤔
I need to merge these tables, bringing into the first table the earliest date for each code + name combination from the second table.
For example:
01/01/2021 - Code A - Name A - 01/03/2021
01/01/2021 - Code A - Name B - 01/03/2021
01/01/2021 - Code A - Name C - 01/04/2021
01/02/2021 - Code B - Name A - 01/04/2021
01/02/2021 - Code B - Name B - 01/05/2021
01/02/2021 - Code B - Name C - null
All duplicates that exist in the second table should be ignored, as I only need the first date (the smallest date) for that code + name combination, matched against the first table.
If possible, can I also count how many times each combination appears in the second table?
I tried removing duplicates, but it didn't work out: duplicate code + name pairs remained because their dates differ. 😞
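For what it's worth, a plain remove-duplicates step cannot work here, because the duplicate code + name pairs differ by date; they have to be collapsed with an aggregation instead. A minimal sketch in Power Query M, assuming the second table is loaded as a query named Tabella2 with columns [date], [code], and [name] (matching the repro later in the thread):

let
    // one row per code+name: the earliest date plus how many times the pair occurs
    Collapsed = Table.Group(
        Tabella2,
        {"code", "name"},
        {
            {"first date", each List.Min([date]), type nullable date},
            {"occurrences", each Table.RowCount(_), Int64.Type}
        }
    )
in
    Collapsed

Merging this collapsed table into the first table on code + name then yields exactly one date (and one count) per row.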
another way, without group by, could be this:
let
    // join T1 to T2 on the code+name pair (the original post joined on a combined
    // "code-name" key column; list keys do the same without the helper column),
    // then collapse each nested table of matches to the smallest date it contains
    // (List.Min over an empty match list returns null)
    Origine = Table.TransformColumns(
        Table.NestedJoin(T1, {"code", "name"}, T2, {"code", "name"}, "tt", JoinKind.FullOuter),
        {"tt", (t) => List.Min(t[date]), type nullable date}
    ),
    #"Tabella tt espansa" = Table.RenameColumns(Origine, {{"tt", "tt.first"}})
in
    #"Tabella tt espansa"
where T1 and T2 are:
let
    // compressed sample data: a single text column with "date - code - name" per row
    Origine = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMjDUByIjAyNDBV0F5/yUVAVHIMMvMRfIUIrVwavAiZACZ7gCIxQFTphW4FDgREgB0IpYAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Colonna1 = _t]),
    #"Modificato tipo" = Table.TransformColumnTypes(Origine, {{"Colonna1", type text}}),
    // split "date - code - name" into three columns, type them, and rename
    #"Suddividi colonna in base al delimitatore" = Table.SplitColumn(#"Modificato tipo", "Colonna1", Splitter.SplitTextByDelimiter(" - ", QuoteStyle.Csv), {"Colonna1.1", "Colonna1.2", "Colonna1.3"}),
    #"Modificato tipo1" = Table.TransformColumnTypes(#"Suddividi colonna in base al delimitatore", {{"Colonna1.1", type date}, {"Colonna1.2", type text}, {"Colonna1.3", type text}}),
    #"Rinominate colonne" = Table.RenameColumns(#"Modificato tipo1", {{"Colonna1.1", "date"}, {"Colonna1.2", "code"}, {"Colonna1.3", "name"}}),
    // per-row lookup into Tabella2 (the second table): smallest date for this code+name;
    // note: this row-by-row Table.SelectRows scan is what becomes slow on large data
    #"Aggiunta colonna personalizzata" = Table.AddColumn(#"Rinominate colonne", "date2", each List.Min(Table.SelectRows(Tabella2, (r) => r[code] = [code] and r[name] = [name])[date]))
in
    #"Aggiunta colonna personalizzata"
Apparently it worked, but it got really, really slow. I have a large volume of data, about 400 thousand rows. Is that right?
well ... you didn't mention this little detail. Try it like this:
group tab2 by code-name and take the first (earliest) element of each group;
at this point you can do what the topic title says.
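A sketch of that pipeline in M, assuming the queries are named tab1 and tab2 with the [date], [code], [name] columns from the repro:

let
    // collapse tab2 once: one row per code+name with the earliest date
    // (a single group-by is far cheaper than a per-row Table.SelectRows scan)
    Grouped = Table.Group(tab2, {"code", "name"}, {{"first", each List.Min([date]), type nullable date}}),
    // then do what the topic title says: merge onto the first table
    Joined = Table.NestedJoin(tab1, {"code", "name"}, Grouped, {"code", "name"}, "g", JoinKind.LeftOuter),
    Expanded = Table.ExpandTableColumn(Joined, "g", {"first"})
in
    Expanded

With the second table reduced to one row per key before the merge, the join no longer multiplies rows, which is what matters at 400 thousand rows.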
@LeandroCampacci if you can do the transformation completely in DAX, you will see a huge performance boost:
Column =
CALCULATE(
    MINX(
        FILTER(t2, t2[name] = MAX(t1[name]) && t2[code] = MAX(t1[code])),
        t2[date]
    )
)