Hi All
Is it possible to create a measure in PBI that removes duplicate values per group and keeps the first occurrence of each value? Perhaps not a measure but a new table; I'm unsure how to work around this. For context, the database I'm working with handles some data in a funky way that serves a purpose on the back-end but doesn't make sense when trying to visualise user behaviour.
Currently, the data looks like:
USER | ENTRY ID | ENTRY | STATUS | TIME SUBMITTED (HH:mm:ss) |
---|---|---|---|---|
userA | 1 | unicorn | fail | 10:23:10 |
userA | 2 | forest | pass | 10:30:49 |
userA | 1 | unicorn | fail | 10:30:49 |
userB | 1 | unicorn | fail | 13:40:22 |
userB | 1 | fairy | pass | 13:43:59 |
I want to clean it so it looks like:
USER | ENTRY ID | ENTRY | STATUS | TIME SUBMITTED (HH:mm:ss) |
---|---|---|---|---|
userA | 1 | unicorn | fail | 10:23:10 |
userA | 2 | forest | pass | 10:30:49 |
userB | 1 | unicorn | fail | 13:40:22 |
userB | 1 | fairy | pass | 13:43:59 |
... | ... | ... | ... | ... |
Note the row I want to remove (userA | 1 | unicorn | fail | 10:30:49) is a repeat of the same user's earlier submission, just stamped with a later TIME SUBMITTED.
Also, the ENTRY ID on its own cannot be used to identify the duplicates.
Any pointers would be greatly appreciated 🙂
Here's the M code that does what you want:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WKi1OLXJU0lEyBOLSvMzk/KI8ICstMTMHJGhgZWRsZWigFKuDUGkEks8vSi0uATIKEouLIQqNDaxMLFEU4jISVaUTTpXGViZA+40wVALliyqR7AaqM7YyBZoYCwA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [User = _t, EntryID = _t, Entry = _t, Status = _t, Time = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"User", type text}, {"EntryID", Int64.Type}, {"Entry", type text}, {"Status", type text}, {"Time", type time}}),
#"Added Custom" = Table.AddColumn(#"Changed Type", "FirstTimeEntry",
each List.Min(
Table.SelectRows(
#"Changed Type",
(r) => r[User] = [User] and r[Entry] = [Entry]
)[Time]
) = [Time]),
#"Filtered Rows" = Table.SelectRows(#"Added Custom", each ([FirstTimeEntry] = true)),
#"Removed Columns" = Table.RemoveColumns(#"Filtered Rows",{"FirstTimeEntry"})
in
#"Removed Columns"
Best
D
Thanks @Anonymous, had to have a little play around and add in r[EntryID] = [EntryID], but it's working well now 👍 cheers again
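In case it helps anyone else, that change just means the inner condition in the custom-column step ends up roughly as:

(r) => r[User] = [User] and r[EntryID] = [EntryID] and r[Entry] = [Entry]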
Hi @Anonymous ,
Check if this existing thread helps:
Thanks,
Pragati
Thanks @Pragati11 , the buffer got me halfway there! Now it's just removing the wrong duplicate, hopefully the other reply will resolve this 🙂
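A common way to apply a buffer in this kind of query is to wrap the table that the row-by-row lookup keeps re-reading in Table.Buffer, so it is only pulled into memory once. Sketched (untested) against the step names in the accepted answer, with the rest of the query unchanged:

Buffered = Table.Buffer(#"Changed Type"),
// Same logic as before, but scanning the buffered in-memory copy
#"Added Custom" = Table.AddColumn(Buffered, "FirstTimeEntry",
    each List.Min(
        Table.SelectRows(
            Buffered,
            (r) => r[User] = [User] and r[Entry] = [Entry]
        )[Time]
    ) = [Time]),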