Hi all,
Is it possible to create a measure in Power BI that removes duplicate values per group and keeps the first occurrence of the value? Perhaps not a measure but maybe a new table; I'm unsure how to work around this. For context, the database I'm working with treats some data in a funky way that serves a function in the back-end but doesn't make sense when trying to visualise user behaviour.
Currently, the data looks like:
| USER | ENTRY ID | ENTRY | STATUS | TIME SUBMITTED (HH:mm:ss) |
|---|---|---|---|---|
| userA | 1 | unicorn | fail | 10:23:10 |
| userA | 2 | forest | pass | 10:30:49 |
| userA | 1 | unicorn | fail | 10:30:49 |
| userB | 1 | unicorn | fail | 13:40:22 |
| userB | 1 | fairy | pass | 13:43:59 |
I want to clean it so it looks like:
| USER | ENTRY ID | ENTRY | STATUS | TIME SUBMITTED (HH:mm:ss) |
|---|---|---|---|---|
| userA | 1 | unicorn | fail | 10:23:10 |
| userA | 2 | forest | pass | 10:30:49 |
| userB | 1 | unicorn | fail | 13:40:22 |
| userB | 1 | fairy | pass | 13:43:59 |
| ... | ... | ... | ... | ... |
Note the row I want to remove has the same USER and ENTRY as an earlier row, just with a later TIME SUBMITTED; I only want to keep the first occurrence.
Also, the ENTRY ID cannot be used to tell the duplicates apart (userB, for instance, has two different entries that both have ENTRY ID 1).
Any pointers would be greatly appreciated 🙂
Here's the M code that does what you want:
let
    // sample data embedded via "Enter Data" (compressed JSON blob)
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WKi1OLXJU0lEyBOLSvMzk/KI8ICstMTMHJGhgZWRsZWigFKuDUGkEks8vSi0uATIKEouLIQqNDaxMLFEU4jISVaUTTpXGViZA+40wVALliyqR7AaqM7YyBZoYCwA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [User = _t, EntryID = _t, Entry = _t, Status = _t, Time = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source, {{"User", type text}, {"EntryID", Int64.Type}, {"Entry", type text}, {"Status", type text}, {"Time", type time}}),
    // flag each row: is its Time the earliest Time for this User/Entry combination?
    #"Added Custom" = Table.AddColumn(#"Changed Type", "FirstTimeEntry",
        each List.Min(
            Table.SelectRows(
                #"Changed Type",
                (r) => r[User] = [User] and r[Entry] = [Entry]
            )[Time]
        ) = [Time]),
    // keep only the first occurrence per User/Entry, then drop the helper column
    #"Filtered Rows" = Table.SelectRows(#"Added Custom", each ([FirstTimeEntry] = true)),
    #"Removed Columns" = Table.RemoveColumns(#"Filtered Rows", {"FirstTimeEntry"})
in
    #"Removed Columns"
Best
D
Thanks @Anonymous, had to have a little play around and add in r[entry id] = [entry id] but it's working well now 👍 cheers again
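For anyone applying the same tweak, the adjusted step would look roughly like the sketch below. It assumes the column is named EntryID as in the sample query above, so rename the field to match the real table; duplicates are then matched on USER, ENTRY ID and ENTRY together:

    #"Added Custom" = Table.AddColumn(#"Changed Type", "FirstTimeEntry",
        each List.Min(
            Table.SelectRows(
                #"Changed Type",
                // match duplicates on user, entry id and entry together
                (r) => r[User] = [User] and r[EntryID] = [EntryID] and r[Entry] = [Entry]
            )[Time]
        ) = [Time]),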
Hi @Anonymous,
Check if this existing thread helps:
Thanks,
Pragati
Thanks @Pragati11 , the buffer got me halfway there! Now it's just removing the wrong duplicate, hopefully the other reply will resolve this 🙂
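For reference, the buffer pattern mentioned above usually comes down to sorting by the timestamp first, buffering the sorted table so the row order is preserved, and only then removing duplicates on the grouping columns. A minimal sketch, assuming it is added as extra steps after the #"Changed Type" step of the sample query and that the columns are named User, Entry and Time:

    // sort so the earliest submission comes first within each User/Entry group
    Sorted = Table.Sort(#"Changed Type", {{"Time", Order.Ascending}}),
    // buffer the sorted table so the order is fixed before deduplication
    Buffered = Table.Buffer(Sorted),
    // Table.Distinct keeps the first row it meets per User/Entry, i.e. the earliest one
    Deduped = Table.Distinct(Buffered, {"User", "Entry"})

Without the Table.Buffer step the engine is free to reorder rows before the duplicate removal runs, which is typically why the "wrong" duplicate survives.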