Hi!
I have a test table where the first column contains a document id that repeats, because the same document can go through several different operations. I would like to see how many different operations each document goes through while still keeping the time for each operation. How can I achieve this?
Thank you for your help!
@Just2431 If you had posted your sample data as text, I could have given you a working PBIX file. So, instead, I'll try to explain the process. In the Power Query Editor, do a Group By operation and use Doc id as the grouping column. Use the Advanced option. Add a Count aggregation as well as an aggregation that keeps all rows. Then expand the All rows column to bring back your operation_name and time columns. You will end up with a new column that gives the count of operations for each doc id on every row while preserving the rest of your data.
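For reference, here is a minimal sketch of the kind of step code the Group By (Advanced) dialog generates for this approach, assuming columns named Doc id, operation_name and time; the sample rows are made up purely for illustration:

let
    // Hypothetical sample data standing in for the real table
    Source = Table.FromRecords({
        [#"Doc id" = "A1", operation_name = "scan",  time = 5],
        [#"Doc id" = "A1", operation_name = "print", time = 2],
        [#"Doc id" = "B2", operation_name = "scan",  time = 7]
    }, type table [#"Doc id" = text, operation_name = text, time = number]),
    // Group By (Advanced): group on Doc id, add a Count Rows aggregation plus an All Rows aggregation
    GroupedRows = Table.Group(Source, {"Doc id"}, {
        {"Operations", each Table.RowCount(_), Int64.Type},
        {"AllRows", each _, type table [#"Doc id" = text, operation_name = text, time = number]}
    }),
    // Expand the nested tables so operation_name and time sit next to the operation count
    ExpandedRows = Table.ExpandTableColumn(GroupedRows, "AllRows", {"operation_name", "time"}, {"operation_name", "time"})
in
    ExpandedRows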
Hi @Just2431, try this:
Result
let
    // Sample data embedded as compressed JSON (as produced by the Enter Data dialog)
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WCnAPV9JR8snMy87MSweyDJVidWCiSfkVEEFjJMHkotTEklQgwwRZMDEnuRhImyKJpeaVpBZB9BuBhb19ojyw2AUVhluGohhuG4pamHXGyIJI9qEoTiwoKMovS8yBq3d0cgayCxPhCiECaA6ACMJsMtIDei0WAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Doc id" = _t, operation_name = _t, time = _t]),
    // Set the time column to number using the en-US culture
    ChangedType = Table.TransformColumnTypes(Source,{{"time", type number}}, "en-US"),
    // Group by Doc id: keep all original rows and add a count of operations per document
    GroupedRows = Table.Group(ChangedType, {"Doc id"}, {{"All", each _, type table [Doc id=nullable text, operation_name=nullable text, time=nullable number]}, {"Operations", each Table.RowCount(_), Int64.Type}}),
    // Bring operation_name and time back out of the nested tables
    ExpandedAll = Table.ExpandTableColumn(GroupedRows, "All", {"operation_name", "time"}, {"operation_name", "time"})
in
    ExpandedAll