Hi!
I have a test table whose first column contains a document id that repeats: the same document can go through several different operations. I would like to see how many different operations each document goes through, while still keeping the time for each operation. How can I achieve this?
Thank you for your help!
@Just2431 If you had posted your sample data as text, I could have given you a working PBIX file. Instead, I'll explain the process. In the Power Query Editor, do a Group By operation with Doc id as the grouping column. Use Advanced. Add a Count aggregation as well as an aggregation that keeps All Rows. Then expand the All Rows column to bring back operation_name and time. You will now have a new column with the count of operations for each Doc id on every row, while the rest of your data is preserved.
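The Group By steps described above can be sketched in Power Query M. This assumes a prior step named Source with the columns Doc id, operation_name, and time from the question; it is the kind of code the Group By (Advanced) dialog generates, not a tested solution for your exact data:

```m
let
    // Group by Doc id with two aggregations: a row count and an "All Rows" table
    Grouped = Table.Group(
        Source,
        {"Doc id"},
        {
            {"Operations", each Table.RowCount(_), Int64.Type},  // Count aggregation
            {"All", each _, type table}                          // All Rows aggregation
        }
    ),
    // Expand the nested tables to restore operation_name and time on every row
    Expanded = Table.ExpandTableColumn(Grouped, "All", {"operation_name", "time"})
in
    Expanded
```

After expanding, each row keeps its original operation_name and time, plus an Operations column repeating the per-document count.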
Hi @Just2431, try this:
Result
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WCnAPV9JR8snMy87MSweyDJVidWCiSfkVEEFjJMHkotTEklQgwwRZMDEnuRhImyKJpeaVpBZB9BuBhb19ojyw2AUVhluGohhuG4pamHXGyIJI9qEoTiwoKMovS8yBq3d0cgayCxPhCiECaA6ACMJsMtIDei0WAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Doc id" = _t, operation_name = _t, time = _t]),
ChangedType = Table.TransformColumnTypes(Source,{{"time", type number}}, "en-US"),
GroupedRows = Table.Group(ChangedType, {"Doc id"}, {{"All", each _, type table [Doc id=nullable text, operation_name=nullable text, time=nullable number]}, {"Operations", each Table.RowCount(_), Int64.Type}}),
ExpandedAll = Table.ExpandTableColumn(GroupedRows, "All", {"operation_name", "time"}, {"operation_name", "time"})
in
ExpandedAll