Hi,
I am trying to create a dashboard showing production losses. The data source consists of rows containing the date, asset, loss type and comments. Comments are typically identical across multiple days of an outage. The data table looks something like this:
| Date | Asset | Loss Type | ReasonComments |
|---|---|---|---|
| 27/07/2024 00:00 | Asset A | Downtime | xyz |
| 27/07/2024 00:00 | Asset B | Downtime | xyz |
| 26/07/2024 00:00 | Asset A | Downtime | xyz |
| 26/07/2024 00:00 | Asset B | Downtime | xyz |
| 25/07/2024 00:00 | Asset A | Downtime | abc |
| 25/07/2024 00:00 | Asset B | Downtime | abc |
| 24/07/2024 00:00 | Asset A | Downtime | abc |
| 24/07/2024 00:00 | Asset B | Downtime | abc |
| 23/07/2024 00:00 | Asset A | Downtime | xyz |
| 23/07/2024 00:00 | Asset B | Downtime | xyz |
| 22/07/2024 00:00 | Asset A | Downtime | xyz |
| 22/07/2024 00:00 | Asset B | Downtime | xyz |
I would like to transform the table to show only unique comments and the date range over which they occurred, while keeping the date slicer I have on the dashboard working. I'm hoping to end up with something like this:
| Start Date | End Date | Asset | Loss Type | ReasonComments |
|---|---|---|---|---|
| 26/07/2024 00:00 | 27/07/2024 00:00 | Asset A | Downtime | xyz |
| 26/07/2024 00:00 | 27/07/2024 00:00 | Asset B | Downtime | xyz |
| 14/07/2024 00:00 | 25/07/2024 00:00 | Asset A | Downtime | abc |
| 14/07/2024 00:00 | 25/07/2024 00:00 | Asset B | Downtime | abc |
| 23/07/2024 00:00 | 24/07/2024 00:00 | Asset A | Downtime | def |
| 23/07/2024 00:00 | 24/07/2024 00:00 | Asset B | Downtime | def |
How can I do this using DAX?
Any help would be greatly appreciated!
Thanks
You could do this in the Query Editor if you wanted:
1. Create a copy of the full table.
2. Create another copy with the Date column removed and duplicates removed.
3. Merge the two queries together.
4. Expand the merged data, choosing to aggregate the Date column with Min and Max.
Here is the M code for the full table:
let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WckksSVXSUXIsLk4tAdI++cXFCiGVBSCxoNTE4vw85/zc3NS8kmKlWJ1oJSNzfQNzfSMDIxMFAwMrAwOYTgVHIMslvzyvJDMXpLWisgq/eicc6s1INB+XelzmmxJnfmJSMn71TjjUm5BoPi71uMw3JjF8cKnHFT5GJJqPSz0282MB", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Column1 = _t, Column2 = _t, Column3 = _t, Column4 = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Column1", type text}, {"Column2", type text}, {"Column3", type text}, {"Column4", type text}}),
    #"Promoted Headers" = Table.PromoteHeaders(#"Changed Type", [PromoteAllScalars=true]),
    #"Changed Type1" = Table.TransformColumnTypes(#"Promoted Headers",{{"Date", type datetime}, {"Asset", type text}, {"Loss Type", type text}, {"ReasonComments", type text}})
in
    #"Changed Type1"
And here is the code for the aggregate table:
let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WckksSVXSUXIsLk4tAdI++cXFCiGVBSCxoNTE4vw85/zc3NS8kmKlWJ1oJSNzfQNzfSMDIxMFAwMrAwOYTgVHIMslvzyvJDMXpLWisgq/eicc6s1INB+XelzmmxJnfmJSMn71TjjUm5BoPi71uMw3JjF8cKnHFT5GJJqPSz0282MB", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Column1 = _t, Column2 = _t, Column3 = _t, Column4 = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Column1", type text}, {"Column2", type text}, {"Column3", type text}, {"Column4", type text}}),
    #"Promoted Headers" = Table.PromoteHeaders(#"Changed Type", [PromoteAllScalars=true]),
    #"Changed Type1" = Table.TransformColumnTypes(#"Promoted Headers",{{"Date", type datetime}, {"Asset", type text}, {"Loss Type", type text}, {"ReasonComments", type text}}),
    #"Removed Columns" = Table.RemoveColumns(#"Changed Type1",{"Date"}),
    #"Removed Duplicates" = Table.Distinct(#"Removed Columns"),
    #"Reordered Columns" = Table.ReorderColumns(#"Removed Duplicates",{"Asset", "ReasonComments", "Loss Type"}),
    #"Merged Queries" = Table.NestedJoin(#"Reordered Columns", {"Asset", "ReasonComments"}, #"Full Table", {"Asset", "ReasonComments"}, "Full Table", JoinKind.LeftOuter),
    #"Aggregated Full Table" = Table.AggregateTableColumn(#"Merged Queries", "Full Table", {{"Date", List.Min, "Min of Date"}, {"Date", List.Max, "Max of Date"}})
in
    #"Aggregated Full Table"
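Outside Power Query, the merge-and-aggregate step above is essentially a group-by with Min and Max over the dates. Here is a minimal Python sketch of that logic, using a subset of the sample data from the question (the column names and values are taken from the table above). One caveat worth noting: a plain group-by would merge repeats of the same comment separated by a gap, such as xyz on 22-23/07 and again on 26-27/07, into a single range.

```python
# Group rows by (Asset, Loss Type, ReasonComments), then take the
# min and max Date per group - the same aggregation the M code does.
from collections import defaultdict
from datetime import date

# Subset of the sample data (24/07 - 27/07 only, to avoid the
# gap-between-repeats caveat mentioned above).
rows = [
    (date(2024, 7, 27), "Asset A", "Downtime", "xyz"),
    (date(2024, 7, 27), "Asset B", "Downtime", "xyz"),
    (date(2024, 7, 26), "Asset A", "Downtime", "xyz"),
    (date(2024, 7, 26), "Asset B", "Downtime", "xyz"),
    (date(2024, 7, 25), "Asset A", "Downtime", "abc"),
    (date(2024, 7, 25), "Asset B", "Downtime", "abc"),
    (date(2024, 7, 24), "Asset A", "Downtime", "abc"),
    (date(2024, 7, 24), "Asset B", "Downtime", "abc"),
]

# Collect all dates per (Asset, Loss Type, ReasonComments) key.
groups = defaultdict(list)
for d, asset, loss_type, comment in rows:
    groups[(asset, loss_type, comment)].append(d)

# One summary row per group: start date, end date, then the keys.
summary = [
    (min(dates), max(dates), asset, loss_type, comment)
    for (asset, loss_type, comment), dates in groups.items()
]

for row in sorted(summary):
    print(row)
```

This mirrors the M approach: the distinct-rows copy corresponds to the group keys, and the merged-then-aggregated Date column corresponds to `min(dates)` / `max(dates)`.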
That's if you want to load the whole aggregated table in. Otherwise, I don't think you need DAX at all: just drag the three category fields into a table visual and use Min and Max on the Date field.
If you are happy with this answer, please mark it as a solution so others can find it!
Kudos are always appreciated! Check out our free Power BI video courses.
Thanks! After a bit of fiddling, it now works perfectly.