frozenovergen
Frequent Visitor

Help With Transforming Table to Show Unique Records

Hi,

 

I am trying to create a dashboard showing production losses. The data source is composed of rows containing the date, asset, loss type and comments. Comments are typically identical across multiple days of an outage. The data table looks something like this:

 

Date             | Asset   | Loss Type | ReasonComments
27/07/2024 00:00 | Asset A | Downtime  | xyz
27/07/2024 00:00 | Asset B | Downtime  | xyz
26/07/2024 00:00 | Asset A | Downtime  | xyz
26/07/2024 00:00 | Asset B | Downtime  | xyz
25/07/2024 00:00 | Asset A | Downtime  | abc
25/07/2024 00:00 | Asset B | Downtime  | abc
24/07/2024 00:00 | Asset A | Downtime  | abc
24/07/2024 00:00 | Asset B | Downtime  | abc
23/07/2024 00:00 | Asset A | Downtime  | xyz
23/07/2024 00:00 | Asset B | Downtime  | xyz
22/07/2024 00:00 | Asset A | Downtime  | xyz
22/07/2024 00:00 | Asset B | Downtime  | xyz

 

I would like to transform the table to show only unique comments and the date range over which they occurred, whilst maintaining the date slicer I have on the dashboard. I'm hoping to end up with something like this:

 

Start Date       | End Date         | Asset   | Loss Type | ReasonComments
26/07/2024 00:00 | 27/07/2024 00:00 | Asset A | Downtime  | xyz
26/07/2024 00:00 | 27/07/2024 00:00 | Asset B | Downtime  | xyz
14/07/2024 00:00 | 25/07/2024 00:00 | Asset A | Downtime  | abc
14/07/2024 00:00 | 25/07/2024 00:00 | Asset B | Downtime  | abc
23/07/2024 00:00 | 24/07/2024 00:00 | Asset A | Downtime  | def
23/07/2024 00:00 | 24/07/2024 00:00 | Asset B | Downtime  | def

 

How can I do this using DAX?

 

Any help would be greatly appreciated!

 

Thanks

1 ACCEPTED SOLUTION
SamWiseOwl
Super User

Hi @frozenovergen 

 

You could do this in the Query Editor if you wanted.
Create a copy of the full table.

Then create another copy with the Date column removed and duplicates removed.

Join the two together using Merge.

[Screenshot: SamWiseOwl_0-1736349723690.png]

 

Then expand the merged column and choose Aggregate, taking the Min and Max of Date.

[Screenshot: SamWiseOwl_1-1736349752356.png]

 

Here is the M code for the full table

let
    // Sample data entered via "Enter data" (compressed JSON)
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WckksSVXSUXIsLk4tAdI++cXFCiGVBSCxoNTE4vw85/zc3NS8kmKlWJ1oJSNzfQNzfSMDIxMFAwMrAwOYTgVHIMslvzyvJDMXpLWisgq/eicc6s1INB+XelzmmxJnfmJSMn71TjjUm5BoPi71uMw3JjF8cKnHFT5GJJqPSz0282MB", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Column1 = _t, Column2 = _t, Column3 = _t, Column4 = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Column1", type text}, {"Column2", type text}, {"Column3", type text}, {"Column4", type text}}),
    // Promote the first row to headers and set the proper column types
    #"Promoted Headers" = Table.PromoteHeaders(#"Changed Type", [PromoteAllScalars=true]),
    #"Changed Type1" = Table.TransformColumnTypes(#"Promoted Headers",{{"Date", type datetime}, {"Asset", type text}, {"Loss Type", type text}, {"ReasonComments", type text}})
in
    #"Changed Type1"

 

And here is the code for the aggregate table

let
    // Same sample data as the full table
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WckksSVXSUXIsLk4tAdI++cXFCiGVBSCxoNTE4vw85/zc3NS8kmKlWJ1oJSNzfQNzfSMDIxMFAwMrAwOYTgVHIMslvzyvJDMXpLWisgq/eicc6s1INB+XelzmmxJnfmJSMn71TjjUm5BoPi71uMw3JjF8cKnHFT5GJJqPSz0282MB", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Column1 = _t, Column2 = _t, Column3 = _t, Column4 = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Column1", type text}, {"Column2", type text}, {"Column3", type text}, {"Column4", type text}}),
    #"Promoted Headers" = Table.PromoteHeaders(#"Changed Type", [PromoteAllScalars=true]),
    #"Changed Type1" = Table.TransformColumnTypes(#"Promoted Headers",{{"Date", type datetime}, {"Asset", type text}, {"Loss Type", type text}, {"ReasonComments", type text}}),
    // Drop the Date column and remove duplicates so there is one row per Asset / Loss Type / ReasonComments
    #"Removed Columns" = Table.RemoveColumns(#"Changed Type1",{"Date"}),
    #"Removed Duplicates" = Table.Distinct(#"Removed Columns"),
    #"Reordered Columns" = Table.ReorderColumns(#"Removed Duplicates",{"Asset", "ReasonComments", "Loss Type"}),
    // Merge back to the full table on Asset and ReasonComments
    #"Merged Queries" = Table.NestedJoin(#"Reordered Columns", {"Asset", "ReasonComments"}, #"Full Table", {"Asset", "ReasonComments"}, "Full Table", JoinKind.LeftOuter),
    // Aggregate the matched rows to the earliest and latest Date
    #"Aggregated Full Table" = Table.AggregateTableColumn(#"Merged Queries", "Full Table", {{"Date", List.Min, "Min of Date"}, {"Date", List.Max, "Max of Date"}})
in
    #"Aggregated Full Table"

 

 

If you want to load the whole table in instead, I don't think you need DAX: just drag the three category fields (Asset, Loss Type, ReasonComments) into a table visual and use the Min and Max aggregations on the Date column:

[Screenshot: SamWiseOwl_2-1736349914881.png]
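
If you prefer explicit measures over the visual's implicit aggregation, a minimal DAX sketch could look like the following. It assumes the loaded table is named 'Full Table' (as in the merge step above); the measure names Start Date and End Date are just illustrative:

Start Date =
    MIN ( 'Full Table'[Date] )    -- earliest date visible in the current filter context

End Date =
    MAX ( 'Full Table'[Date] )    -- latest date visible in the current filter context

Placing Asset, Loss Type and ReasonComments in a table visual alongside these two measures gives one row per unique comment, and the date slicer still works because the measures respect the current filter context. Note that, as with the merge approach, a comment that recurs in separate, non-contiguous outages will be collapsed into a single row spanning both periods.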

 


If you are happy with this answer, please mark it as a solution for others to find!

Kudos are always appreciated! Check out our free Power BI video courses.


Thanks! After a bit of fiddling it now works perfectly.
