Hi guys,
I have a table with User and Date columns, and the dates can be null because I'm doing a left join against the users table.
When a user doesn't have a date, I'd like to replace the null with all the existing dates from the other users.
How do I do this? Is it in Power Query or DAX?
Original table
| User | Date |
|---|---|
| A | 01/01/2021 |
| A | 02/01/2021 |
| C | 03/01/2021 |
| B | null |
Expected table
| User | Date |
|---|---|
| A | 01/01/2021 |
| A | 02/01/2021 |
| C | 03/01/2021 |
| B | 01/01/2021 |
| B | 02/01/2021 |
| B | 03/01/2021 |
Try this in a new table:
New Table =
VAR _Dates =
    DISTINCT ( OriginalTable[Date] )
VAR _BlankVal =
    CALCULATETABLE (
        VALUES ( OriginalTable[User] ),
        FILTER ( OriginalTable, ISBLANK ( OriginalTable[Date] ) )
    )
VAR _AssignDates =
    FILTER ( CROSSJOIN ( _BlankVal, _Dates ), NOT ( ISBLANK ( [Date] ) ) )
VAR _OrigOK =
    FILTER ( OriginalTable, NOT ( ISBLANK ( OriginalTable[Date] ) ) )
RETURN
    UNION ( _OrigOK, _AssignDates )
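For anyone wanting to check the logic outside of Power BI, the same idea (cross-join the users that have no date with every distinct existing date, then append the good rows) can be sketched in Python with pandas. This is just an illustration of the technique, using the sample User/Date data from the question:

```python
import pandas as pd

# Sample data matching the original table above
df = pd.DataFrame({
    "User": ["A", "A", "C", "B"],
    "Date": ["01/01/2021", "02/01/2021", "03/01/2021", None],
})

# Rows that already have a date (the _OrigOK part)
good = df[df["Date"].notna()]

# Users with no date, cross-joined with every distinct existing date
# (the _BlankVal / _Dates / CROSSJOIN part)
blank_users = df.loc[df["Date"].isna(), ["User"]].drop_duplicates()
dates = good[["Date"]].drop_duplicates()
assigned = blank_users.merge(dates, how="cross")

# Append, like UNION in the DAX version
result = pd.concat([good, assigned], ignore_index=True)
print(result)
```

This produces the six rows of the expected table: A and C keep their original dates, and B gets one row per existing date.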
Or use this in Power Query (replace the Source and Source1 steps with your own source table):
let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WclTSUTIw1AciIwMjQ6VYHaiQEYqQM0jIGEXICSikFBsLAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [User = _t, Date = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"User", type text}, {"Date", type date}}),
    #"GoodRows" = Table.SelectRows(#"Changed Type", each ([Date] <> null)),
    Source1 = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WclTSUTIw1AciIwMjQ6VYHaiQEYqQM0jIGEXICSikFBsLAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [User = _t, Date = _t]),
    #"Changed Type1" = Table.TransformColumnTypes(Source1,{{"User", type text}, {"Date", type date}}),
    #"Filtered Rows" = Table.SelectRows(#"Changed Type1", each ([Date] = null)),
    #"Removed Columns" = Table.RemoveColumns(#"Filtered Rows",{"Date"}),
    #"Added Custom" = Table.AddColumn(#"Removed Columns", "Custom", each #"GoodRows"),
    #"Expanded Custom" = Table.ExpandTableColumn(#"Added Custom", "Custom", {"Date"}, {"Custom.Date"}),
    #"Renamed Columns" = Table.RenameColumns(#"Expanded Custom",{{"Custom.Date", "Date"}}),
    #"Appended Query" = Table.Combine({#"GoodRows", #"Renamed Columns"}),
    #"Changed Type2" = Table.TransformColumnTypes(#"Appended Query",{{"Date", type date}})
in
    #"Changed Type2"
Proud to be a Super User!
Paul on Linkedin.
First of all, thank you for your answer.
I used the Power Query version (a little slow to refresh because I have data for every day, about 365 rows per user). I just removed the duplicated values and got a perfect result.
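On that last point: if some users already have a few of the dates being assigned, the combined table can contain duplicate rows. In Power Query the cleanup is the Remove Duplicates step (Table.Distinct); the pandas analogue, continuing the illustrative sketch from earlier, is a one-liner:

```python
import pandas as pd

# Hypothetical combined output where B ended up with a repeated row
df = pd.DataFrame({
    "User": ["B", "B", "B"],
    "Date": ["01/01/2021", "01/01/2021", "02/01/2021"],
})

# Drop fully duplicated rows, like Table.Distinct in Power Query
deduped = df.drop_duplicates(ignore_index=True)
print(deduped)
```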