Completely new to Power Query, so forgive my ignorance.
I'm trying to get a table to update every 24 hours without duplicating entries.
The table has a unique field which should only appear once, followed by several columns of associated data.
I've managed to get a power query to populate the table initially, but on refresh it just seems to add the same data again.
I would like it to add new rows where the unique field does not exist and, where the field DOES exist, overwrite that specific row with the newly refreshed data (whether or not the values have changed, just replace whatever is there).
Can anyone help me with this?
Hi @Syndicate_Admin,
The solution may be something like this (please be aware that on large datasets this can be slow):
let
    // Existing table and incoming refresh data (embedded demo rows for illustration)
    ExistingData = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlTSUTIxV4rViVYyAjLNjMBMY4SoCZBpaghmmgKZlkAFsQA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Key = _t, Value = _t]),
    NewData = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlHSAeJYnWglUyDL0hTMNAMyTQ2VYmMB", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Key = _t, Value = _t]),
    // Anti-join: keep only existing rows whose Key does NOT appear in NewData
    LeaveExisting = Table.NestedJoin(ExistingData, {"Key"}, NewData, {"Key"}, "NewData", JoinKind.LeftAnti),
    // Drop the helper join column, then append all of NewData,
    // which inserts new keys and replaces rows for existing keys
    #"Removed Columns" = Table.Combine({Table.RemoveColumns(LeaveExisting, {"NewData"}), NewData})
in
    #"Removed Columns"
But, depending on your requirements, I would investigate incremental refresh; it could be a better/more effective solution for what you are trying to achieve.
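For clarity, here is the same anti-join-and-append pattern without the embedded demo data. This is a sketch: CurrentTable, SourceData, and the "Key" column are placeholders for your own queries and key column, so substitute your actual names.

let
    // Placeholder references - swap in your own queries
    ExistingData = CurrentTable,
    NewData = SourceData,
    // Keep only rows whose Key is absent from the fresh data
    KeptRows = Table.NestedJoin(ExistingData, {"Key"}, NewData, {"Key"}, "Matches", JoinKind.LeftAnti),
    // Drop the join column and append every row of NewData:
    // new keys are inserted, existing keys are effectively overwritten
    Upserted = Table.Combine({Table.RemoveColumns(KeptRows, {"Matches"}), NewData})
in
    Upserted

Note that both tables must share the same column names and types for Table.Combine to line the columns up correctly.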
Kind regards,
John