How can I calculate a running difference in Power Query? I actually need to do this in a Fabric Dataflow, so I'm asking about Power Query.
I have columns Asset, AsOfDate, Name, and Value. I added Row ID just so my example is clear. I need to calculate the Depreciate column value in Power Query, and I don't know how to do that.
The story: on 1/1/2024 someone got a new laptop for $2,000, and the laptop's value goes down by $50 every following month. In other words, each month's Depreciate is the previous remaining value minus that month's Value (2000 − 50 = 1950, 1950 − 50 = 1900, and so on), with the Price row showing 0. At some point the value will reach zero, but below is a small example to find a solution.
| Row ID | Asset | AsOfDate | Name | Value | Depreciate |
|---|---|---|---|---|---|
| 1 | Laptop | 1/1/2024 | Price | 2000 | 0 |
| 1 | Laptop | 1/1/2024 | 2/1/2024 | 50 | 1950 |
| 1 | Laptop | 1/1/2024 | 3/1/2024 | 50 | 1900 |
| 1 | Laptop | 1/1/2024 | 4/1/2024 | 50 | 1850 |
| 1 | Laptop | 1/1/2024 | 5/1/2024 | 50 | 1800 |
| 2 | Desk | 1/1/2024 | Price | 500 | 0 |
| 2 | Desk | 1/1/2024 | 2/1/2024 | 10 | 490 |
| 2 | Desk | 1/1/2024 | 3/1/2024 | 10 | 480 |
| 2 | Desk | 1/1/2024 | 4/1/2024 | 10 | 470 |
| 2 | Desk | 1/1/2024 | 5/1/2024 | 10 | 460 |
| 3 | TV | 1/1/2024 | Price | 1500 | 0 |
| 3 | TV | 1/1/2024 | 2/1/2024 | 30 | 1470 |
| 3 | TV | 1/1/2024 | 3/1/2024 | 30 | 1440 |
| 3 | TV | 1/1/2024 | 4/1/2024 | 30 | 1410 |
| 3 | TV | 1/1/2024 | 5/1/2024 | 30 | 1380 |
Thank you,
AK
In Power Query, you can use List.Generate to create a list of the values, then add that as a column. You would do this as an aggregation within the Table.Group function.
You would also need to manually zero the first entry to get your desired result.
You can group by Asset, eliminating the need for the Row ID column.
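For anyone new to List.Generate, here is a minimal sketch of the running-difference idea on a plain list before the full query below. The sample values are made up for illustration; the List.Generate and List.ReplaceRange calls mirror the ones in the query that follows.

let
    // made-up sample: the first entry is the price, the rest are monthly depreciation amounts
    Values = {2000, 50, 50, 50},
    Running = List.Generate(
        () => [d = Values{0}, idx = 0],                          // start from the price
        each [idx] < List.Count(Values),                         // one result per input value
        each [d = [d] - Values{[idx] + 1}, idx = [idx] + 1],     // subtract the next amount
        each [d]),                                               // keep the running value
    // Running is {2000, 1950, 1900, 1850}; zero the first entry as described above
    Result = List.ReplaceRange(Running, 0, 1, {0})
in
    Result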
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45W8kksKMkvUNJRMtQ31DcyMDIBMgOKMpNTgbSRgYGBUqwOdkVGCKYpblXGRKkyIUqVKYYql9TibOxONzXApQLJ3Ya41BgTocaECDWmGGpCwrC71xDmYHQFSM41xq7CmKAKE4IqTFFVxAIA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Asset = _t, AsOfDate = _t, Name = _t, Value = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Asset", type text}, {"AsOfDate", type date}, {"Name", type text}, {"Value", Int64.Type}}),
#"Grouped Rows" = Table.Group(#"Changed Type", {"Asset"}, {
{"Deprec", (t)=> let
#"Deprec List" =
List.Generate(
()=>[d=t[Value]{0}, idx=0], // start from the group's Price value
each [idx] < Table.RowCount(t), // one result per row in the group
each [d=[d]-t[Value]{[idx]+1}, idx=[idx]+1], // subtract the next row's Value
each [d]), // return the running value
#"Zero First" = List.ReplaceRange(#"Deprec List",0,1,{0}),
#"Add as Column" = Table.FromColumns(
Table.ToColumns(t)
& {#"Zero First"},
type table [Asset=nullable text, AsOfDate=nullable date, Name=nullable text, Value=nullable number, Depreciation=nullable number])
in
#"Add as Column",
type table [Asset=nullable text, AsOfDate=nullable date, Name=nullable text, Value=nullable number, Depreciation=nullable number]
}}),
#"Expanded Deprec" = Table.ExpandTableColumn(#"Grouped Rows", "Deprec", {"AsOfDate", "Name", "Value", "Depreciation"})
in
#"Expanded Deprec"
Hi @ojhaashok,
Here is another solution, just with Table.NestedJoin:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45W8kksKMkvUNJRMtQ31DcyMDIBMgOKMpNTgbSRgYGBUqwOdkVGCKYpblXGRKkyIUqVKYYql9TibOxONzXApQLJ3Ya41BgTocaECDWmGGpCwrC71xDmYHQFSM41xq7CmKAKE4IqTFFVxAIA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Asset = _t, AsOfDate = _t, Name = _t, Value = _t]),
ChangedType = Table.TransformColumnTypes(Source,{{"Asset", type text}, {"AsOfDate", type date}, {"Name", type text}, {"Value", Int64.Type}}),
Month = Table.AddColumn(ChangedType, "Month", each try Date.From([Name]) otherwise null),
Join = Table.NestedJoin(Month, {"Asset", "AsOfDate"}, #"Month", {"Asset", "AsOfDate"}, "Join", JoinKind.LeftOuter),
Depreciate = Table.AddColumn(Join, "Depreciate",
(x) => x[Join]{[Name ="Price"]}[Value] - List.Sum(Table.SelectRows(x[Join], each [Month]<=x[Month])[Value]))
in
Depreciate
Stéphane
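For readers comparing approaches: the core idea in this solution is that each month's remaining value is the price minus the running sum of the depreciation amounts up to that month; the self-join just gives every row access to its own asset's rows. A minimal sketch of that arithmetic on a plain list (values made up for illustration):

let
    // made-up sample: a price followed by monthly depreciation amounts
    Price = 2000,
    Amounts = {50, 50, 50, 50},
    // remaining value after month n = price minus the cumulative amounts through month n
    Remaining = List.Transform(
        {1 .. List.Count(Amounts)},
        (n) => Price - List.Sum(List.FirstN(Amounts, n)))
in
    Remaining   // {1950, 1900, 1850, 1800}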
Hi @ojhaashok, another solution (similar to @ronrsnfld's, but with List.Accumulate):
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45W8kksKMkvUNJRMtQ31DcyMDIBMgOKMpNTgbSRgYGBUqwOdkVGCKYpblXGRKkyIUqVKYYql9TibOxONzXApQLJ3Ya41BgTocaECDWmGGpCwrC71xDmYHQFSM41xq7CmKAKE4IqTFFVxAIA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Asset = _t, AsOfDate = _t, Name = _t, Value = _t]),
ChangedType = Table.TransformColumnTypes(Source,{{"Asset", type text}, {"AsOfDate", type date}, {"Name", type text}, {"Value", Int64.Type}}),
fn_Depreciate =
(myTable as table)=>
let
// Tbl = GroupedRows{[Asset="Laptop"]}[All],
Tbl = myTable,
ValueBuffer = List.Buffer(Tbl[Value]),
AccumulatedDepreciate = List.Accumulate(
{ 0..List.Count(ValueBuffer)-1 },
{},
(s,c)=> s & { if c = 0 then 0 else if c = 1 then ValueBuffer{0} - ValueBuffer{c} else s{c-1} - ValueBuffer{c} } ),
ToTable = Table.FromColumns(Table.ToColumns(Tbl) & {AccumulatedDepreciate}, Value.Type(Tbl & #table(type table[Depreciate = number],{})))
in
ToTable,
GroupedRows = Table.Group(ChangedType, {"Asset"}, {{"All", fn_Depreciate, type table}}),
CombinedAll = Table.Combine(GroupedRows[All])
in
CombinedAll
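As a quick illustration of the List.Accumulate pattern used in fn_Depreciate, here is the same running-difference logic on a plain list (values made up for illustration), building the output one element at a time:

let
    // made-up sample: the first entry is the price, the rest are monthly depreciation amounts
    Values = List.Buffer({2000, 50, 50, 50}),
    Running = List.Accumulate(
        {0 .. List.Count(Values) - 1},               // one step per input value
        {},                                          // seed: an empty output list
        (state, i) =>
            state & {
                if i = 0 then 0                              // the Price row gets 0
                else if i = 1 then Values{0} - Values{1}     // first month: price minus first amount
                else state{i - 1} - Values{i}                // later months: previous running value minus this amount
            })
in
    Running   // {0, 1950, 1900, 1850}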
Hi @ojhaashok,
@ronrsnfld, thanks for your help on this case!
Another solution:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("jc+rCoBAEIXhd9ksOJedNzAaDGKRDSIGMSjq++MUUdl1mXT+8JXT9w5d4ephO9dNA0ssCchrNvs8TroEAC4U/5CelLxks/RmKZEkzWo6lvQdgZx6fcGcY6PzRieRY822S3/A+0QKvS7wv2KT8iYlXxUu", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Row ID" = _t, Asset = _t, AsOfDate = _t, Name = _t, Value = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Row ID", Int64.Type}, {"Asset", type text}, {"AsOfDate", type date}, {"Name", type text}, {"Value", Int64.Type}}),
#"Grouped Rows" = Table.Group(#"Changed Type", {"Row ID"}, {{"Count", each Table.AddIndexColumn(_, "Index", 1, 1, Int64.Type)}}),
#"Expanded Count" = Table.ExpandTableColumn(#"Grouped Rows", "Count", {"Asset", "AsOfDate", "Name", "Value", "Index"}, {"Asset", "AsOfDate", "Name", "Value", "Index"}),
#"Added Custom" = Table.AddColumn(#"Expanded Count", "Custom", each
let
currentID = [Row ID],
currentIndex = [Index]
in
Table.SelectRows(#"Expanded Count", each ([Row ID] = currentID and [Name] = "Price")){0}[Value] -
(if [Name] = "Price" then
[Value]
else
List.Sum(
Table.SelectRows(
#"Expanded Count",
each [Row ID] = currentID and [Index] <= currentIndex and [Index] <> 1
)[Value]
)))
in
#"Added Custom"
And the final output matches the expected Depreciate column shown above.
Best Regards,
Dino Tao
If this post helps, then please consider accepting both of the answers as the solution to help other members find it more quickly.
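A side note for readers: the key trick in this last solution is the index that restarts for each group, produced by calling Table.AddIndexColumn inside Table.Group. A minimal sketch of that pattern on a made-up table:

let
    // made-up two-asset table
    Source = #table(type table [Asset = text, Value = number],
        {{"Laptop", 2000}, {"Laptop", 50}, {"Laptop", 50}, {"Desk", 500}, {"Desk", 10}}),
    // add a 1-based index inside each Asset group, then expand back to a flat table
    Grouped = Table.Group(Source, {"Asset"},
        {{"Rows", each Table.AddIndexColumn(_, "Index", 1, 1, Int64.Type), type table}}),
    Expanded = Table.ExpandTableColumn(Grouped, "Rows", {"Value", "Index"})
in
    Expanded   // Index restarts at 1 for each Asset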