Hi Community,
I want to build a table like this one in Excel:
For example:
Index 0: Ending Value = 12
Index 1: Begin Value = Previous Ending Value = 12, Import Value = 13, Export Value = (Begin + Import)/2 = 12.5, Ending Value = Begin + Import - Export = 12.5
Index 2: Begin Value = Previous Ending Value = 12.5, Import Value = 11, Export Value = (Begin + Import)/2 = 11.75, Ending Value = Begin + Import - Export = 11.75
....
I tried to do it in Power BI, but I get a circular dependency error on the Ending column.
Please help me build that table!
Thanks
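Note: from the formulas above, Export = (Begin + Import) / 2 and Ending = Begin + Import - Export, so each row's Ending works out to (previous Ending + Import) / 2. Here is a minimal Power Query M sketch of that recurrence using List.Generate, with the Import values 13 and 11 and the opening Ending of 12 from the example above as placeholders for the real column:

let
    // Placeholder inputs taken from the example: opening Ending 12, then Imports 13 and 11
    Imports = {13, 11},
    InitialEnding = 12,

    // Walk the recurrence Ending(n) = (Ending(n-1) + Import(n)) / 2.
    // The condition only reads [i], so the Ending of the final, never-emitted
    // state (which would index past the end of Imports) is never evaluated.
    Endings = List.Generate(
        () => [i = 0, Ending = InitialEnding],
        each [i] <= List.Count(Imports),
        each [i = [i] + 1, Ending = ([Ending] + Imports{[i]}) / 2],
        each [Ending]
    )
in
    Endings    // {12, 12.5, 11.75}, matching the Index 0-2 Ending values above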
Hi @SonLeThai ,
Here is my sample data (it is embedded in the Source step of the query below).
Then you can try this:
let
    // Sample data with Index, Import and Value columns, serialized into the query
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlDSUQIiQyOlWJ1oJUMQ0xgkBOIZgXiGMB5I2NAExgMxDE3BvFgA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Index = _t, Import = _t, Value = _t]),
    ChangedType = Table.TransformColumnTypes(Source,{{"Index", Int64.Type}, {"Import", Int64.Type}, {"Value", Int64.Type}}),
    // Ending value for Index 0
    InitialEnding = 12,
    // Build a list of records, one per row, carrying the running Ending forward
    AccumulateColumns = List.Accumulate(
        List.Zip({ChangedType[Index], ChangedType[Import]}),
        {[Index = 0, Begin = null, Export = null, Ending = InitialEnding]},
        (state, current) =>
            let
                CurrentIndex = current{0},
                CurrentImport = current{1},
                // Because of the seed record, the record for the previous row sits at position CurrentIndex
                PreviousEnding = if CurrentIndex = 0 then null else state{CurrentIndex}[Ending],
                CurrentExport = if CurrentIndex = 0 then null else (PreviousEnding + CurrentImport) / 2,
                CurrentEnding = if CurrentIndex = 0 then InitialEnding else PreviousEnding + CurrentImport - CurrentExport,
                CurrentRecord = [Index = CurrentIndex, Begin = PreviousEnding, Export = CurrentExport, Ending = CurrentEnding]
            in
                state & {CurrentRecord}
    ),
    AccumulatedTable = Table.FromRecords(AccumulateColumns),
    // Join the calculated columns back onto the original table and expand them
    JoinedTable = Table.NestedJoin(ChangedType, {"Index"}, AccumulatedTable, {"Index"}, "NewColumns", JoinKind.LeftOuter),
    ExpandedTable = Table.ExpandTableColumn(JoinedTable, "NewColumns", {"Begin", "Export", "Ending"}),
    FinalTable = ExpandedTable,
    // Remove the extra top row introduced by the seed record
    #"Removed Top Rows" = Table.Skip(FinalTable, 1)
in
    #"Removed Top Rows"
Put all of the M code into the Advanced Editor; the query returns the Begin, Export and Ending columns described above.
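To see just the accumulation step of the query above in isolation, here is a stripped-down sketch that runs the same Begin/Export/Ending logic over a hard-coded Import list (values taken from the question); the type changes, the join back to the source table and the row removal are left out:

let
    // Placeholder inputs from the question: Index 0 ends at 12, then Imports 13 and 11
    InitialEnding = 12,
    Imports = {13, 11},

    // Fold over the Import values; 'state' is the list of result records built so far
    Rows = List.Accumulate(
        Imports,
        {[Index = 0, Begin = null, Import = null, Export = null, Ending = InitialEnding]},
        (state, currentImport) =>
            let
                Previous  = List.Last(state),
                NewBegin  = Previous[Ending],
                NewExport = (NewBegin + currentImport) / 2,
                NewEnding = NewBegin + currentImport - NewExport
            in
                state & {[
                    Index  = Previous[Index] + 1,
                    Begin  = NewBegin,
                    Import = currentImport,
                    Export = NewExport,
                    Ending = NewEnding
                ]}
    )
in
    Table.FromRecords(Rows)

Each new record reads the previous Ending with List.Last, so there is no circular dependency: the rows are built one after another inside a single step instead of as a calculated column.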
Best Regards,
Dino Tao
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
The only way you can do this is in Power Query using List.Accumulate.
I tried List.Accumulate, but it is not working right.
I tried to do it with a while loop in the Power Query Editor, but I think it will be slow with large data.
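If the concern is having to write an explicit while loop, the same single-pass fold can be wrapped in a small query function instead. Here is a sketch of a hypothetical helper, fnRunningEnding (the name and signature are illustrative, not from the accepted answer), that takes the Import column as a list plus the opening value and returns the Ending column as a list:

let
    // Hypothetical helper: running Ending values for a list of Imports,
    // using the simplification Ending(n) = (Ending(n-1) + Import(n)) / 2
    fnRunningEnding = (imports as list, initialEnding as number) as list =>
        List.Skip(
            List.Accumulate(
                imports,
                {initialEnding},                      // seed with the Index 0 ending
                (endings, currentImport) =>
                    endings & {(List.Last(endings) + currentImport) / 2}
            ),
            1                                         // drop the seed so the result lines up with the Import rows
        )
in
    // Example call with the values from the question: returns {12.5, 11.75}
    fnRunningEnding({13, 11}, 12)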
Please provide sample data that covers your issue or question completely, in a usable format (not as a screenshot).
Do not include sensitive information or anything not related to the issue or question.
If you are unsure how to upload data, please refer to https://community.fabric.microsoft.com/t5/Community-Blog/How-to-provide-sample-data-in-the-Power-BI-...
Please show the expected outcome based on the sample data you provided.
Want faster answers? https://community.fabric.microsoft.com/t5/Desktop/How-to-Get-Your-Question-Answered-Quickly/m-p/1447...