Hi, I am new to this community and trying to get to grips with Power BI, especially Power Query and its M syntax.
I have one fixed table containing the base fare for transaction costs, which depends on the transaction count spread across buckets:
It seems simple, but I am struggling to generate rows with the calculated cost per bucket, which depends on an increasing percentage rate. So far I have managed to create a query like this (with a fixed increase rate),
but it only generates one additional row for the next year. I need some way to generate multiple rows for the next X years, with the possibility of increasing each cost by X percent (in case the rate changes in the future).
I have thought about generating a list of years, cross-joining it with this table, and then calculating the cost using the years from the list as parameters, but I am not sure how to proceed.
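Roughly, I imagined something like this (FareTable is just a placeholder name for my fixed fare table, and the year range is made up):

    // Illustrative sketch only: a list of the next 5 fiscal years starting at FY24
    Years = List.Numbers(24, 5),
    YearTable = Table.FromList(Years, Splitter.SplitByNothing(), {"FY"}),
    // "Cross join": nest the fare table into every year row, then expand it back out
    CrossJoined = Table.ExpandTableColumn(
        Table.AddColumn(YearTable, "Fares", each FareTable),
        "Fares",
        Table.ColumnNames(FareTable)
    )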
I would be glad for any feedback. Thanks in advance!
Hi @Tamarah_, you can use this as an example, which you can adapt to your scenario:
let
    PricingTable =
        let
            Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlTSUTICYmMgNgFiUxDfwMhYKTYWAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Bucket 1" = _t, #"Bucket 2" = _t, #"Bucket 3" = _t, #"Bucket 4" = _t, #"Bucket 5" = _t, FY = _t]),
            #"Changed Type" = Table.TransformColumnTypes(Source,{{"Bucket 1", Int64.Type}, {"Bucket 2", Int64.Type}, {"Bucket 3", Int64.Type}, {"Bucket 4", Int64.Type}, {"Bucket 5", Int64.Type}, {"FY", Int64.Type}})
        in
            #"Changed Type",
    IndexTable =
        let
            Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMjIwMlHSUTLUM7BQitUB803BfEMDGN8MIm+qFBsLAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [FY = _t, Index = _t]),
            #"Changed Type" = Table.TransformColumnTypes(Source,{{"FY", Int64.Type}, {"Index", type number}})
        in
            #"Changed Type",
    Process = List.Accumulate(
        Table.ToRecords(IndexTable),
        Table.ToRecords(PricingTable),
        (a, n) => a &
            {
                Record.TransformFields(
                    List.Last(a),
                    List.Transform(
                        List.Select(
                            Table.ColumnNames(PricingTable),
                            each Text.Contains(_, "Bucket")
                        ),
                        each {_, (x) => x * n[Index]}
                    ) &
                    {{"FY", each n[FY]}}
                )
            }
    ),
    Convert = Table.FromRecords(Process, Value.Type(PricingTable))
in
    Convert
Just copy and paste it into a Blank Query to see how it works.
The main work happens in the Process step, which:
1. Filters the column names of the main table, keeping those containing "Bucket"; we need this list to apply the indexing later (List.Select).
2. Generates transformation rules for all bucket columns (using the list created above) and for the FY column (List.Transform).
3. Applies the rules to the last row of the main table and appends the result to the bottom of the table (List.Accumulate).
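To see the pattern in isolation, here is the same accumulate logic on a minimal one-bucket table (the 5% index and the starting values are made up purely for illustration):

    let
        // One starting row and two future-year COLA indexes
        Pricing = #table(type table [#"Bucket 1" = number, FY = Int64.Type], {{100, 24}}),
        Indexes = #table(type table [FY = Int64.Type, Index = number], {{25, 1.05}, {26, 1.05}}),
        // Each pass scales the bucket in the last row and stamps the new FY
        Process = List.Accumulate(
            Table.ToRecords(Indexes),
            Table.ToRecords(Pricing),
            (a, n) => a &
                {
                    Record.TransformFields(
                        List.Last(a),
                        {{"Bucket 1", (x) => x * n[Index]}, {"FY", each n[FY]}}
                    )
                }
        )
    in
        Table.FromRecords(Process)

This should return three rows: 100 for FY24, then 105 for FY25 and 110.25 for FY26, each compounded from the previous row.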
Kind regards,
John
Hey @jbwtp,
I was able to implement this into my existing table, and it now looks like this (I use FY24 from my initial table as the starting point for the calculations).
I had a few more questions about this:
I hope my concerns make sense,
Thanks!
Hi @Tamarah_,
1. This is provided in the IndexTable. Every row in this table represents an FY and a COLA index. If you add more rows there, there will be more lines in the output table. Please note that FY is also taken from this table rather than calculated, so it is possible to create "gaps".
2. The answer is in the IndexTable too. If you move it to an Excel file and link that to your dataset, the end user will be able to edit the file, updating indexes and adding FYs, which will then be imported into Power BI and added to the final table. Hope this makes sense.
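If you prefer to keep it inline while testing, the IndexTable can also be typed in by hand with #table (the 3% and 5% rates below are only placeholders for whatever your real COLA values are):

    // Each row = one future FY and its COLA multiplier; add rows to extend the forecast
    IndexTable = #table(
        type table [FY = Int64.Type, Index = number],
        {
            {25, 1.03},
            {26, 1.03},
            {28, 1.05}   // skipping FY27 leaves a "gap" in the output, as noted above
        }
    )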
Cheers,
John