Hello!
I am very new to Power Query and Power Pivot, although I am familiar with Power Automate and Power Apps, and I am having a bit of trouble with the formula language and the logic needed to get some fairly simple things working.
I am trying to get the % of variation from one semester to the next, and then filter down to the 10 biggest variations.
Here is a very small example of my table (my full table has over 60,000 records: about 2,000 products repeated across 24 warehouses over the past 8 semesters; the SKU is composed of the product ID (10 digits) plus the warehouse number (3 digits)). Every semester new data is uploaded with new values for all warehouses.
SKU             Amount      Date
10000000001104  1536956     30-06-2021
10000000001301  968452      31-12-2021
10000000001104  25492141    30-06-2022
10000000001301  326541      31-12-2022
10000000001104  174561      30-06-2023
10000000001301  14516541    31-12-2023
10000007451104  1541641     30-06-2021
10000007451910  15496411    31-12-2021
10000007451104  158741641   30-06-2022
10000007451910  158494      31-12-2022
10000007451104  68764       30-06-2023
10000007451910  1458496     31-12-2023
I tried writing a measure in Power Pivot, but none of the formula ideas I have found online are working... I also tried creating a new table in Power Query with duplicates eliminated, but I can't figure it out.
Can you help me find the right way to get this working?
Thank you!
Hi @fernandilus ,
What does “semester” mean when you say “from one semester to the next”? From what I understand, you want to calculate, for each distinct SKU, the percentage change between the current Amount and the Amount from one year earlier (i.e. the same semester of the previous year).
Here is the whole M query for the Advanced Editor:
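In other words, the change for each row is (current Amount − prior-year Amount) / prior-year Amount. As a quick sanity check against the sample data above (the figures are taken from the post; this is an illustration only):

```
// SKU 10000000001104, 30-06-2022 vs. 30-06-2021:
(25492141 - 1536956) / 1536956  // = 23955185 / 1536956 ≈ 15.59, i.e. about +1559 %
```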
let
    // Sample data pasted from the original post
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("fdBRDoMgEIThu/Asyc4yu7JnMd7/GsVaU7RQn0j884Vh2xLk+gBhWhKseJi3U5EsnlUUaV9uZRG0/+GVpkeIDB2GJ6nGUBC9qWOzqNsZXuZP+LnmSvObWMYiaHiat7RBl9mlw+1HGpD3KzGcwHR9p1pdCf8zv2crg9P5X9Tr6pyu70Aeoj/G7y8=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [SKU = _t, Amount = _t, Date = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"SKU", Int64.Type}, {"Amount", Int64.Type}, {"Date", type text}}),
    // Parse the dd-mm-yyyy dates with a day-first locale
    #"Changed Type with Locale" = Table.TransformColumnTypes(#"Changed Type", {{"Date", type date}}, "fr-BE"),
    #"Changed Type1" = Table.TransformColumnTypes(#"Changed Type with Locale",{{"SKU", type text}}),
    // For each row, look up the row with the same SKU exactly one year earlier
    #"Added Custom" = Table.AddColumn(#"Changed Type1", "Custom", each
        let
            currentSKU = [SKU],
            currentDate = [Date],
            previousYearDate = Date.AddYears(currentDate, -1),
            matchedRow = Table.SelectRows(#"Changed Type1", (row) => row[SKU] = currentSKU and row[Date] = previousYearDate)
        in
            matchedRow),
    #"Expanded Custom" = Table.ExpandTableColumn(#"Added Custom", "Custom", {"Amount"}, {"Custom.Amount"}),
    // Percentage change vs. the prior-year Amount (null when there is no prior-year row)
    #"Added Custom1" = Table.AddColumn(#"Expanded Custom", "Custom", each if [Custom.Amount] = null then null else ([Amount] - [Custom.Amount]) / [Custom.Amount]),
    #"Changed Type2" = Table.TransformColumnTypes(#"Added Custom1",{{"Custom", Percentage.Type}})
in
    #"Changed Type2"
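A side note on performance: the Table.SelectRows call above rescans the whole table once for every row, which can get slow on 60,000+ records. A left self-merge on SKU plus a date shifted one year forward does the same prior-year lookup in a single pass. A minimal sketch, assuming the typed table from the #"Changed Type1" step; the step and column names (Shifted, Prev, Change) are just illustrative:

```
let
    Typed = #"Changed Type1",
    // Key each row by SKU and the date one year ahead,
    // so a plain left join finds the prior-year row
    Shifted = Table.AddColumn(Typed, "NextYearDate", each Date.AddYears([Date], 1), type date),
    Merged = Table.NestedJoin(Typed, {"SKU", "Date"}, Shifted, {"SKU", "NextYearDate"}, "Prev", JoinKind.LeftOuter),
    Expanded = Table.ExpandTableColumn(Merged, "Prev", {"Amount"}, {"Prev.Amount"}),
    // Percentage change; null when there is no prior-year row
    Change = Table.AddColumn(Expanded, "Change", each if [Prev.Amount] = null then null else ([Amount] - [Prev.Amount]) / [Prev.Amount], Percentage.Type)
in
    Change
```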
And the final output is as below:
To filter the 10 biggest variations, append the following steps. Note that they must sort the final step, #"Changed Type2", not Source, and the query's in clause should then return Top10Rows:
    SortedTable = Table.Sort(#"Changed Type2",{{"Custom", Order.Descending}}),
    Top10Rows = Table.FirstN(SortedTable, 10)
(If large decreases should count as big variations too, sort by the absolute value, e.g. each Number.Abs([Custom]), instead.)
Regarding creating a new table with duplicates eliminated, you can select the relevant column(s) in the Power Query editor and choose Remove Duplicates directly.
Best Regards,
Dino Tao
If this post helps, then please consider Accept it as the solution to help the other members find it more quickly.
Thanks a lot!!
Your solution helped!