Hoping I can gain some traction here from a dedicated Power Query expert, as I'd like to do this in Power Query.
I'd like to round to the nearest value based on values from another table and column.
The value may fall between two values, and I would like the value to round to the nearest one.
Table 1
| Location | Round to Nearest Value |
| --- | --- |
| A | 2 |
| B | 3 |
| C | 29 |
| D | 15 |
| E | 20 |
| F | 27 |
| G | 11 |
| H | 7 |
| I | 9 |
Table 2 below is the reference table used to round to the nearest value.
If a value from Table 1 above falls between two of the values below, it should round to the nearer of the two.
Table 2
0
5
10
20
25
30
Table 3 is the desired output.
For example, location A's value 2 falls between 0 and 5 and is rounded to 0; location F's value 27 falls between 25 and 30 and is rounded to 25.
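As a quick worked check of that rule: |27 - 25| = 2 while |27 - 30| = 3, so 25 is the nearer candidate and F rounds to 25. Note that location D (value 15) is exactly 5 away from both 10 and 20; the desired output below breaks that tie upward to 20, a point that comes up again further down the thread.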
Table 3
| Location | Value | Rounded to Nearest (Return this Column) |
| --- | --- | --- |
| A | 2 | 0 |
| B | 3 | 5 |
| C | 29 | 30 |
| D | 15 | 20 |
| E | 20 | 20 |
| F | 27 | 25 |
| G | 11 | 10 |
| H | 7 | 5 |
| I | 9 | 10 |
I hope I can get some help from a seasoned Power Query expert, as it's a perplexing problem for me. Maybe even for the seasoned PQ user!
Thanks!
Solved! Accepted solution, from @Vijay_A_Verma:
Solution Excel @ https://1drv.ms/x/s!Akd5y6ruJhvhuWkS2pLB0BHnAigv?e=DDSrQv
Use this for Table 1. To see it working, open a blank query, go to Home > Advanced Editor, remove everything there, and paste the code below to test. (Later, when you use the query on your own dataset, you will have to change the Source step appropriately. If you have columns other than these, delete the Changed Type step and redo a Changed Type for the complete table from the UI.)
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WclTSUTJSitWJVnICsozBLGeQmCWY6QJkGpqCma4gUQMw0w3ENAcz3UEKDMFMDyATIugJZAH1xwIA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Location = _t, Value = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Location", type text}, {"Value", Int64.Type}}),
#"Added Custom" = Table.AddColumn(#"Changed Type", "Low", each Table.Last(Table.SelectRows(Table2, (x)=> x[Column1]<=[Value]))[Column1]),
#"Added Custom1" = Table.AddColumn(#"Added Custom", "High", each Table.First(Table.SelectRows(Table2, (x)=> x[Column1]>=[Value]))[Column1]),
#"Added Custom2" = Table.AddColumn(#"Added Custom1", "Result", each if Number.Abs([Value]-[Low])>=Number.Abs([Value]-[High]) then [High] else [Low]),
#"Removed Columns" = Table.RemoveColumns(#"Added Custom2",{"Low", "High"})
in
#"Removed Columns"
If you need test code for Table2
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlCK1YlWMgWThhCOEZSCCBoDebEA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Column1 = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Column1", Int64.Type}})
in
#"Changed Type"
Hi @roncruiser,
Sorry for the late reply. It has been a crazy few days.
@Vijay_A_Verma's method is very clean and quick if you have a large dataset to handle.
Below are 2 different approaches:
1. Embedded calculation method
let
//Get Table 1
Source = Web.Page(Web.Contents("https://community.powerbi.com/t5/Power-Query/Rounding-to-the-nearest-Value-Based-on-Values-from-anot...")),
Table1 = Table.PromoteHeaders(Source{0}[Data]),
#"Changed Type" = Table.TransformColumnTypes(Table1,{{"Round to Nearest Value", Int64.Type}}),
//Get Table 2
GetTbl2 = Table.AddColumn(#"Changed Type", "Table 2 Value", each #"Table 2"),
//Add value to Table 2 for calculation
EmbedTbl2 = Table.AddColumn(GetTbl2, "Add Value", each Table.AddColumn([Table 2 Value], "Value", (r)=>[Round to Nearest Value],Int64.Type)),
//Calculate the difference between the value and each Table 2 entry (no negative values)
CalcVar = Table.AddColumn(EmbedTbl2, "Calc", each Table.AddColumn([Add Value], "Result", each Number.Abs([Value]-[Column1]))),
//Get the min number from the calculation
GetMin = Table.AddColumn(CalcVar, "GetMin", each Table.Min([Calc],"Result")),
//Expand the table 2 value (i.e. the nearest value)
#"Expanded GetMin" = Table.ExpandRecordColumn(GetMin, "GetMin", {"Column1"}, {"Column1"})
in
#"Expanded GetMin"
2. Traditional method (Group & Merge)
let
//Get Table 1
Source = Web.Page(Web.Contents("https://community.powerbi.com/t5/Power-Query/Rounding-to-the-nearest-Value-Based-on-Values-from-anot...")),
Table1 = Table.PromoteHeaders(Source{0}[Data]),
#"Changed Type" = Table.TransformColumnTypes(Table1,{{"Round to Nearest Value", type text}}),
//Use Fuzzy merge to get table 2 (Value has to be defined in text for merge)
#"Merged Queries" = Table.FuzzyNestedJoin(#"Changed Type", {"Round to Nearest Value"}, #"Table 2", {"Column1"}, "Table 2", JoinKind.LeftOuter, [IgnoreCase=true, IgnoreSpace=true]),
#"Expanded Table 2" = Table.ExpandTableColumn(#"Merged Queries", "Table 2", {"Column1"}, {"Column1"}),
//Change value to number
#"Changed Type1" = Table.TransformColumnTypes(#"Expanded Table 2",{{"Round to Nearest Value", type number}, {"Column1", type number}}),
//Calculate the difference between each Table 2 value and the Table 1 value
Result = Table.AddColumn(#"Changed Type1", "Result", each Number.Abs([Column1]-[Round to Nearest Value])),
//Group to get min
#"Grouped Rows" = Table.Group(Result, {"Location", "Round to Nearest Value"}, {{"Min", each List.Min([Result]), type number}}),
//Merge previous step - #"Grouped Rows" with step - Result
#"Merged Queries1" = Table.NestedJoin(#"Grouped Rows", {"Round to Nearest Value", "Min"}, Result, {"Round to Nearest Value", "Result"}, "Grouped Rows", JoinKind.LeftOuter),
#"Expanded Grouped Rows" = Table.ExpandTableColumn(#"Merged Queries1", "Grouped Rows", {"Column1"}, {"Column1"})
in
#"Expanded Grouped Rows"
Regards
KT
Thank You.
Yes, it has been a crazy couple of weeks! @Vijay_A_Verma's solution worked well for me.
I'll give yours a try as well. It looks very interesting at a top level. I've not used fuzzy merging with Power Query before. It's good to have options!
Hi @roncruiser,
I have roughly written some code to get a solution, and there is one thing I want to clarify. Looking at the data, location D's value (15) sits exactly between 10 and 20, so its nearest value could be either one. Would you like to round up to the higher value?
Regards
KT
@KT_Bsmart2gethe
That is a very good question; it depends on the content of the overall data and the value itself. For now, rounding up would be safer to implement.
Thank You KT!
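For what it's worth, the accepted solution already behaves this way: in its Result step, the comparison Number.Abs([Value]-[Low]) >= Number.Abs([Value]-[High]) sends an exact tie to [High], i.e. it rounds up. If ties ever needed to go down instead, changing >= to > should be the only edit, roughly like this untested one-line sketch:
#"Added Custom2" = Table.AddColumn(#"Added Custom1", "Result", each if Number.Abs([Value]-[Low]) > Number.Abs([Value]-[High]) then [High] else [Low]),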
Hi @roncruiser ,
I guess the question is whether your example data is a fair reflection of the quantity of rounding values you want to use.
If it is, then a simple IF statement would be the smart way to go:
if [value] < 2.5 then 0
else if [value] >= 2.5 and [value] < 7.5 then 5
else if...
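To make that concrete for the six example values in Table 2 (and only those six), the whole chain could be written as a custom column roughly like the sketch below. This is just an illustration: #"Previous Step" stands in for whatever the prior step is actually named, [Value] is assumed to be the column holding the number to round, the cut points are the midpoints between adjacent Table 2 values, and the strict < on each test means an exact midpoint such as 15 lands in the higher bucket, matching the round-up preference stated earlier in the thread.
#"Added Nearest" = Table.AddColumn(#"Previous Step", "Rounded to Nearest", each
    // Hard-coded midpoints for the example Table 2 values {0, 5, 10, 20, 25, 30}
    if [Value] < 2.5 then 0
    else if [Value] < 7.5 then 5
    else if [Value] < 15 then 10
    else if [Value] < 22.5 then 20
    else if [Value] < 27.5 then 25
    else 30, Int64.Type)
Against the sample data this reproduces Table 3, but it stops being practical once Table 2 grows to hundreds of values, as discussed below.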
Pete
Proud to be a Datanaut!
@BA_Pete
My next question would be how to reference table 2 for each row of table 1 to implement the IF statement.
I have not figured that part out yet. I tried the merge function but that still stumps me. I'm still going at it. No luck yet. Maybe the merge function is not the correct route.
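For anyone stuck on the same point: a custom column can reference another query by name directly, no merge needed. Below is a minimal sketch of the idea, assuming Table 2 is loaded as a query named #"Table 2" with its numbers in Column1, that the Table 1 column to round is called Value, and with #"Previous Step" standing in for whatever step comes before it.
#"Added Nearest" = Table.AddColumn(#"Previous Step", "Rounded to Nearest", each
    let
        // Candidates sorted descending so an exact tie resolves to the higher value
        candidates = List.Sort(#"Table 2"[Column1], Order.Descending),
        // Absolute distance from this row's Value to every candidate
        diffs = List.Transform(candidates, (c) => Number.Abs(c - [Value])),
        // Position of the first smallest distance
        pos = List.PositionOf(diffs, List.Min(diffs))
    in
        candidates{pos})
The accepted solution above does the equivalent per-row lookup with Table.SelectRows, so this is just another way to picture it.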
Hi @roncruiser ,
My suggestion was to not reference table2 at all.
If table2 only actually contained the 6 values you showed in your example data, then an if statement would be the simplest way to go. Based on the fact that table2 can hold 150-300 conditional values, I'd recommend looking into Vijay's solution.
Pete
Proud to be a Datanaut!
@BA_Pete
The quantity of rounding values shown for the reference table, Table 2, is relatively accurate. The actual number of values in that column will likely not exceed 150, and even in rare cases it will not exceed 300.
Table 1 may exceed 500 rows but will not go beyond 1,000.
Thanks for the quick response. Will respond again soon.