=IFERROR(IF(SUMIFS(Amountindoccurr:Amountindoccurr,Remove Alphapets:Remove Alphapets,Remove Alphapets2)>ABS(INDEX(Reference:Remove Alphapets,MATCH(Remove Alphapets2,Remove Alphapets:Remove Alphapets,0),1)),"High Priority",""),"")
The Excel formula is =IFERROR(IF(ABS(SUMIFS(M:M,U:U,U2))>ABS(INDEX(M:U,MATCH(U2,U:U,0),1)),"High Priority",""),"") and I need to apply the same logic in Power Query (a direct-translation sketch follows the sample data below).
| Reference | Amountindoccurr | Column1 | Remove Alphapets | Sumif Amount | Amountcopy |
| --- | --- | --- | --- | --- | --- |
| 128404JAIMES | -5400 | 1284045400 | 5400 | | |
| 128404JAIMES | 5400 | 1284045400 | 5400 | | |
| 7001587798 | 7975.5 | 1284047975.5 | High Priority | 7975.5 | |
| 7001587798 | 7975.5 | 1284047975.5 | High Priority | 7975.5 | |
| 7001587798 | 7975.5 | 1284047975.5 | High Priority | 7975.5 | |
| 11332716 | -5018.4 | 1284045018.4 | 5018.4 | | |
| 11332716 | -5018.4 | 1284045018.4 | 5018.4 | | |
| | -10036.8 | 12840410036.8 | 10036.8 | | |
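A direct Power Query translation of that Excel comparison (the SUMIFS group total versus the first matching amount) could look like the sketch below. It is only a sketch: it assumes column M is Amountindoccurr and column U is Remove Alphapets, and it uses a small inline sample in place of the real data.
let
    // inline sample standing in for the posted data (values taken from the table above)
    Source = #table(
        type table [Reference = text, Amountindoccurr = number, #"Remove Alphapets" = number],
        {
            {"128404JAIMES", -5400, 1284045400},
            {"128404JAIMES", 5400, 1284045400},
            {"7001587798", 7975.5, 1284047975.5},
            {"7001587798", 7975.5, 1284047975.5},
            {"7001587798", 7975.5, 1284047975.5}
        }),
    // SUMIFS(M:M,U:U,U2)            -> total Amountindoccurr per Remove Alphapets group
    // INDEX(M:U,MATCH(U2,U:U,0),1)  -> Amountindoccurr of the first row of that group (assumed reading)
    Grouped = Table.Group(Source, {"Remove Alphapets"}, {
        {"GroupSum", each List.Sum([Amountindoccurr]), type nullable number},
        {"FirstAmount", each List.First([Amountindoccurr]), type nullable number},
        {"Data", each _, type table}
    }),
    Expanded = Table.ExpandTableColumn(Grouped, "Data", {"Reference", "Amountindoccurr"}),
    // IF(ABS(group sum) > ABS(first amount), "High Priority", "")
    Flagged = Table.AddColumn(Expanded, "Comment",
        each if Number.Abs([GroupSum]) > Number.Abs([FirstAmount]) then "High Priority" else null,
        type nullable text)
in
    Flagged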
Hi @KuntalSingh
Thanks for your reply. You can try the following code.
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMjSyMDEw8XL09HUNVtJR0jU1MTAA0hBhKAdMxepgKMar1tzAwNDUwtzc0gIoam5pbqpnClcL50IZ9FBvaGhsbGRuaAb2pIGhhZ4JwukwLpRBlnIjU2MTqI1ACoTQxc1g4rEA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Reference = _t, Amountindoccurr = _t, #"Remove Alphapets" = _t, Amountcopy = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Reference", type text}, {"Amountindoccurr", type number}, {"Remove Alphapets", type number}, {"Amountcopy", type number}}),
#"Changed Type1" = Table.TransformColumnTypes(#"Changed Type",{{"Amountindoccurr", type text}}),
Custom1 = Table.TransformColumns(#"Changed Type1",{{"Amountindoccurr",each if Text.Contains(_,"-") then _ else "+"&_ }}),
#"Grouped Rows" = Table.Group(Custom1, {"Reference"}, {{"Data", each Text.Combine([Amountindoccurr],",")},{"Tables", each _, type table [Reference=nullable text, Amountindoccurr=text, Remove Alphapets=nullable number, Amountcopy=nullable number]},{"Counts", each Table.RowCount(_)}}),
#"Expanded Tables" = Table.ExpandTableColumn(#"Grouped Rows", "Tables", {"Amountindoccurr", "Remove Alphapets", "Amountcopy"}, {"Amountindoccurr", "Remove Alphapets", "Amountcopy"}),
#"Added Custom" = Table.AddColumn(#"Expanded Tables", "Custom", each let a=Text.Split([Data],","),
b=[Amountindoccurr],
c=List.Count(List.Select(a, each Text.Contains(_,b))),
d=if Text.Contains([Data],"-") and Text.Contains([Data],"+") then null else if [Counts]=c then "High Priority" else null
in d),
#"Removed Columns" = Table.RemoveColumns(#"Added Custom",{"Data"}),
#"Changed Type2" = Table.TransformColumnTypes(#"Removed Columns",{{"Amountindoccurr", Int64.Type}})
in
#"Changed Type2"
Output
Best Regards!
Yolo Zhu
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
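To see how the Custom step behaves on the sample data, here is a small standalone trace of the same decision. The helper function and the hard-coded values are only for illustration; they mirror the three inputs the step uses (the combined group text, the group row count, and the number of matching entries).
let
    // same decision as the Custom step above, written as a tiny helper (illustration only)
    Flag = (data as text, counts as number, matches as number) as nullable text =>
        if Text.Contains(data, "-") and Text.Contains(data, "+") then null   // mixed signs -> no flag
        else if counts = matches then "High Priority" else null,             // all entries share the same signed amount
    Mixed   = Flag("-5400,+5400", 2, 1),              // 128404JAIMES -> null
    Uniform = Flag("+7975.5,+7975.5,+7975.5", 3, 3)   // 7001587798  -> "High Priority"
in
    [Mixed = Mixed, Uniform = Uniform]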
Hi @KuntalSingh ,
Based on your formula
INDEX(Reference:Remove Alphapets,MATCH(Remove Alphapets2,Remove Alphapets:Remove Alphapets,0),1)
it will return "128404JAIMES". Is that wrong?
Best Regards!
Yolo Zhu
I need the following:
If the same reference number has both + and - values in the Amountindoccurr column, then the comment is null; if the same reference has only + or only - values in Amountindoccurr, then the comment is "High Priority".
Hi @KuntalSingh
Thanks for your quick reply. Based on your description, should 11332716 be "High Priority"? It has only "-" values.
Best Regards!
Yolo Zhu
Yes please
Hi @KuntalSingh
You can create a blank query and paste the following code into the Advanced Editor.
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMjSyMDEw8XL09HUNVtJR0jU1MTAA0hBhKAdMxepgKMar1tzAwNDUwtzc0gIoam5pbqpnClcL50IZ9FBvaGhsbGRuaAb2pIGhhZ4JwukwLpRBqvJYAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Reference = _t, Amountindoccurr = _t, #"Remove Alphapets" = _t, Amountcopy = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Reference", type text}, {"Amountindoccurr", type number}, {"Remove Alphapets", type number}, {"Amountcopy", type number}}),
#"Added Custom" = Table.AddColumn(#"Changed Type", "Custom", each let a=[Reference],
b=Table.SelectRows(#"Changed Type",each [Reference]=a)[Amountindoccurr],
c=List.Count(List.Select(b,each _>0)),
d=List.Count(List.Select(b,each _<0)),
e=List.Count(b)
in if c=e or d=e then "High Priority" else null)
in
#"Added Custom"
Output
Best Regards!
Yolo Zhu
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
I used the above code, but it has kept loading for the last 20 minutes and the data size increased from MB to GB. Please help optimise the loading time, as we have more than 60K rows.
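The slowdown comes from the Custom step calling Table.SelectRows once per row, so the whole table is re-scanned for every one of the 60K rows. Buffering the table once with Table.Buffer reduces the repeated re-evaluation of the upstream steps, although the grouped approach in the next reply removes the per-row scan altogether. A minimal sketch of the buffered variant, with an inline sample standing in for the real source:
let
    // inline sample standing in for the real source (values from the thread)
    Source = #table(
        type table [Reference = text, Amountindoccurr = number],
        {
            {"128404JAIMES", -5400},
            {"128404JAIMES", 5400},
            {"11332716", -5018.4},
            {"11332716", -5018.4}
        }),
    // Table.Buffer keeps one copy of the table in memory so the per-row
    // Table.SelectRows below does not re-evaluate the upstream steps each time
    Buffered = Table.Buffer(Source),
    #"Added Custom" = Table.AddColumn(Buffered, "Custom", each
        let a = [Reference],
            b = Table.SelectRows(Buffered, each [Reference] = a)[Amountindoccurr],
            c = List.Count(List.Select(b, each _ > 0)),
            d = List.Count(List.Select(b, each _ < 0)),
            e = List.Count(b)
        in if c = e or d = e then "High Priority" else null)
in
    #"Added Custom"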
Hi @KuntalSingh
You can try the following code.
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMjSyMDEw8XL09HUNVtJR0jU1MTAA0hBhKAdMxepgKMar1tzAwNDUwtzc0gIoam5pbqpnClcL50IZ9FBvaGhsbGRuaAb2pIGhhZ4JwukwLpRBqvJYAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Reference = _t, Amountindoccurr = _t, #"Remove Alphapets" = _t, Amountcopy = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Reference", type text}, {"Amountindoccurr", type number}, {"Remove Alphapets", type number}, {"Amountcopy", type number}}),
#"Grouped Rows" = Table.Group(#"Changed Type", {"Reference"}, {{"Count>0", each List.Count(List.Select([Amountindoccurr],each _>0))}, {"Count<0", each List.Count(List.Select([Amountindoccurr],each _<0))},{"Count_rows", each Table.RowCount(_), Int64.Type}, {"Data", each _, type table [Reference=nullable text, Amountindoccurr=nullable number, Remove Alphapets=nullable number, Amountcopy=nullable number]}}),
#"Expanded Data" = Table.ExpandTableColumn(#"Grouped Rows", "Data", {"Amountindoccurr", "Remove Alphapets", "Amountcopy"}, {"Amountindoccurr", "Remove Alphapets", "Amountcopy"}),
#"Added Custom" = Table.AddColumn(#"Expanded Data", "Custom", each if [#"Count>0"]=[Count_rows] or [#"Count<0"]=[Count_rows] then "High Priority" else null),
#"Removed Columns" = Table.RemoveColumns(#"Added Custom",{"Count>0", "Count<0", "Count_rows"})
in
#"Removed Columns"let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMjSyMDEw8XL09HUNVtJR0jU1MTAA0hBhKAdMxepgKMar1tzAwNDUwtzc0gIoam5pbqpnClcL50IZ9FBvaGhsbGRuaAb2pIGhhZ4JwukwLpRBqvJYAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Reference = _t, Amountindoccurr = _t, #"Remove Alphapets" = _t, Amountcopy = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Reference", type text}, {"Amountindoccurr", type number}, {"Remove Alphapets", type number}, {"Amountcopy", type number}}),
#"Grouped Rows" = Table.Group(#"Changed Type", {"Reference"}, {{"Count>0", each List.Count(List.Select([Amountindoccurr],each _>0))}, {"Count<0", each List.Count(List.Select([Amountindoccurr],each _<0))},{"Count_rows", each Table.RowCount(_), Int64.Type}, {"Data", each _, type table [Reference=nullable text, Amountindoccurr=nullable number, Remove Alphapets=nullable number, Amountcopy=nullable number]}}),
#"Expanded Data" = Table.ExpandTableColumn(#"Grouped Rows", "Data", {"Amountindoccurr", "Remove Alphapets", "Amountcopy"}, {"Amountindoccurr", "Remove Alphapets", "Amountcopy"}),
#"Added Custom" = Table.AddColumn(#"Expanded Data", "Custom", each if [#"Count>0"]=[Count_rows] or [#"Count<0"]=[Count_rows] then "High Priority" else null),
#"Removed Columns" = Table.RemoveColumns(#"Added Custom",{"Count>0", "Count<0", "Count_rows"})
in
#"Removed Columns"
Best Regards!
Yolo Zhu
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
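The grouped check above can also be written with Number.Sign instead of separate positive and negative counts; a minimal sketch, with an inline sample standing in for the real source (step and column names are assumptions):
let
    // inline sample standing in for the real source (values from the thread)
    Source = #table(
        type table [Reference = text, Amountindoccurr = number],
        {
            {"128404JAIMES", -5400},
            {"128404JAIMES", 5400},
            {"7001587798", 7975.5},
            {"7001587798", 7975.5},
            {"11332716", -5018.4},
            {"11332716", -5018.4}
        }),
    Grouped = Table.Group(Source, {"Reference"}, {
        // distinct signs of the amounts per Reference: {-1}, {1}, or a mix
        {"Signs", each List.Distinct(List.Transform([Amountindoccurr], Number.Sign)), type list},
        {"Data", each _, type table}
    }),
    Expanded = Table.ExpandTableColumn(Grouped, "Data", {"Amountindoccurr"}),
    // exactly one distinct sign -> only "+" or only "-" amounts -> High Priority
    Flagged = Table.AddColumn(Expanded, "Custom",
        each if List.Count([Signs]) = 1 then "High Priority" else null, type nullable text),
    Result = Table.RemoveColumns(Flagged, {"Signs"})
in
    Result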
Please help with one more piece of logic:
If the same number in the CC column has both + and - values in the Amountindoccurr column (they nullify), then the comment is blank; if the same number in the CC column has only + or only - values in Amountindoccurr, then the comment is "High Priority".
Excel formula:
1. Apply a formula in column CC to remove alphabets from Reference (column L) and concatenate it with ABS(amount in document currency). After that, please implement your logic to get the output (a sketch follows the sample data below).
Scripts | Vendor | VendorName | BusinessArea | CompanyCode | FiscalYear | DocumentType | DocumentDate | PostingKey | PostingDate | DocumentNumber | Reference | Amountindoccurr | DocumentCurrency | Amountinlocalcurr | LocalCurrency | ClearingDocument | ClearingDate | Text | NetDueDate | Concatenation | CC |
S(25-60)-1 | 2004235596 | G&G OUTFITTERS INC | US05 | 2024 | RE | 45331 | 31 | 45530 | 5.1E+09 | 242772 | -166.75 | USD | -166.75 | USD | 4/9/2024 | 51001530992024US05 | 242772166.75 | ||||
S(25-60)-1 | 2000011986 | G&G OUTFITTERS INC | US05 | 2024 | RE | 45331 | 31 | 45338 | 5.1E+09 | 242772 | -166.75 | USD | -166.75 | USD | 2000546772 | 4/14/2024 | 4/9/2024 | 51000427342024US05 | 242772166.75 | ||
S(25-60)-1 | 2003871958 | CINTAS CORPORATION N | US01 | 2024 | RE | 45293 | 31 | 45530 | 5.1E+09 | 4178789859 | -205.94 | USD | -205.94 | USD | 3/2/2024 | 51018370002024US01 | 4178789859205.94 | ||||
S(25-60)-1 | 2000029014 | CINTAS CORPORATION 100 | US01 | 2024 | RE | 45293 | 31 | 45510 | 5.1E+09 | 4178789859 | -205.94 | USD | -205.94 | USD | 3/2/2024 | 51005247402024US01 | 4178789859205.94 | ||||
S(25-60)-1 | 2003871958 | CINTAS CORPORATION N | US01 | 2024 | RE | 45307 | 31 | 45530 | 5.1E+09 | 4180383750 | -209.39 | USD | -209.39 | USD | 3/16/2024 | 51000625092024US01 | 4180383750209.39 | ||||
S(25-60)-1 | 2000011956 | CINTAS CORPORATION NO 2 | US01 | 2024 | RE | 45307 | 31 | 45506 | 5.1E+09 | 4180383750 | -209.39 | USD | -209.39 | USD | 3/16/2024 | 51009005242024US01 | 4180383750209.39 | ||||
S(25-60)-2 | 2000014020 | NYSCO PRODUCTS LLC | US05 | 2024 | RE | 44592 | 21 | 45530 | 5.1E+09 | 0236725-IN | 91.4 | USD | 91.4 | USD | VR:Full Accrual | 4/1/2022 | 51008240022024US05 | 0236725-IN91.4 | |||
S(25-60)-2 | 2000014020 | NYSCO PRODUCTS LLC | US05 | 2024 | RE | 44592 | 31 | 45457 | 5.1E+09 | 0236725-IN | -91.4 | USD | -91.4 | USD | 51010235452024US05 | 0236725-IN91.4 | |||||
S(25-60)-2 | 2000022228 | W W GRAINGER | US01 | 2024 | RE | 45007 | 21 | 45530 | 5.1E+09 | 9648709799 | 217.97 | USD | 217.97 | USD | VR:Pay-as-billed | 7/20/2023 | 51003365112024US01 | 9648709799217.97 | |||
S(25-60)-2 | 2000022228 | W W GRAINGER | US01 | 2024 | RE | 45007 | 31 | 45422 | 5.1E+09 | 9648709799 | -217.97 | USD | -217.97 | USD | VR:Pay-as-billed | 7/20/2023 | 51012950142024US01 | 9648709799217.97 | |||
S(25-60)-2 | 2004015811 | CINTAS CORPORATION N | US05 | 2024 | RE | 45281 | 31 | 45530 | 5.1E+09 | I312210921 | -234.72 | USD | -234.72 | USD | VR:Pay-as-billed | 2/19/2024 | 51024855002024US05 | I312210921234.72 | |||
S(25-60)-2 | 2000031331 | CINTAS FIRE PROTECTION | US24 | 2024 | RE | 45281 | 21 | 45320 | 5.1E+09 | I312210921 | 234.72 | USD | 234.72 | USD | 100831507 | 4/3/2024 | VR:Pay-as-billed | 2/19/2024 | 51000080072024US24 | I312210921234.72 | |
S(25-60)-2 | 2000031331 | CINTAS FIRE PROTECTION | US24 | 2024 | RE | 45281 | 31 | 45299 | 5.1E+09 | I312210921 | -234.72 | USD | -234.72 | USD | 100831507 | 4/3/2024 | VR:Pay-as-billed | 2/19/2024 | 51000540022024US24 | I312210921234.72 | |
S(25-60)-2 | 2000022228 | W W GRAINGER | US01 | 2024 | RE | 45281 | 31 | 45530 | 5.1E+09 | 9943375106 | -207.97 | USD | -207.97 | USD | 4/19/2024 | 51003750022024US01 | 9943375106207.97 | ||||
S(25-60)-2 | 2000022228 | W W GRAINGER | US01 | 2023 | RE | 45281 | 31 | 45288 | 5.1E+09 | 9943375106 | -207.97 | USD | -207.97 | USD | 4/19/2024 | 51013625832023US01 | 9943375106207.97 | ||||
S(25-60)-2 | 2000022228 | W W GRAINGER | US01 | 2024 | RE | 45281 | 31 | 45457 | 5.1E+09 | 9943375106 | -207.97 | USD | -207.97 | USD | 4/19/2024 | 51011440152024US01 | 9943375106207.97 | ||||
S(25-60)-2 | 2000022228 | W W GRAINGER | US01 | 2024 | RE | 45281 | 21 | 45488 | 5.1E+09 | 9943375106 | 207.97 | USD | 207.97 | USD | MR8M | 4/19/2024 | 51005320102024US01 | 9943375106207.97 | |||
S(25-60)-2 | 2000022228 | W W GRAINGER | US01 | 2024 | RE | 45281 | 21 | 45427 | 5.1E+09 | 9943375106 | 207.97 | USD | 207.97 | USD | MR8M | 4/19/2024 | 51006805052024US01 | 9943375106207.97 | |||
S(25-60)-2 | 2003871958 | CINTAS CORPORATION N | US01 | 2024 | RE | 45307 | 31 | 45530 | 5.1E+09 | 4180383750 | -209.39 | USD | -209.39 | USD | 3/16/2024 | 51000625092024US01 | 4180383750209.39 | ||||
S(25-60)-2 | 2000011956 | CINTAS CORPORATION NO 2 | US01 | 2024 | RE | 45307 | 31 | 45506 | 5.1E+09 | 4180383750 | -209.39 | USD | -209.39 | USD | 3/16/2024 | 51009005242024US01 | 4180383750209.39 | ||||
S(25-60)-2 | 2000011956 | CINTAS CORPORATION NO 2 | US01 | 2024 | RE | 45307 | 21 | 45507 | 5.1E+09 | 4180383750 | 209.39 | USD | 209.39 | USD | MR8M | 3/16/2024 | 51014410102024US01 | 4180383750209.39 | |||
S(25-60)-2 | 2000011956 | CINTAS CORPORATION NO 2 | US01 | 2024 | RE | 45307 | 31 | 45507 | 5.1E+09 | 4180383750 | -209.39 | USD | -209.39 | USD | 2000118010 | 8/4/2024 | 3/16/2024 | 51005250672024US01 | 4180383750209.39 | ||
S(25-60)-2 | 2000011956 | CINTAS CORPORATION NO 2 | US01 | 2024 | RE | 45307 | 21 | 45507 | 5.1E+09 | 4180383750 | 209.39 | USD | 209.39 | USD | 2000118010 | 8/4/2024 | MR8M | 3/16/2024 | 51008885072024US01 | 4180383750209.39 | |
S(25-60)-2 | 2000029014 | CINTAS CORPORATION 100 | US01 | 2024 | RE | 45307 | 31 | 45510 | 5.1E+09 | 4180383750 | -209.39 | USD | -209.39 | USD | 3/16/2024 | 51004805942024US01 | 4180383750209.39 |
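A sketch of that CC helper column and the nullify check in Power Query, assuming "remove alphabets" means keeping only the digits of Reference, and using an inline sample in place of the full table above:
let
    // inline sample standing in for the full table above (assumption)
    Source = #table(
        type table [Reference = text, Amountindoccurr = number],
        {
            {"0236725-IN", 91.4},
            {"0236725-IN", -91.4},
            {"4180383750", -209.39},
            {"4180383750", -209.39}
        }),
    // CC = Reference with alphabets removed (digits kept) & ABS(Amountindoccurr)
    AddCC = Table.AddColumn(Source, "CC", each
        Text.Select([Reference], {"0".."9"}) & Text.From(Number.Abs([Amountindoccurr])), type text),
    // per CC: both "+" and "-" amounts (they nullify) -> blank; only one sign -> High Priority
    Grouped = Table.Group(AddCC, {"CC"}, {
        {"Count>0", each List.Count(List.Select([Amountindoccurr], each _ > 0)), Int64.Type},
        {"Count<0", each List.Count(List.Select([Amountindoccurr], each _ < 0)), Int64.Type},
        {"Count_rows", each Table.RowCount(_), Int64.Type},
        {"Data", each _, type table}
    }),
    Expanded = Table.ExpandTableColumn(Grouped, "Data", {"Reference", "Amountindoccurr"}),
    Flagged = Table.AddColumn(Expanded, "Comment", each
        if [#"Count>0"] = [Count_rows] or [#"Count<0"] = [Count_rows] then "High Priority" else null,
        type nullable text),
    Result = Table.RemoveColumns(Flagged, {"Count>0", "Count<0", "Count_rows"})
in
    Result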
Column M is reference and column U is Remove Alphapets
Is the result of INDEX(M:U,MATCH(U2,U:U,0),1) equal to the result of N2?
If the same reference number has both + and - values in the Amountindoccurr column, then the comment is null; if the same reference has only + or only - values in Amountindoccurr, then the comment is "High Priority".
It is not clear which columns of your table correspond to column M:M or U:U in Excel.