Hi,
I have the following columns in my data table: Order, Item, Code, Qty, Country Code and Tax Code.
I am trying to summarise the sum of Qty for each Order, Item and Code. Based on that sum of Qty, I then want to get the max Qty per Item and Code in a new table.
Also, in the new table I want to get the most frequent Country Code and Tax Code.
Can you please suggest options for this as DAX code (new table or measure) and in Power Query?
https://www.dropbox.com/s/5w2hrl2zp5s6dvx/Summarise.pbix?dl=0
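(For reference, a minimal sketch of the first step, the per-Order/Item/Code Qty totals, written as a calculated table. The table name 'data' and the column names are taken from the post above; the exact shape may need adjusting to the attached .pbix.)

Order Totals =
ADDCOLUMNS (
    SUMMARIZE ( data, data[Order], data[Item], data[Code] ),
    -- context transition: sums Qty for the current Order/Item/Code group
    "Sum Of Qty", CALCULATE ( SUM ( data[Qty] ) )
)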
Thanks to the great efforts by MS engineers to simplify syntax of DAX! Most beginners are SUCCESSFULLY MISLED to think that they could easily master DAX; but it turns out that the intricacy of the most frequently used RANKX() is still way beyond their comprehension! |
DAX is simple, but NOT EASY! |
Hi, thanks for your reply and help.
Can you please advise how I can get the same result using the new table option (DAX), and in Power Query as well?
I would rather not do the calculation in Power Query, though.
Thanks to the great efforts by MS engineers to simplify syntax of DAX! Most beginners are SUCCESSFULLY MISLED to think that they could easily master DAX; but it turns out that the intricacy of the most frequently used RANKX() is still way beyond their comprehension! |
DAX is simple, but NOT EASY! |
Hi, I created a Sum Of Qty column in the data table using the following DAX:
Sum Of Qty =
CALCULATE (
    SUM ( data[Qty] ),
    FILTER ( data,
        data[Item] = EARLIER ( data[Item] )
            && data[Order] = EARLIER ( data[Order] ) ),
    ALLEXCEPT ( data, data[Item] )
)
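(Side note: the same row-level total can usually be written without EARLIER. A minimal sketch, assuming the intent is the total Qty for the current row's Order and Item combination:)

Sum Of Qty (alt) =
-- CALCULATE turns the current row into a filter context; ALLEXCEPT then keeps
-- only the Order and Item filters, so the SUM runs over all rows of the same
-- Order/Item combination.
CALCULATE (
    SUM ( data[Qty] ),
    ALLEXCEPT ( data, data[Order], data[Item] )
)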
Now I have Sum Of Qty at row level, and then I created a new table using this DAX code:
Table2 =
SUMMARIZE (
    data,
    data[Item], data[Code],
    "Max Qty", MAX ( data[Sum Of Qty] ),
    "Country Code", MAX ( data[Country Code] ),
    "Tax Code", MAX ( data[Tax Code] )
)
Attached is a snapshot for your reference.
I got the results (Table 2 snapshot) by creating additional columns in the data table, but I am unable to apply a 'no blanks' condition to the Country Code and Tax Code columns in the summary table.
I want the same result (Table 2 snapshot) without maintaining the additional columns in the data table, and with the non-blank condition applied in the summary table.
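(For what it's worth, here is one way this could be sketched without the helper column, keeping only non-blank Country and Tax Codes. This is only an illustration: it assumes the summary level is Item + Code, that "most frequent" means the non-blank value with the highest row count (ties resolved by MAX), and that the table and column names match the post; it has not been tested against the attached .pbix.)

Table2 sketch =
ADDCOLUMNS (
    SUMMARIZE ( data, data[Item], data[Code] ),
    "Max Qty",
        -- highest per-Order Qty total for the current Item/Code
        VAR _item = data[Item]
        VAR _code = data[Code]
        VAR _rows = FILTER ( data, data[Item] = _item && data[Code] = _code )
        RETURN
            MAXX (
                SUMMARIZE ( _rows, data[Order] ),
                VAR _order = data[Order]
                RETURN SUMX ( FILTER ( _rows, data[Order] = _order ), data[Qty] )
            ),
    "Country Code",
        -- most frequent non-blank Country Code for the current Item/Code
        VAR _item = data[Item]
        VAR _code = data[Code]
        VAR _rows =
            FILTER ( data,
                data[Item] = _item && data[Code] = _code
                    && NOT ISBLANK ( data[Country Code] ) )
        VAR _counts =
            ADDCOLUMNS (
                SUMMARIZE ( _rows, data[Country Code] ),
                "@n",
                    VAR _cc = data[Country Code]
                    RETURN COUNTROWS ( FILTER ( _rows, data[Country Code] = _cc ) )
            )
        RETURN MAXX ( TOPN ( 1, _counts, [@n] ), data[Country Code] ),
    "Tax Code",
        -- same pattern for the most frequent non-blank Tax Code
        VAR _item = data[Item]
        VAR _code = data[Code]
        VAR _rows =
            FILTER ( data,
                data[Item] = _item && data[Code] = _code
                    && NOT ISBLANK ( data[Tax Code] ) )
        VAR _counts =
            ADDCOLUMNS (
                SUMMARIZE ( _rows, data[Tax Code] ),
                "@n",
                    VAR _tc = data[Tax Code]
                    RETURN COUNTROWS ( FILTER ( _rows, data[Tax Code] = _tc ) )
            )
        RETURN MAXX ( TOPN ( 1, _counts, [@n] ), data[Tax Code] )
)

(Country Code and Tax Code are computed independently here; if they must come from the same source row, the pattern needs adjusting.)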
@CNENFRNL, thanks for your reply. How can I include the actual data column instead of using a measure column in the summary table? Please advise.
How do you define "most frequent" in the desired table?
Proud to be a Super User!
Table =
VAR _tabcountry =
    TOPN ( 1,
        FILTER (
            SUMMARIZE ( data, data[Item], data[Country Code], "_countcountry", COUNT ( data[Country Code] ) ),
            data[Country Code] <> BLANK () ),
        [_countcountry] )
VAR _tax =
    TOPN ( 1,
        FILTER (
            SUMMARIZE ( data, data[Item], data[Tax Code], "_counttax", COUNT ( data[Tax Code] ) ),
            data[Tax Code] <> BLANK () ),
        [_counttax] )
VAR _sum =
    TOPN ( 1, SUMMARIZE ( data, data[Order], data[Item], "_qty", SUM ( data[Qty] ) ), [_qty] )
RETURN
    NATURALINNERJOIN ( NATURALINNERJOIN ( _tabcountry, _tax ), _sum )
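(A note on this approach: NATURALINNERJOIN matches on the columns the tables have in common, which here is only data[Item]. Since each variable is reduced to a single top row by TOPN, the join only returns rows when those top rows happen to share the same Item, which may be why it comes back empty on some data.)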
Proud to be a Super User!
@andhiii079845,
I applied your code to the data table, and for some reason it returns blanks. I don't know where it went wrong. Can you please suggest an alternative solution?
Please share the data that generates the BLANK() results.
Proud to be a Super User!
@andhiii079845, it's not working with the sample data. Can you please share your output file?
https://www.dropbox.com/s/5w2hrl2zp5s6dvx/Summarise.pbix?dl=0
I created a new table, inserted the code and went to the Data view:
Proud to be a Super User!
@andhiii079845,
Thanks for your reply.
I added a couple of additional lines and it's not working.
Attached is the file for your reference.
https://www.dropbox.com/s/2wzxhrx9krn2t3z/Summarise%20%283%29.pbix?dl=0
@andhiii079845, thanks for your reply and help. I will check your formula and post my feedback.
I am sorry, I give up. I tried to find a sustainable solution (my first one doesn't work, I see), but after nearly an hour I think this problem needs to be approached on a bigger scale: divide the table into smaller tables and build a proper data model.
Proud to be a Super User!