Arnault_
Resolver III

Convert DAX to M

Hi All,

 

I am trying to move all my calculated columns to the Query Editor, which requires translating DAX statements to M. I am not sure it is always possible. Here is an example where I can't make it work, and your help would be much appreciated.

 

I have one order (order_id) that can be prepared in several sequences (packing_id) with different dates (TS05_packing_date). I would like to evaluate the different dates and determine whether each one corresponds to the first packing. However, I accept a tolerance of +8 days.

 

The formula below looks for the earliest date (TS05_packing_date) of a given order reference (order_id), then compares each packing date with this earliest date: if [packing date - earliest date] is <= 8 days, the result is "yes", otherwise "no".

 

Here is my formula in DAX (it works fine)

 

 

first_packing =
IF (
    'FACT Order'[TS05_packing_date]
        <= CALCULATE (
            MIN ( 'FACT Order'[TS05_packing_date] ) + 8;
            ALLEXCEPT ( 'FACT Order'; 'FACT Order'[order_id] )
        );
    "YES";
    "NO"
)

 

 

 

and here is the expected result

 

| order_id | packing_id | TS05_packing_date | first_packing |
| -------- | ---------- | ----------------- | ------------- |
| O001     | P001       | 01-Jan            | yes           |
| O001     | P002       | 02-Jan            | yes           |
| O001     | P003       | 03-Jan            | yes           |
| O001     | P004       | 12-Feb            | no            |
| O001     | P005       | 15-Mar            | no            |
| O002     | P006       | 15-Mar            | yes           |
| O002     | P007       | 25-Mar            | no            |

 

How could I translate the iteration into M? And how could I identify the earliest date of a particular order reference?

Thanks in advance for your help.

1 ACCEPTED SOLUTION
TomMartens
Super User

Hey @Arnault_ ,

first I had to convert the values in the column TS05_packing_date into something that Power Query recognizes as a date, so my table looks like this:

(screenshot)

Here is my solution that is based on two steps:

1. Determining the min date per order_id

For this I grouped the table by order_id, but also added the aggregation function "All Rows"

(screenshot)
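For reference, the Group By step above can also be typed directly in the Advanced Editor. A minimal sketch, assuming the previous step is named #"Changed Type" (step names are illustrative):

```
#"Grouped Rows" = Table.Group(
    #"Changed Type",
    {"order_id"},
    {
        // min packing date per order
        {"MinDate", each List.Min([TS05_packing_date]), type date},
        // the "All Rows" aggregation keeps the original rows as a nested table
        {"AllRows", each _, type table}
    }
)
```

Expanding the nested "AllRows" column then puts MinDate next to every original row.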

After expanding the group my table looks like this:

(screenshot)

2. Adding a custom column to check the date

I added a custom column "checkFirstDate" using this M expression:

if [TS05_packing_date] <= Date.AddDays([MinDate], 8) then "yes" else "no"
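As a full step in the Advanced Editor, this would look something like the following sketch (the step name #"Expanded AllRows" is an assumption for whichever step expanded the grouped rows):

```
#"Added Custom" = Table.AddColumn(
    #"Expanded AllRows",
    "checkFirstDate",
    // within 8 days of the order's earliest packing date -> "yes"
    each if [TS05_packing_date] <= Date.AddDays([MinDate], 8) then "yes" else "no",
    type text
)
```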

Finally the table looks like this; of course the column "MinDate" can be deleted from the table as an additional step:

(screenshot)

 

Hopefully this gives you some ideas on how to tackle your requirement.

 

Regards,

Tom



Did I answer your question? Mark my post as a solution, this will help others!

Proud to be a Super User!
I accept Kudos 😉
Hamburg, Germany


5 REPLIES

Hello. Can anyone help me convert the following DAX code to an M query? I have a dataset where "value" is cumulative for the year, i.e. YTD. I need the monthly value so that I can append this dataset to another table with the same variables, where value is monthly and NOT YTD. Thank you.

Month_value_calc =
VAR PREVValue =
    CALCULATE (
        MAX ( Consol_All_Yrs_Segment_Sector_Country[YTD_ZAR_Value] ),
        Consol_All_Yrs_Segment_Sector_Country[Country] = EARLIER ( Consol_All_Yrs_Segment_Sector_Country[Country] ),
        Consol_All_Yrs_Segment_Sector_Country[Financial metric] = EARLIER ( Consol_All_Yrs_Segment_Sector_Country[Financial metric] ),
        Consol_All_Yrs_Segment_Sector_Country[Segment_sector] = EARLIER ( Consol_All_Yrs_Segment_Sector_Country[Segment_sector] ),
        Consol_All_Yrs_Segment_Sector_Country[Year] = EARLIER ( Consol_All_Yrs_Segment_Sector_Country[Year] ),
        Consol_All_Yrs_Segment_Sector_Country[Month] = EARLIER ( Consol_All_Yrs_Segment_Sector_Country[Month] ) - 1,
        REMOVEFILTERS ( Consol_All_Yrs_Segment_Sector_Country )
    )
RETURN
    IF (
        MONTH ( Consol_All_Yrs_Segment_Sector_Country[Period] ) = 1,
        Consol_All_Yrs_Segment_Sector_Country[YTD_ZAR_Value],
        Consol_All_Yrs_Segment_Sector_Country[YTD_ZAR_Value] - PREVValue
    )
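The group-then-compute pattern used earlier in this thread can be adapted here as well. A hedged sketch, reusing the table and column names from the DAX above and assuming each Year group's first row (after sorting by Month) is the base month: group by every descriptive column except Month, sort each group, and subtract the previous row's YTD value:

```
let
    Source = Consol_All_Yrs_Segment_Sector_Country,
    Grouped = Table.Group(
        Source,
        {"Country", "Financial metric", "Segment_sector", "Year"},
        {{"Rows", (t) =>
            let
                Sorted  = Table.Sort(t, {{"Month", Order.Ascending}}),
                Indexed = Table.AddIndexColumn(Sorted, "Idx", 0, 1),
                // first month of the group keeps its YTD value,
                // later months subtract the previous month's YTD value
                Monthly = Table.AddColumn(Indexed, "Month_value_calc", each
                    if [Idx] = 0 then [YTD_ZAR_Value]
                    else [YTD_ZAR_Value] - Indexed{[Idx] - 1}[YTD_ZAR_Value]),
                Cleaned = Table.RemoveColumns(Monthly, {"Idx"})
            in
                Cleaned,
          type table}}
    ),
    Result = Table.Combine(Grouped[Rows])
in
    Result
```

Note this only reproduces the DAX exactly if every year starts at its first reported month; if months can be missing mid-year, the subtraction falls back to the last available month rather than Month - 1.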

Hi @TomMartens ,

Thank you very much for the explanations and the solution you suggested. It works.

Now I am wondering if it is better than my previous option, where it was done in a calculated column. Indeed, the example I shared was a very simplified dataset. Doing the same (group by, then expand) with a larger dataset takes much longer, so I am not sure it is better in terms of performance. What would you suggest: keeping it in a calculated column or doing it in the Query Editor?

Thank you again for your great support.

Hey @Arnault_ ,

 

the answer to your question "is it better than your DAX solution?" (slightly rephrased) is simply this: it depends 🙂

To make a decision, it's absolutely necessary to read this article: https://www.sqlbi.com/articles/comparing-dax-calculated-columns-with-power-query-computed-columns/

 

As there are just 2 unique values, "yes" and "no", all the downsides of DAX-based columns can be neglected.

 

On the other hand, here is some M code that does not use the column expansion; instead it uses the M function Table.Join inline. Maybe you want to test it with your larger table:

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45Wyi9KSS2Kz0xR0lEqSEzOzsxLh3BCgg1M42EiKYklqUCxtMyi4hKYoFKsTrSSv4GBIVAiAEIZGRha6hoY6oI5lanFaEqMkJUYYVVijKzEGKsSE7gSI11DSyAnLx9NhSlchbGuoSmKCiOICjN0FUjWQJWYIykxghkSCwA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [#"(blank)" = _t, #"(blank).1" = _t, #"(blank).2" = _t, #"(blank).3" = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"(blank)", type text}, {"(blank).1", type text}, {"(blank).2", type text}, {"(blank).3", type text}}),
    #"Promoted Headers" = Table.PromoteHeaders(#"Changed Type", [PromoteAllScalars=true]),
    #"Changed Type1" = Table.TransformColumnTypes(#"Promoted Headers",{{"order_id", type text}, {"packing_id", type text}, {"TS05_packing_date", type date}, {"first_packing", type text}}),


    #"Grouped Rows" = Table.Group(#"Changed Type1", {"order_id"}, {{"MinDate", each List.Min([TS05_packing_date]), type date}}),
    TableJoin = Table.Join(#"Changed Type1" , "order_id" , #"Grouped Rows" , "order_id"  ),
    
    #"Added Custom" = Table.AddColumn(TableJoin, "checkFirstDate", each if [TS05_packing_date] <= Date.AddDays([MinDate], 8) then "yes" else "no"),
    #"Changed Type2" = Table.TransformColumnTypes(#"Added Custom",{{"TS05_packing_date", Int64.Type}})

in
    #"Changed Type2"

 

Regards,

Tom




Hi @TomMartens ,

I really appreciate the time you dedicated to my request, and thank you for the proposed solution.

Cheers
