
Anonymous
Not applicable

OTD_column

| OF     | phase | Dt_fin     | Dt_fin_prev | OTD |
|--------|-------|------------|-------------|-----|
| 178239 | 10    | 15/10/2020 |             |     |
| 178239 | 12    |            | 15/10/2022  | 1   |
| 178239 | 13    |            | 16/10/2022  | 1   |
| 151414 | 14    | 18/11/2021 | 18/12/2021  | 1   |
| 151414 | 15    | 15/03/2022 | 15/03/2022  | 1   |
| 151414 | 16    | 20/01/2022 | 18/12/2022  | 0   |

Hi everyone, I am having difficulty creating the OTD column in Power BI.

Here is the Excel pseudocode my client gave me:

 

If dt_fin_prev is null
    then OTD = " "
else if dt_fin is empty
    if dt_fin (previous cell) is empty and OF = OF (previous cell)
        then OTD = OTD (previous cell)
else
    if dt_fin_prev >= today
        then OTD = 1
    else
        if dt_fin <= dt_fin_prev
            then OTD = 1 else OTD = 0
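One way to read these rules can be sketched as plain Python, applying the date-based rules first and the previous-cell carry-forward second. This is an illustrative translation only, not Power Query code; `compute_otd` and the dict keys are made-up names, and `today` is passed in explicitly to keep the function deterministic:

```python
from datetime import date

def compute_otd(rows, today):
    """rows: list of dicts with keys 'of', 'dt_fin', 'dt_fin_prev'
    ('dt_fin'/'dt_fin_prev' are datetime.date or None), in source order.
    Returns one OTD value per row."""
    # Pass 1: the date-based rules.
    temp = []
    for r in rows:
        if r["dt_fin_prev"] is None:
            temp.append("")                      # no planned date -> blank
        elif r["dt_fin_prev"] >= today:
            temp.append(1)                       # still on time as of today
        elif r["dt_fin"] is not None and r["dt_fin"] <= r["dt_fin_prev"]:
            temp.append(1)                       # finished on or before plan
        else:
            temp.append(0)                       # finished late
    # Pass 2: carry the previous row's value forward when dt_fin is empty
    # on both rows and the OF matches (the previous-cell rule).
    otd = []
    for i, r in enumerate(rows):
        if (i > 0 and r["dt_fin"] is None
                and rows[i - 1]["dt_fin"] is None
                and r["of"] == rows[i - 1]["of"]):
            otd.append(temp[i - 1])
        else:
            otd.append(temp[i])
    return otd
```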

 

thanks for your time.

 

1 ACCEPTED SOLUTION

Hi @Anonymous ,

 

Try this slightly updated code:

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("XY3LDcAgDEN3yRmJ2HxKZ0Hdf42SAhLpxZLznpzeBVdjuiUI1KJEaKTSijzh5LTTofDH0+LV84KMbPcvWgQMYhXO4s0y32iaM654s46gRsU2uTeH+bw=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [OF = _t, phase = _t, Dt_fin = _t, Dt_fin_prev = _t]),
    chgTypes = Table.TransformColumnTypes(Source,{{"Dt_fin", type date}, {"Dt_fin_prev", type date}}),
    addIndex = Table.AddIndexColumn(chgTypes, "Index", 0, 1, Int64.Type),
    addOtdTemp = Table.Buffer(Table.AddColumn(addIndex, "OTDTemp", each if [Dt_fin_prev]=null then ""
    else if [Dt_fin_prev]>=Date.From(DateTime.LocalNow()) then 1
    else if [Dt_fin]<=[Dt_fin_prev] then 1 
    else 0)),
    addOtd = Table.AddColumn(addOtdTemp, "OTD", each try if [Dt_fin]=null and addOtdTemp[Dt_fin]{[Index]-1}=null and [OF]=addOtdTemp[OF]{[Index]-1} 
then addOtdTemp[OTDTemp]{[Index]-1} 
else [OTDTemp] otherwise ""),
    remCols = Table.RemoveColumns(addOtd,{"Index", "OTDTemp"})
in
    remCols

 

The last code should have sped up your query a lot, but may not have removed the multi-load requirement. I'm hoping this version will speed it up a bit more and also *should* remove the multi-load requirement. Even if it doesn't, it should only load twice maximum now.

 

Pete



Now accepting Kudos! If my post helped you, why not give it a thumbs-up?

Proud to be a Datanaut!





13 REPLIES
Vijay_A_Verma
Super User

Try this code (paste in a blank query to test)

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("XY1BDsAgCAT/wtlEF8XStxj//w3FtBZ72cDOBFojXMr5pkBIKyIkcmJbqAfP2Sqn8I/nj1fHBQXF+hWIUINYF/dymjIj7y9+Pr1qMcHrzYOsj9gH", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [OF = _t, phase = _t, Dt_fin = _t, Dt_fin_prev = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"OF", Int64.Type}, {"phase", Int64.Type}, {"Dt_fin", type date}, {"Dt_fin_prev", type date}}),
    #"Added Index" = Table.AddIndexColumn(#"Changed Type", "Index", 0, 1, Int64.Type),
    #"Added Custom" = Table.AddColumn(#"Added Index", "OTDTemp", each if [Dt_fin_prev]=null then ""
    else if [Dt_fin_prev]>=Date.From(DateTime.LocalNow()) then 1
    else if [Dt_fin]<=[Dt_fin_prev] then 1 
    else 0),
    #"Added Custom1" = Table.AddColumn(#"Added Custom", "OTD", each try if #"Added Custom"[Dt_fin]{[Index]-1}=null and [OF]=#"Added Custom"[OF]{[Index]-1} then #"Added Custom"[OTDTemp]{[Index]-1} else [OTDTemp] otherwise ""),
    #"Removed Columns" = Table.RemoveColumns(#"Added Custom1",{"Index", "OTDTemp"})
in
    #"Removed Columns"
Anonymous
Not applicable

Thanks for your reply, but there is still one problem: when dt_fin is empty, we should execute this condition:

if dt_fin (previous cell) is empty and OF = OF (previous cell), then OTD = OTD (previous cell)

 

Try this

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("XY1BDsAgCAT/wtlEF8XStxj//w3FtBZ72cDOBFojXMr5pkBIKyIkcmJbqAfP2Sqn8I/nj1fHBQXF+hWIUINYF/dymjIj7y9+Pr1qMcHrzYOsj9gH", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [OF = _t, phase = _t, Dt_fin = _t, Dt_fin_prev = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"OF", Int64.Type}, {"phase", Int64.Type}, {"Dt_fin", type date}, {"Dt_fin_prev", type date}}),
    #"Added Index" = Table.AddIndexColumn(#"Changed Type", "Index", 0, 1, Int64.Type),
    #"Added Custom" = Table.AddColumn(#"Added Index", "OTDTemp", each if [Dt_fin_prev]=null then ""
    else if [Dt_fin_prev]>=Date.From(DateTime.LocalNow()) then 1
    else if [Dt_fin]<=[Dt_fin_prev] then 1 
    else 0),
    #"Added Custom1" = Table.AddColumn(#"Added Custom", "OTD", each try if [Dt_fin]=null and #"Added Custom"[Dt_fin]{[Index]-1}=null and [OF]=#"Added Custom"[OF]{[Index]-1} 
then #"Added Custom"[OTDTemp]{[Index]-1} 
else [OTDTemp] otherwise ""),
    #"Removed Columns" = Table.RemoveColumns(#"Added Custom1",{"Index", "OTDTemp"})
in
    #"Removed Columns"
Anonymous
Not applicable

Thank you.

In Power Query it works, but when loading the data (4M rows) it doesn't: it loads more rows than the actual number of rows, and it doesn't stop.

 

What is the source of your data? 

Anonymous
Not applicable

The source is a SQL Server data warehouse.

Hi @Anonymous ,

 

Can you try this slightly amended code please?

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("XY3LDcAgDEN3yRmJ2HxKZ0Hdf42SAhLpxZLznpzeBVdjuiUI1KJEaKTSijzh5LTTofDH0+LV84KMbPcvWgQMYhXO4s0y32iaM654s46gRsU2uTeH+bw=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [OF = _t, phase = _t, Dt_fin = _t, Dt_fin_prev = _t]),
    chgTypes = Table.TransformColumnTypes(Source,{{"Dt_fin", type date}, {"Dt_fin_prev", type date}}),
    addIndex = Table.AddIndexColumn(chgTypes, "Index", 0, 1, Int64.Type),
    addOtdTemp = Table.AddColumn(addIndex, "OTDTemp", each if [Dt_fin_prev]=null then ""
    else if [Dt_fin_prev]>=Date.From(DateTime.LocalNow()) then 1
    else if [Dt_fin]<=[Dt_fin_prev] then 1 
    else 0),
    bufOtdTemp = Table.Buffer(addOtdTemp),
    addOtd = Table.AddColumn(addOtdTemp, "OTD", each try if [Dt_fin]=null and bufOtdTemp[Dt_fin]{[Index]-1}=null and [OF]=bufOtdTemp[OF]{[Index]-1} 
then bufOtdTemp[OTDTemp]{[Index]-1} 
else [OTDTemp] otherwise ""),
    remCols = Table.RemoveColumns(addOtd,{"Index", "OTDTemp"})
in
    remCols

 

Pete







Anonymous
Not applicable

Hi @BA_Pete,

It doesn't work.

Hi @Anonymous ,

 

Try this slightly updated code:

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("XY3LDcAgDEN3yRmJ2HxKZ0Hdf42SAhLpxZLznpzeBVdjuiUI1KJEaKTSijzh5LTTofDH0+LV84KMbPcvWgQMYhXO4s0y32iaM654s46gRsU2uTeH+bw=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [OF = _t, phase = _t, Dt_fin = _t, Dt_fin_prev = _t]),
    chgTypes = Table.TransformColumnTypes(Source,{{"Dt_fin", type date}, {"Dt_fin_prev", type date}}),
    addIndex = Table.AddIndexColumn(chgTypes, "Index", 0, 1, Int64.Type),
    addOtdTemp = Table.Buffer(Table.AddColumn(addIndex, "OTDTemp", each if [Dt_fin_prev]=null then ""
    else if [Dt_fin_prev]>=Date.From(DateTime.LocalNow()) then 1
    else if [Dt_fin]<=[Dt_fin_prev] then 1 
    else 0)),
    addOtd = Table.AddColumn(addOtdTemp, "OTD", each try if [Dt_fin]=null and addOtdTemp[Dt_fin]{[Index]-1}=null and [OF]=addOtdTemp[OF]{[Index]-1} 
then addOtdTemp[OTDTemp]{[Index]-1} 
else [OTDTemp] otherwise ""),
    remCols = Table.RemoveColumns(addOtd,{"Index", "OTDTemp"})
in
    remCols

 

The last code should have sped up your query a lot, but may not have removed the multi-load requirement. I'm hoping this version will speed it up a bit more and also *should* remove the multi-load requirement. Even if it doesn't, it should only load twice maximum now.
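The effect of Table.Buffer here can be illustrated outside Power Query. Without materialising the intermediate table once, every per-row "previous row" lookup can re-evaluate the producing step; buffering runs it once and indexes the cached result. A rough Python analogy (all names made up; not the M evaluation model itself):

```python
# Rough analogy for Table.Buffer: materialise a lazily produced table once,
# instead of re-evaluating the producing step for every per-row lookup.
scans = {"count": 0}

def produce_rows():
    # Stand-in for an upstream query step (e.g. the OTDTemp step).
    scans["count"] += 1
    return [{"of": i, "otd_temp": i % 2} for i in range(5)]

# Unbuffered: each "previous row" lookup re-runs the producing step.
unbuffered = [produce_rows()[i - 1]["otd_temp"] for i in range(1, 5)]
n_unbuffered = scans["count"]          # one scan per lookup

# Buffered: run the step once, then index into the cached list.
scans["count"] = 0
buffered_table = produce_rows()        # like Table.Buffer(...)
buffered = [buffered_table[i - 1]["otd_temp"] for i in range(1, 5)]
n_buffered = scans["count"]            # a single scan
```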

 

Pete







Anonymous
Not applicable

Thanks for your reply, but there is still the same problem.

Hi @Anonymous ,

 

Does your actual query have any merges or appends in it, or any other step that references another query or makes the query reference itself at another step?

Any of these will cause Power Query to reload the table multiple times when applying to the model.

 

Pete







Anonymous
Not applicable

Thank you, but it's none of the above

@Anonymous ,

 

This is sounding less like a PQ issue.

Any chance you can copy your entire M code query from Advanced Editor and paste into a code window ( </> button above) please? I'll be able to see what might be causing this issue, or if it's even a PQ issue at all.

Make sure to remove/X-out your server/db connection string in the Source step.

 

Pete






