| OF | phase | Dt_fin | Dt_fin_prev | OTD |
|---|---|---|---|---|
| 178239 | 10 | 15/10/2020 | | |
| 178239 | 12 | | 15/10/2022 | 1 |
| 178239 | 13 | | 16/10/2022 | 1 |
| 151414 | 14 | 18/11/2021 | 18/12/2021 | 1 |
| 151414 | 15 | 15/03/2022 | 15/03/2022 | 1 |
| 151414 | 16 | 20/01/2022 | 18/12/2022 | 0 |
Hi everyone, I'm having difficulty creating the OTD column in Power BI.
Here is the Excel pseudocode my client gave me:
If dt_fin_prev is null
    then OTD = " "
else if dt_fin is empty
    if dt_fin (previous row) is empty and OF = OF (previous row)
        then OTD = OTD (previous row)
else
    if dt_fin_prev >= today
        then OTD = 1
    else
        if dt_fin <= dt_fin_prev
            then OTD = 1 else OTD = 0
Thanks for your time.
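In Power Query terms, the date-comparison branches of those rules translate fairly directly into a custom column. This is a sketch only: column names are taken from the sample table, `PrevStep` is a placeholder for whatever step precedes it, and the "previous row" branch additionally needs an index column, as the answers below show.

```
// Sketch: handles only the rows where no previous-row lookup is needed.
addOtd = Table.AddColumn(PrevStep, "OTD", each
    if [Dt_fin_prev] = null then ""
    else if [Dt_fin_prev] >= Date.From(DateTime.LocalNow()) then 1
    else if [Dt_fin] <= [Dt_fin_prev] then 1
    else 0)
```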
Solved!
Try this code (paste it into a blank query to test):
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("XY1BDsAgCAT/wtlEF8XStxj//w3FtBZ72cDOBFojXMr5pkBIKyIkcmJbqAfP2Sqn8I/nj1fHBQXF+hWIUINYF/dymjIj7y9+Pr1qMcHrzYOsj9gH", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [OF = _t, phase = _t, Dt_fin = _t, Dt_fin_prev = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"OF", Int64.Type}, {"phase", Int64.Type}, {"Dt_fin", type date}, {"Dt_fin_prev", type date}}),
#"Added Index" = Table.AddIndexColumn(#"Changed Type", "Index", 0, 1, Int64.Type),
#"Added Custom" = Table.AddColumn(#"Added Index", "OTDTemp", each if [Dt_fin_prev]=null then ""
else if [Dt_fin_prev]>=Date.From(DateTime.LocalNow()) then 1
else if [Dt_fin]<=[Dt_fin_prev] then 1
else 0),
#"Added Custom1" = Table.AddColumn(#"Added Custom", "OTD", each try if #"Added Custom"[Dt_fin]{[Index]-1}=null and [OF]=#"Added Custom"[OF]{[Index]-1} then #"Added Custom"[OTDTemp]{[Index]-1} else [OTDTemp] otherwise ""),
#"Removed Columns" = Table.RemoveColumns(#"Added Custom1",{"Index", "OTDTemp"})
in
#"Removed Columns"
Thanks for your reply, but there's still one problem: when dt_fin is empty, we should execute this condition:
if dt_fin (previous row) is empty and OF = OF (previous row), then OTD = OTD (previous row)
Try this:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("XY1BDsAgCAT/wtlEF8XStxj//w3FtBZ72cDOBFojXMr5pkBIKyIkcmJbqAfP2Sqn8I/nj1fHBQXF+hWIUINYF/dymjIj7y9+Pr1qMcHrzYOsj9gH", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [OF = _t, phase = _t, Dt_fin = _t, Dt_fin_prev = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"OF", Int64.Type}, {"phase", Int64.Type}, {"Dt_fin", type date}, {"Dt_fin_prev", type date}}),
#"Added Index" = Table.AddIndexColumn(#"Changed Type", "Index", 0, 1, Int64.Type),
#"Added Custom" = Table.AddColumn(#"Added Index", "OTDTemp", each if [Dt_fin_prev]=null then ""
else if [Dt_fin_prev]>=Date.From(DateTime.LocalNow()) then 1
else if [Dt_fin]<=[Dt_fin_prev] then 1
else 0),
#"Added Custom1" = Table.AddColumn(#"Added Custom", "OTD", each try if [Dt_fin]=null and #"Added Custom"[Dt_fin]{[Index]-1}=null and [OF]=#"Added Custom"[OF]{[Index]-1}
then #"Added Custom"[OTDTemp]{[Index]-1}
else [OTDTemp] otherwise ""),
#"Removed Columns" = Table.RemoveColumns(#"Added Custom1",{"Index", "OTDTemp"})
in
#"Removed Columns"
Thank you.
It works in Power Query, but when loading the data (4M rows) into the model it doesn't work: it loads more rows than the actual row count and never finishes.
What is the source of your data?
The source is a SQL Server data warehouse.
Hi @Anonymous ,
Can you try this slightly amended code please?
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("XY3LDcAgDEN3yRmJ2HxKZ0Hdf42SAhLpxZLznpzeBVdjuiUI1KJEaKTSijzh5LTTofDH0+LV84KMbPcvWgQMYhXO4s0y32iaM654s46gRsU2uTeH+bw=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [OF = _t, phase = _t, Dt_fin = _t, Dt_fin_prev = _t]),
chgTypes = Table.TransformColumnTypes(Source,{{"Dt_fin", type date}, {"Dt_fin_prev", type date}}),
addIndex = Table.AddIndexColumn(chgTypes, "Index", 0, 1, Int64.Type),
addOtdTemp = Table.AddColumn(addIndex, "OTDTemp", each if [Dt_fin_prev]=null then ""
else if [Dt_fin_prev]>=Date.From(DateTime.LocalNow()) then 1
else if [Dt_fin]<=[Dt_fin_prev] then 1
else 0),
bufOtdTemp = Table.Buffer(addOtdTemp),
addOtd = Table.AddColumn(addOtdTemp, "OTD", each try if [Dt_fin]=null and bufOtdTemp[Dt_fin]{[Index]-1}=null and [OF]=bufOtdTemp[OF]{[Index]-1}
then bufOtdTemp[OTDTemp]{[Index]-1}
else [OTDTemp] otherwise ""),
remCols = Table.RemoveColumns(addOtd,{"Index", "OTDTemp"})
in
remCols
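An alternative worth trying at this scale, instead of the per-row `{[Index]-1}` positional lookups: bring the previous row's values in once with a self-merge on a shifted index, a common Power Query pattern. This is a sketch, not a tested replacement; step names are illustrative and assume the `addOtdTemp` step above. Note it handles the first row slightly differently from the `try ... otherwise ""` version (the first row falls through to its own OTDTemp instead of ""):

```
// Fragment to replace the addOtd step: each row's PrevIndex (Index - 1)
// is joined back to the table's own Index, so the previous row's OF,
// Dt_fin and OTDTemp arrive via one left outer join.
addPrevIndex = Table.AddIndexColumn(addOtdTemp, "PrevIndex", -1, 1, Int64.Type),
selfMerge    = Table.NestedJoin(addPrevIndex, {"PrevIndex"}, addPrevIndex, {"Index"}, "Prev", JoinKind.LeftOuter),
expandPrev   = Table.ExpandTableColumn(selfMerge, "Prev", {"OF", "Dt_fin", "OTDTemp"}, {"Prev.OF", "Prev.Dt_fin", "Prev.OTDTemp"}),
addOtd       = Table.AddColumn(expandPrev, "OTD", each
    if [Dt_fin] = null and [Prev.Dt_fin] = null and [OF] = [Prev.OF]
    then [Prev.OTDTemp]
    else [OTDTemp]),
// Remember to also drop the helper columns in the final step:
remCols      = Table.RemoveColumns(addOtd, {"Index", "PrevIndex", "Prev.OF", "Prev.Dt_fin", "Prev.OTDTemp", "OTDTemp"})
```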
Pete
Proud to be a Datanaut!
Hi @Anonymous ,
Try this slightly updated code:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("XY3LDcAgDEN3yRmJ2HxKZ0Hdf42SAhLpxZLznpzeBVdjuiUI1KJEaKTSijzh5LTTofDH0+LV84KMbPcvWgQMYhXO4s0y32iaM654s46gRsU2uTeH+bw=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [OF = _t, phase = _t, Dt_fin = _t, Dt_fin_prev = _t]),
chgTypes = Table.TransformColumnTypes(Source,{{"Dt_fin", type date}, {"Dt_fin_prev", type date}}),
addIndex = Table.AddIndexColumn(chgTypes, "Index", 0, 1, Int64.Type),
addOtdTemp = Table.Buffer(Table.AddColumn(addIndex, "OTDTemp", each if [Dt_fin_prev]=null then ""
else if [Dt_fin_prev]>=Date.From(DateTime.LocalNow()) then 1
else if [Dt_fin]<=[Dt_fin_prev] then 1
else 0)),
addOtd = Table.AddColumn(addOtdTemp, "OTD", each try if [Dt_fin]=null and addOtdTemp[Dt_fin]{[Index]-1}=null and [OF]=addOtdTemp[OF]{[Index]-1}
then addOtdTemp[OTDTemp]{[Index]-1}
else [OTDTemp] otherwise ""),
remCols = Table.RemoveColumns(addOtd,{"Index", "OTDTemp"})
in
remCols
The last code should have sped up your query a lot, but may not have removed the multi-load requirement. I'm hoping this version will speed it up a bit more and also *should* remove the multi-load requirement. Even if it doesn't, it should only load twice maximum now.
Pete
Proud to be a Datanaut!
Thanks for your reply, but the same problem is still occurring.
Hi @Anonymous ,
Does your actual query have any merges or appends in it, or any other step that references another query or makes the query reference itself at another step?
Any of these will cause Power Query to reload the table multiple times when applying to the model.
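For illustration, a self-reference of the kind described looks like the fragment below; wrapping the shared step in Table.Buffer is a common (though not guaranteed) way to stop the source being read once per reference. Server, database, and table names are placeholders:

```
let
    Source   = Sql.Database("myServer", "myDW"),               // placeholder connection
    Orders   = Source{[Schema = "dbo", Item = "Orders"]}[Data],
    // Using Orders directly on both sides of a merge gives the engine two
    // evaluation paths back to SQL Server; buffering pins one copy in memory.
    Buffered = Table.Buffer(Orders),
    SelfJoin = Table.NestedJoin(Buffered, {"OF"}, Buffered, {"OF"}, "Same", JoinKind.LeftOuter)
in
    SelfJoin
```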
Pete
Proud to be a Datanaut!
Thank you, but it's none of the above.
@Anonymous ,
This is sounding less like a PQ issue.
Any chance you can copy your entire M code query from Advanced Editor and paste into a code window ( </> button above) please? I'll be able to see what might be causing this issue, or if it's even a PQ issue at all.
Make sure to remove/X-out your server/db connection string in the Source step.
Pete
Proud to be a Datanaut!