
#time converting from function to reference in Dataflow

Using the function #time() in a dataflow seems to result in the mashup engine 'correcting' it to

 

#"#time"()

 

This appears to be a new issue: dataflows using this function did not have the same problem prior to today. I haven't tested Desktop or dataset refreshes yet.

 

Edit: after a bit more testing, this also seems to be occurring with #date and #datetime for me.

Edit 2: Using the following syntax seems to resolve the issue:

 

#time(0,0,0) as time
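The same pattern applies to the other affected constructors mentioned above, i.e. following each intrinsic constructor call with an `as` type assertion so the engine leaves the call untouched. For example:

// Workaround: add an `as` type assertion after each intrinsic constructor
#time(0, 0, 0) as time
#date(2020, 1, 1) as date
#datetime(2020, 1, 1, 0, 0, 0) as datetime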

 

 

Status: New
Comments
Gethrj
Regular Visitor

Many thanks for uploading this, particularly for adding the solution. Exact same issue (using #date) brought down the dataflow of our golden dataset, starting from around 6AM this morning. Your fix worked for us too and saved a lot of headache!

v-lili6-msft
Community Support

Hi @Anonymous,

Thank you for sharing the solution. I'm not sure whether this is a new issue. If possible, could you share the complete steps to reproduce the problem, or reproduce it in Power BI Desktop and share the file with us?

 

Regards,

Lin

Anonymous
Not applicable

Hi @v-lili6-msft 

 

The following query in a dataflow should reproduce the error:

 

 

let
  Source = Table.FromRecords({
    [Time = "01/01/2020 08:00:30"],
    [Time = "01/02/2020 18:30:15"]
  }),
  #"Changed column type" = Table.TransformColumnTypes(Source, {{"Time", type datetime}}),
  #"Added custom" = Table.AddColumn(#"Changed column type", "TestTime1", each #time(18, 22, 15), type time),
  #"Added custom 1" = Table.AddColumn(#"Added custom", "TestTime2", each #time(Time.Hour([Time]), Time.Minute([Time]), 0), type time),
  #"Added custom 2" = Table.AddColumn(#"Added custom 1", "TestDate1", each #date(2020, 1, 1), type date),
  #"Added custom 3" = Table.AddColumn(#"Added custom 2", "TestDate2", each #date(Date.Year([Time]), Date.Month([Time]), 1), type date)
in
  #"Added custom 3"

 

 

 

What you'll see is that once the query is applied, all of the # constructor functions get quotes added around them, i.e. #date() becomes #"#date"().
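Concretely, the engine rewrites a step like the TestDate1 step above into the quoted form. In M, #"#date" is quoted-identifier syntax referring to a variable literally named "#date" (which doesn't exist), not the intrinsic date constructor, which is presumably why the step then errors:

// As authored:
#"Added custom 2" = Table.AddColumn(#"Added custom 1", "TestDate1", each #date(2020, 1, 1), type date),

// As rewritten by the dataflow engine — #"#date" resolves to a
// nonexistent identifier, so the step fails:
#"Added custom 2" = Table.AddColumn(#"Added custom 1", "TestDate1", each #"#date"(2020, 1, 1), type date),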

 

The following syntax will resolve the issue:

 

 

let
  Source = Table.FromRecords({
    [Time = "01/01/2020 08:00:30"],
    [Time = "01/02/2020 18:30:15"]
  }),
  #"Changed column type" = Table.TransformColumnTypes(Source, {{"Time", type datetime}}),
  #"Added custom" = Table.AddColumn(#"Changed column type", "TestTime1", each #time(18, 22, 15) as time, type time),
  #"Added custom 1" = Table.AddColumn(#"Added custom", "TestTime2", each #time(Time.Hour([Time]), Time.Minute([Time]), 0) as time, type time),
  #"Added custom 2" = Table.AddColumn(#"Added custom 1", "TestDate1", each #date(2020, 1, 1) as date, type date),
  #"Added custom 3" = Table.AddColumn(#"Added custom 2", "TestDate2", each #date(Date.Year([Time]), Date.Month([Time]), 1) as date, type date)
in
  #"Added custom 3"

 

 

 

@Gethrj Glad it could help. I've found a few more of our dataflows are affected this morning which implies it is a new issue.

 

ncbshiva
Advocate V

Hi All,

 

I am experiencing the same issue with #date. It works fine in Power BI Desktop, but in dataflows it is automatically converted to #"#date"(). Even if we change it back, it still throws the error.

 



Henrik2
Regular Visitor

I have the same issue and the workaround #date(Y,M,D) as date works for me.

 

Thanks RMG_C29!

AnAnalyst
Helper III

Same issue here. #date(year,m,d) as date seems to fix it. 

v-lili6-msft
Community Support

Hi all,

I have reported this issue internally, ICM: 223040645.

I will update here once I get any information.

 

Regards,

Lin

v-lili6-msft
Community Support

Hi all,

This is a known issue. Here is the update I received from the product group (PG):

"The commit will be included in this week's train, but the train takes a couple of weeks to roll out to all prod regions. I'd say it'll reach all prod regions by the end of next week."

 

 

Regards,

Lin