Hello,
I've been stuck on this issue for a little while. I want to add an ETL_upload_datetime column in my dataflow, so I created a new column with the formula DateTime.LocalNow() and set its type to datetime. In the dataflow editor everything refreshes fine, but when I refresh the dataflow after closing the editor, it throws an error like the one below:
We cannot convert the value #{0} to type #{1}., Underlying error: We cannot convert the value "2025-02-17 03:43:13...." to type DateTime. Details: Reason = Expression.Error;
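For reference, this is roughly what the step looks like in my query (step names are simplified here, and "Source" just stands for whatever the previous step is):

#"Added ETL_upload_datetime" = Table.AddColumn(Source, "ETL_upload_datetime", each DateTime.LocalNow()),
// then the type change that seems to trigger the error on refresh
#"Changed type" = Table.TransformColumnTypes(#"Added ETL_upload_datetime", {{"ETL_upload_datetime", type datetime}})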
Can someone help me understand why I'm getting this error and how I can fix it?
Thanks,
Tianyue
Hi @yangty ,
We are following up once again regarding your query. Could you please confirm if the issue has been resolved through the support ticket with Microsoft?
If the issue has been resolved, we kindly request you to share the resolution or key insights here to help others in the community. If we don’t hear back, we’ll go ahead and close this thread.
Should you need further assistance in the future, we encourage you to reach out via the Microsoft Fabric Community Forum and create a new thread. We’ll be happy to help.
Thank you for your understanding and participation.
Hello @yangty ,
As we haven't heard back from you, we wanted to kindly follow up and check whether the solution we provided worked for you. Please let us know if you need any further assistance.
Your feedback is important to us, and we look forward to your response.
Thank You.
Hi @yangty ,
We wanted to follow up since we haven't heard back from you. Did the solution we provided help resolve your issue? If not, please let us know if you need any further assistance.
If the issue is resolved, kindly mark it as the accepted solution. This helps others in the community find the answer more easily. Your feedback is important to us, and we're looking forward to your response.
Best regards,
Thank You.
Hello, thank you for following up. I didn't resolve the issue. It feels like a bug in Fabric, as I wasn't doing anything fancy; all I did was add a datetime column using the DateTime.LocalNow() formula.
Hello @yangty ,
Thank you for your response. As this issue appears to be a potential bug, I suggest raising a Microsoft Fabric support ticket for further investigation.
To submit a support ticket for Fabric and Power BI, please follow the steps outlined in this guide:
How to create a Fabric and Power BI Support ticket - Power BI | Microsoft Learn
If my response resolved your query, kindly mark it as the Accepted Solution to assist others. Additionally, I would be grateful for a 'Kudos' if you found my response helpful.
Hi @yangty ,
Thanks for contacting the Microsoft Fabric Community. We were able to reproduce your issue. The error occurs because Power Query treats DateTime.LocalNow() differently in the editor versus when refreshing the Dataflow.
1. Add the ETL_upload_datetime column with this formula:
Formula: DateTime.LocalNow()
This will add the current timestamp as a DateTime value.
2. However, if this doesn't work as expected after refreshing, try changing the formula to:
Formula: DateTime.From(DateTime.LocalNow())
Both formulas should work. Could you please try these and let us know if you need any further assistance? We are ready to help.
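For reference, here is a minimal sketch of what the two options look like as full steps (the step name and "PreviousStep" are placeholders, not names from your dataflow):

// Option 1: add the timestamp directly and type the column as datetime
#"Added ETL_upload_datetime" = Table.AddColumn(PreviousStep, "ETL_upload_datetime", each DateTime.LocalNow(), type datetime)

// Option 2: wrap the value in DateTime.From before the column is typed
#"Added ETL_upload_datetime" = Table.AddColumn(PreviousStep, "ETL_upload_datetime", each DateTime.From(DateTime.LocalNow()), type datetime)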
If my response solved your query, please mark it as the Accepted solution to help others find it easily.
And if my answer was helpful, I'd really appreciate a 'Kudo'.
Thank you for responding to my question. I have tried both formulas but received the same error message when refreshing the dataflow. One thing I noticed is that you don't have the 7-digit second precision in your screenshot, but my datetime value is returned as "2025-02-18 00:43:29.9836282". Any thoughts on that? Could that be the cause? Thanks!
Hi @yangty ,
Thank you for your feedback. We have not identified any issues caused by this precision. I successfully loaded the data into a test table on my Lakehouse.
M Code:
let
    // Sample two-column table generated by the editor (compressed JSON literal)
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlTSUUpJLE5RitWJVjICchJTihMhPGMgLz0jPS0DzDMB8orTUoqLwDxTIC87KwsorRQbCwA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Column1 = _t, Column2 = _t]),
    #"Changed column type" = Table.TransformColumnTypes(Source, {{"Column1", Int64.Type}, {"Column2", type text}}),
    // Add the timestamp column, then explicitly set its type to datetime
    #"Added custom" = Table.AddColumn(#"Changed column type", "Datetime", each DateTime.LocalNow()),
    #"Changed column type 1" = Table.TransformColumnTypes(#"Added custom", {{"Datetime", type datetime}})
in
    #"Changed column type 1"
Note: Try adding this as the final step in your own query (replace laststep with the name of the step before it):
#"Changed column type 1" = Table.TransformColumnTypes(laststep, {{"Datetime", type datetime}})
If my response solved your query, please mark it as the Accepted solution to help others find it easily.
And if my answer was helpful, I'd really appreciate a 'Kudos'.
Thank you for your response. The first formula is the one I used before, and it didn't work; I tried the second method but still get the same error message. Could the issue be that I'm loading the data into a Data Warehouse table? In the DW table I used the datetime data type.
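One idea I'm considering trying (untested, and only a guess based on the precision I mentioned earlier) is truncating the value to whole seconds before the type change, in case the Warehouse datetime column is rejecting the 7-digit fractional seconds. "PreviousStep" below is just a placeholder for the step before it:

#"Added ETL_upload_datetime" = Table.AddColumn(
    PreviousStep,
    "ETL_upload_datetime",
    each
        let
            now = DateTime.LocalNow()
        in
            // rebuild the value from its parts, dropping the fractional seconds
            #datetime(
                Date.Year(now), Date.Month(now), Date.Day(now),
                Time.Hour(now), Time.Minute(now), Number.RoundDown(Time.Second(now))
            ),
    type datetime
)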