smpa01
Super User

datetime2 shows timestamp

I have ingested data into a Lakehouse with Dataflow Gen2 (CI/CD). One of the columns is datetime2, and it shows the correct value in the SQL endpoint:

 

[Screenshot: datetime2 value as shown in the SQL endpoint]

 

But it shows a strange value in the Lakehouse:

[Screenshot: the same column as shown in the Lakehouse table preview]

 

Is there a reason for this? How can I have the Lakehouse display the same datetime2 value as the SQL endpoint?

@miguel 

1 ACCEPTED SOLUTION
v-tsaipranay
Community Support

Hi @smpa01 ,

Thanks for reaching out to the Microsoft Fabric Community.

 

Since you're using Dataflow Gen2 to ingest data into the Lakehouse via CI/CD and don’t have access to the Spark runtime, options like setting spark.sql.session.timeZone or using Spark-based formatting aren’t available in your case.

 

The difference you're seeing between the Lakehouse preview UI and the SQL Endpoint comes down to how each one handles timestamps. The SQL Endpoint shows the full UTC precision (datetime2), while the Lakehouse preview UI might show only the time portion or adjust it based on your local timezone settings; this depends on your client environment.
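
To illustrate the kind of timezone adjustment described above, here is a small, purely illustrative Python sketch (the timestamp value and the IST offset are made up); the stored instant is identical in both cases, only the rendering differs:

from datetime import datetime, timezone, timedelta

# Hypothetical stored value, rendered in UTC the way the SQL Endpoint shows it
stored_utc = datetime(2025, 5, 22, 18, 26, 44, tzinfo=timezone.utc)

# The same instant rendered in a local timezone (IST used here as an example)
ist = timezone(timedelta(hours=5, minutes=30))

print(stored_utc.strftime("%Y-%m-%d %H:%M:%S"))                  # 2025-05-22 18:26:44
print(stored_utc.astimezone(ist).strftime("%Y-%m-%d %H:%M:%S"))  # 2025-05-22 23:56:44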

 

To keep things consistent without relying on Spark, you can add a custom column in your Dataflow Gen2 query that formats the timestamp as text, for example:

DateTime.ToText([timestamp_column], [Format = "yyyy-MM-dd HH:mm:ss"])

This approach ensures the full datetime is preserved during ingestion and avoids any display inconsistencies in the UI.

If you need precise values for validation or downstream processes, querying through the SQL Endpoint is your best option.
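
If you want to check the stored values programmatically, a minimal sketch of querying the SQL analytics endpoint from Python is shown below. It assumes pyodbc and the ODBC Driver 18 for SQL Server are installed and that your account can sign in interactively; the server, database, table, and column names are placeholders, not real values.

import pyodbc

# Placeholder connection details: copy the real connection string for the
# SQL analytics endpoint from the Lakehouse settings in your Fabric workspace.
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=your-endpoint.datawarehouse.fabric.microsoft.com;"
    "Database=YourLakehouse;"
    "Authentication=ActiveDirectoryInteractive;"
    "Encrypt=yes;"
)

cursor = conn.cursor()
# Return the raw datetime2 value alongside a fixed-format string (CONVERT style 121)
cursor.execute(
    "SELECT TOP 10 timestamp_column, "
    "CONVERT(varchar(27), timestamp_column, 121) AS timestamp_text "
    "FROM dbo.YourTable"
)
for row in cursor.fetchall():
    print(row.timestamp_column, row.timestamp_text)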

Lastly, as the Lakehouse UI formatting behavior is by design, feel free to submit or upvote an idea here: https://ideas.fabric.microsoft.com

 

Hope this helps. Please reach out for further assistance.

If this post helps, please consider accepting it as the solution to help other members find it more quickly; a kudos would also be appreciated.

 

Thank you.


5 REPLIES
v-tsaipranay
Community Support

Hi @smpa01  ,

 

I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.


Thank you.

nilendraFabric
Super User

Hi @smpa01 

 

It looks like the Lakehouse is converting the UTC time shown by the SQL Endpoint to a different timezone (e.g., IST or another local timezone).

 

Set the Lakehouse Spark session to use UTC so it matches the SQL Endpoint:

spark.conf.set("spark.sql.session.timeZone", "UTC")

 

Use explicit formatting to show both date and time with fractional seconds:

 

from pyspark.sql.functions import date_format

# Render the timestamp as a string with fractional seconds
df.select(
    date_format("timestamp_column", "yyyy-MM-dd HH:mm:ss.SSSSSS").alias("formatted_ts")
).show(truncate=False)

 

smpa01
Super User

I am ingesting data into the Lakehouse using Dataflow Gen2 (CI/CD), so I have no way to change the Spark config.


smpa01
Super User

I keep getting the same values in the Lakehouse

[Screenshot: Lakehouse table preview]

vs the SQL endpoint

[Screenshot: SQL endpoint result for the same column]

 

