RobertCressey
Frequent Visitor

DataPipeline - Convert UTC to British Summer Time

In Fabric Data Factory, the Copy Activity automatically converts dates to UTC behind the scenes. However, my database is located in the UK, and during British Summer Time this causes a problem: dates lose one hour when written to the data lake.

 

The challenge is that some columns in my database are labelled as UTC and store UTC datetimes, while others store UK local datetimes. Unfortunately, none of the columns carry a timezone offset in the database, so the only way to tell them apart is by the column headers. As a result, the conversion affects all date fields: the UTC columns end up 1 hour off, and the British-time columns are written as UTC (correctly adjusted for the timezone, just no longer local time).

 

Question: Is there a way to add 1 hour back to all datetimes in the database as they are being loaded by the Copy Activity, so that they match the datetime values in SQL Server once landed in the lakehouse? Alternatively, can I configure the pipeline to use UTC for both the source and the sink, achieving consistent results?

We’ve explored the Copy Activity’s type conversion settings, but haven’t found a format string that does this. Any guidance would be appreciated!
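To illustrate the kind of conversion being asked about, here is a minimal source-query sketch (all table and column names are placeholders, and it assumes a SQL Server source on 2016 or later so that AT TIME ZONE is available). The idea is to hand the Copy Activity unambiguous UTC values by converting only the UK-local columns:

    -- Hypothetical example: CreatedUtc is already UTC; CreatedLocal holds UK local time.
    -- AT TIME ZONE first attaches the UK time zone, then converts to UTC (BST handled automatically).
    SELECT
        OrderId,
        CreatedUtc,
        CAST(CreatedLocal AT TIME ZONE 'GMT Standard Time'
                          AT TIME ZONE 'UTC' AS datetime2) AS CreatedLocalUtc
    FROM dbo.Orders;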

 

 

4 REPLIES
v-cboorla-msft
Community Support

Hi @RobertCressey 

 

Thanks for using the Microsoft Fabric Community.

 

Please refer to the following document, which might help. The convertFromUtc function converts a timestamp from Coordinated Universal Time (UTC) to the specified destination time zone.

Document link: convertFromUtc
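As a rough sketch, the usage in a pipeline dynamic-content expression looks like this (the Lookup activity name and column below are placeholders); 'GMT Standard Time' is the Windows time-zone identifier for the UK and accounts for British Summer Time automatically:

    @convertFromUtc(utcNow(), 'GMT Standard Time')
    @convertFromUtc(activity('LookupTables').output.firstRow.LastModified, 'GMT Standard Time', 'yyyy-MM-dd HH:mm:ss')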

 

If the issue still persists, please do let us know. Glad to help.

 

I hope this information helps.

 

Thanks.

Hi, please can you provide more detail on where to put the transformation? Please note that I am using the pipeline to loop over 330 tables to upload a full database; there are many, many date columns, so specifying each one is out of the question. It needs to work dynamically for any table. All I am doing is a two-step process: a Lookup to find which tables need updating for that batch, then a Copy Data activity.
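One possible pattern, sketched under the assumptions of a SQL Server source (2017 or later, for STRING_AGG) and that the UTC columns can be recognised by 'UTC' in their names as described above: build each table's SELECT list from INFORMATION_SCHEMA.COLUMNS, wrapping only the UK-local datetime columns in a conversion, and pass the generated query to the Copy Data activity's source via dynamic content. All object names here are placeholders:

    -- Hypothetical sketch: generate a source query for one table, converting UK-local
    -- datetime columns to UTC and leaving columns whose names contain 'UTC' untouched.
    DECLARE @schema sysname = N'dbo', @table sysname = N'Orders', @cols nvarchar(max);

    SELECT @cols = STRING_AGG(
        CASE
            WHEN DATA_TYPE IN ('datetime', 'datetime2', 'smalldatetime')
                 AND COLUMN_NAME NOT LIKE '%UTC%'
                THEN 'CAST(' + QUOTENAME(COLUMN_NAME)
                   + ' AT TIME ZONE ''GMT Standard Time'' AT TIME ZONE ''UTC'' AS datetime2) AS '
                   + QUOTENAME(COLUMN_NAME)
            ELSE QUOTENAME(COLUMN_NAME)
        END, ', ')
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_SCHEMA = @schema AND TABLE_NAME = @table;

    SELECT N'SELECT ' + @cols + N' FROM ' + QUOTENAME(@schema) + N'.' + QUOTENAME(@table) AS SourceQuery;

The Lookup activity could return this SourceQuery value per table, and the Copy Data activity's source query could then reference it with dynamic content.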

Hi @RobertCressey 

 

Apologies for the delay in my response, and for the issue that you are facing.

 

The best course of action is to open a support ticket so our support team can take a closer look and understand your scenario better. Based on your requirements, they can provide a solution.

 

Please reach out to our support team so they can do a more thorough investigation of why this is happening: Link.

After creating a support ticket, please share the ticket number with us, as it will help us track the request.

 

Hope this helps. Please let us know if you have any other queries.

 

Thank you.

Hi @RobertCressey 

 

We haven’t heard from you since the last response and were just checking back to see if you've had a chance to submit a support ticket. If you have, a reference to the ticket number would be greatly appreciated. This will allow us to track the progress of your request and ensure you receive the most efficient support possible.

 

Thanks.
