I've set up a data pipeline that copies data from our Oracle database to a Fabric lakehouse. However, when using .parquet as the sink format, I can see times being shifted (likely due to daylight saving time), e.g. 2024-03-29 00:00 becomes 2024-03-28 22:00.
When I use a .json extension, the times stay the same as in my source.
How can I tell my pipeline not to change the times when using .parquet as a sink?
As I'm copying all tables dynamically, there is no way for me to set up a mapping.
Hi @Noeleke1301
Thanks for using Microsoft Fabric Community.
At this time, we are reaching out to the internal team to get some help on this.
We will update you once we hear back from them.
Appreciate your patience.
Thanks
Hi,
I think I figured out the problem. My source system doesn't include any time zone information in its datetime fields. I've read that Parquet assigns a time zone to the value if the source doesn't supply one.
Since the Fabric capacity is probably in another time zone, it changes the default time zone of my .parquet files.
I had to set the correct time zone in my Spark notebook to correct this. More info on:
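For anyone hitting the same issue, here is a minimal sketch of that fix, not my exact notebook: "Europe/Brussels" and the file path are placeholders, so swap in your own source time zone and location.

```python
# Minimal sketch: align Spark's session time zone with the source system's
# zone. "Europe/Brussels" and the path below are assumptions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Spark interprets zone-less timestamps in the session time zone and
# normalizes them to UTC in Parquet. If the capacity's default zone
# differs from the source zone, values appear shifted
# (e.g. 2024-03-29 00:00 -> 2024-03-28 22:00).
spark.conf.set("spark.sql.session.timeZone", "Europe/Brussels")

df = spark.read.parquet("Files/landing/my_table")  # hypothetical path
df.show()  # timestamps now render in the configured zone
```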
Hi @Noeleke1301
Glad you were able to find the cause, and thank you for sharing it with the community, as it can help others.
Please continue using Fabric Community for further queries.
Thanks