KadenB
Frequent Visitor

Converting UTC to Local Time Without Adding New Columns or Using DAX

Hello all.

We have the following issue:

Our dynamics instance is set to CEST time, and we export the database to one in azure, where time is UTC.

 

(We found this and this, but neither solves our problem.)

 

We would rather not do these transformations in DAX, since that leaves duplicate calculated columns in the model.

 

Currently, we use a custom function to convert our dates to the appropriate time zone. However, several date columns need this conversion, and duplicating each of them adds complexity to tables that are already quite large.

 

Is there any possible way to convert the date/time in place, or in a smarter way?

1 ACCEPTED SOLUTION
Gazzer
Resolver II

It's a little 'clunky', but you could simply remove the donor columns after creating the duplicated/converted columns, so you end up with the same number of columns you started with. You could also rename each new column to its donor column's name after removing it, so your column names stay constant.
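As a hedged sketch of that pattern in Power Query M — assuming a UTC column named `CreatedOn` and a fixed +2 hour CEST offset, both of which are placeholders for your own column names and offset:

```powerquery-m
let
    // Sample source table; in practice this would be your Dynamics export from Azure.
    Source = #table(
        type table [CreatedOn = datetime],
        {{#datetime(2024, 6, 1, 10, 0, 0)}}
    ),
    // 1. Add a converted column: tag the value as UTC, shift it to +02:00 (CEST),
    //    then strip the zone again so the column stays a plain datetime.
    Added = Table.AddColumn(
        Source,
        "CreatedOnLocal",
        each DateTimeZone.RemoveZone(
            DateTimeZone.SwitchZone(DateTime.AddZone([CreatedOn], 0), 2)),
        type datetime
    ),
    // 2. Remove the donor column.
    Removed = Table.RemoveColumns(Added, {"CreatedOn"}),
    // 3. Rename the new column back to the donor's name.
    Renamed = Table.RenameColumns(Removed, {{"CreatedOnLocal", "CreatedOn"}})
in
    Renamed
```

Note that a fixed offset ignores daylight-saving transitions (CEST applies only in summer); a DST-aware lookup would be needed for year-round correctness.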


3 REPLIES

KadenB
Frequent Visitor

@Gazzer Thank you for your response. However, we are curious: would deleting the original date columns at the end of the query still leave the complexity issue, since the query would still need to calculate all of the duplicate date columns even if they are removed at the end?

The reason we would like to avoid this in general is that we have tables with 2-3 million rows and already 10-15 columns, so refreshing our data takes a long time, and we are similarly worried about the memory impact of calculating several custom columns.

 

Kind Regards, 

 

Kaden 

 

 

 

Gazzer
Resolver II

I'm sorry, but I don't know what kind of memory impact you will experience here - I can only suggest trying it and measuring the effect.
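For what it's worth, Power Query can also rewrite a column in place with `Table.TransformColumns`, which avoids creating the duplicate columns at all — a sketch under the same assumptions as above (a UTC column named `CreatedOn` and a fixed +2 hour offset, both placeholders):

```powerquery-m
let
    // Sample source table standing in for the real Azure export.
    Source = #table(
        type table [CreatedOn = datetime],
        {{#datetime(2024, 6, 1, 10, 0, 0)}}
    ),
    // Convert CreatedOn in place: tag as UTC, shift to +02:00, drop the zone.
    // Each extra {column, transform, type} triple handles another date column.
    Converted = Table.TransformColumns(
        Source,
        {{"CreatedOn",
          each DateTimeZone.RemoveZone(
              DateTimeZone.SwitchZone(DateTime.AddZone(_, 0), 2)),
          type datetime}}
    )
in
    Converted
```

Since no extra column is ever materialized, this sidesteps both the column duplication and the remove/rename steps, though the same daylight-saving caveat about a fixed offset applies.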
