Don-Bot
Resolver I

Modify time zone for an "app owns data" Power BI Embedded model

I have an "app owns data" Power BI Embedded semantic model set up. The model itself is an import model built as a star schema. Multiple clients and multiple users across different time zones use the app, which has RLS by client ID.

 

I'm attempting to set it up so that it automatically adjusts the time zone based on the local browser of the user who logs in. However, I am stuck on the first step of simply adjusting the time zones.

 

Below is a screenshot of the model star schema for the fact table and its date dimension. These are joined on a datetime_id, which is a datetime turned into an integer. Example data is at the bottom.

 

DonBot_1-1734366359112.png

 

 

I have created this measure, which lets me create a datetime field on the ticket table that adjusts time zones based on another temp table that holds the time zones I'm testing with. This works if it stays in the fact table and doesn't touch the date dimension.

 

Create DateTime Local = 
VAR DateValue = MAX(Ticket[Create_DateTime_ID])
VAR Year = LEFT(DateValue, 4)
VAR Month = MID(DateValue, 5, 2)
VAR Day = MID(DateValue, 7, 2)
VAR Hour = MID(DateValue, 9, 2)
VAR DateTimeValue = DATE(Year, Month, Day) + TIME(Hour, 0, 0)
VAR AdjustedDateTimeValue = DateTimeValue + 'Time Zones'[timezone offset] / 24
RETURN AdjustedDateTimeValue

 

 I also assume I can clean the above measure up and streamline it but haven't worked on that yet. 
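
For example, a possible streamlined form (a sketch only, not tested against the real model) decodes the yyyymmddhh key with integer arithmetic instead of string slicing, and uses SELECTEDVALUE in place of the bare 'Time Zones'[timezone offset] reference:

Create DateTime Local (v2) =
VAR KeyValue = MAX ( Ticket[Create_DateTime_ID] )          // e.g. 2020032015
VAR HourPart = MOD ( KeyValue, 100 )                       // 15
VAR DatePart = QUOTIENT ( KeyValue, 100 )                  // 20200320
VAR DayPart = MOD ( DatePart, 100 )                        // 20
VAR MonthPart = MOD ( QUOTIENT ( DatePart, 100 ), 100 )    // 3
VAR YearPart = QUOTIENT ( DatePart, 10000 )                // 2020
VAR BaseDateTime = DATE ( YearPart, MonthPart, DayPart ) + TIME ( HourPart, 0, 0 )
VAR Offset = SELECTEDVALUE ( 'Time Zones'[timezone offset], 0 )    // assumes a single offset in the current filter context
RETURN
    BaseDateTime + Offset / 24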

However, if I attempt to do this in the Create DateTime dimension, so that I can adjust time zones for the entire dimension, I get resource errors when I try to pull a simple datetime adjustment into my table along with the ID field from my fact table.

 

I get the resource error with this:

 

AdjustedDateTime = 
'Create DateTime'[Creation DateTime] + [SelectedTimezoneOffset] / 24

 

 

Is there a way to adjust the time zone in the dimension table so that it can be used with the fact table? We have a large number of tickets and some clients are rather huge, but I have gotten this to error out even when I filtered down to just a few.

 

In my small data example both measures work fine; the resource error only appears with larger amounts of data.

 

Example data:  

 

Ticket

Ticket ID | datetime_id | Creation DateTime
A         | 2020032015  | 3/20/2020 15:00
B         | 2020041420  | 4/14/2020 20:00
C         | 2020061914  | 6/19/2020 14:00
D         | 2020100609  | 10/6/2020 9:00

Date Dimension

datetime_id | Creation DateTime
2020032015  | 3/20/2020 15:00
2020041420  | 4/14/2020 20:00
2020061914  | 6/19/2020 14:00
2020100609  | 10/6/2020 9:00
11 REPLIES
SacheeTh
Advocate III

Hi @Don-Bot ,
As @lbendlin said, the standard and efficient approach to handling time zones in Power BI, especially in multi-user, multi-time-zone scenarios, is to store all datetime values in UTC and let the report user's browser handle the time zone conversion. Here's how you can implement this approach:

On the Power BI service, the following adjustments may sometimes help. Handling time zone adjustments for a large dataset in Power BI, especially with a star schema and fact-dimension relationships, can be resource-intensive. Here's a streamlined approach to resolve your issue.


Approach 1: Adjust Timezone in the Fact Table (Preferred for Scalability)

Instead of modifying the date dimension, you can adjust the time zone directly in the fact table or at the reporting layer. This avoids overloading the dimension table with large adjustments.

  1. Enhanced Measure for Time Zone Adjustment: Simplify and optimize your measure to handle large datasets:

    CreateDateTimeLocal =
    VAR DateValue = MAX(Ticket[datetime_id])
    VAR Year = LEFT(DateValue, 4)
    VAR Month = MID(DateValue, 5, 2)
    VAR Day = MID(DateValue, 7, 2)
    VAR Hour = MID(DateValue, 9, 2)
    VAR BaseDateTime = DATE(Year, Month, Day) + TIME(Hour, 0, 0)
    VAR AdjustedDateTime = BaseDateTime + [SelectedTimezoneOffset] / 24
    RETURN AdjustedDateTime
    • SelectedTimezoneOffset can come from a user-provided parameter or an RLS-based user profile (a sketch of one possible definition follows this list).
  2. Leverage Parameters for Time Zone Offsets:

    • Create a parameter or lookup table for time zones (TimeZoneOffsets).
    • Map each client or user to their appropriate time zone.
  3. Fact Table Adjustment:

    • Keep the adjustment in the fact table, which avoids issues with dimension table scalability.
    • Use this measure directly in your visualizations for context-aware calculations.
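
As referenced in step 1, here is a minimal sketch of a disconnected TimeZoneOffsets lookup table and a SelectedTimezoneOffset measure. The names and offsets are illustrative only, not from the original model:

    // Disconnected calculated table (could equally be an Enter Data or Power Query table)
    TimeZoneOffsets =
    DATATABLE (
        "Time Zone", STRING,
        "Offset", DOUBLE,
        { { "UTC", 0 }, { "EST", -5 }, { "CST", -6 }, { "PST", -8 } }
    )

    // Measure read by the adjustment measure in step 1; defaults to UTC when nothing is selected
    SelectedTimezoneOffset = SELECTEDVALUE ( TimeZoneOffsets[Offset], 0 )

Put TimeZoneOffsets[Time Zone] on a slicer (or filter it per client or user), and the measure in step 1 will pick up the chosen offset.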

Approach 2: Adjust Time Zone in the Date Dimension (Less Recommended for High Scale)

If adjusting the Date Dimension is essential (e.g., for consistent reporting), follow these guidelines:

  1. Pre-calculate Time Zone Adjustments:

    • If feasible, adjust the time zones during ETL or pre-load processing rather than in DAX.
    • Add a column for each time zone offset in the Date Dimension (e.g., UTCDateTime, ESTDateTime, etc.).
  2. Optimized DAX for Adjusted DateTime: If DAX adjustment is unavoidable, improve the measure:

    AdjustedDateTime =
    'Create DateTime'[Creation DateTime] + [SelectedTimezoneOffset] / 24
    • Use relationships and aggregations carefully to avoid high cardinality joins.
  3. Model-Specific Considerations:

    • Ensure datetime_id remains unique in the dimension.
    • Limit DAX evaluation scope by filtering down the dataset using CALCULATE or FILTER.

Best Practices to Handle Large Data:

  1. Aggregate Data Where Possible:

    • Summarize the fact table by aggregating time-based metrics (e.g., daily ticket counts) to reduce row counts (a rough sketch follows this list).
  2. Use Direct Query or Incremental Refresh:

    • If data size is a concern, configure DirectQuery for the fact table or enable incremental refresh.
  3. Leverage Dataflows for Preprocessing:

    • Use Power BI Dataflows or upstream ETL tools (e.g., Azure Data Factory) to precompute time zone adjustments.
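
For point 1, a rough sketch of an aggregate calculated table; the table and column names are taken from the example data and may need adjusting:

    DailyTicketCounts =
    SUMMARIZECOLUMNS (
        'Create DateTime'[Creation DateTime],    // ideally a day-grain date column, if one exists
        "Ticket Count", COUNTROWS ( Ticket )
    )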

Testing and Validation:

  • Apply filters in your reports (e.g., ClientID or UserID) to validate the measure against different time zones.
  • Use small datasets for testing and gradually scale up.

If you're still encountering resource errors, focus on optimizing relationships, data cardinality, and aggregation logic. Let me know if you need further assistance!



lbendlin
Super User

The standard approach would be to use UTC datetime values and let the report user's browser do the translation to the current local timezone.  Is that not an option for you?

Hi @lbendlin , I am seeing the same issue. Do you have any examples or documentation on this? I've even reached out to Microsoft and they mentioned it doesn't work the way you stated above. Please let me know if you have some more on this.

@lbendlin as you can see from the continuing chat with @SacheeTh and the steps I've taken, that doesn't appear to work. Is there something in here I'm missing that you can provide further insight on? Power BI doesn't seem to automatically change time zones as you have mentioned.

Do you have any examples of this working? I see you've made this same post in other time zone discussions, yet with no examples and nothing that would work, just a link to RADACAD which doesn't answer this question.

You're absolutely right! The standard and efficient approach to handle time zones in Power BI, especially in multi-user, multi-time-zone scenarios, is to store all datetime values in UTC and let the report user's browser handle the time zone conversion.

@Don-Bot, Here's how you can implement this approach:


Steps to Use UTC Datetime with Browser-Based Time Zone Conversion

  1. Store All Datetime Values in UTC:

    • Ensure your data source stores all datetime values in UTC format.
    • If your data isn't already in UTC, convert it during ETL using your database engine, ETL tool, or Power BI Dataflows.
  2. Enable Browser Time Zone Conversion:

    • Power BI automatically converts datetime fields from UTC to the user's local timezone in the report visuals based on the user's browser settings.

    Note: This conversion happens automatically for datetime columns in reports. However, calculated columns and measures often do not get this treatment unless explicitly configured.

  3. Avoid Manual Adjustments in the Model:

    • Do not apply manual time zone adjustments in DAX or your model. Keep datetime fields in UTC to avoid duplicating time zone logic.
  4. Handle Fixed Offsets (Optional):

    • For reports requiring time zone offsets (e.g., client-specific reports with fixed time zones), you can create a parameterized time zone offset or a Time Zone Dimension Table.

Example Implementation

Adjust ETL to Store UTC

  • If your source data contains local datetime values, convert them to UTC during the data load:
    • SQL Server:

      -- SQL Server 2016+: adjust the time zone name to match the source data's local time zone
      SELECT CONVERT(datetime, (LocalDateTime AT TIME ZONE 'Eastern Standard Time') AT TIME ZONE 'UTC') AS UTCDateTime
      FROM YourTable;
    • Power Query: Use DateTimeZone.ToUtc([LocalDateTime]).


Use Native Browser Conversion

Once datetime values are stored in UTC, ensure the column type in Power BI is set to DateTime. For example:

  • Table: Ticket
  • Column: Creation DateTime (stored in UTC)

Add this column directly to your visuals, and Power BI will automatically translate it to the local timezone based on the user's browser.


Example for Fixed Offset (If Needed)

If specific users or clients require reports in a fixed time zone:

  1. Create a Time Zone Offset Table:

    • Example:

      Time Zone | Offset (Hours)
      UTC       | 0
      EST       | -5
      PST       | -8
  2. Add a Calculated Column or Measure:

    AdjustedDateTime =
    Ticket[Creation DateTime] + SELECTEDVALUE(TimeZoneOffsets[Offset]) / 24
  3. Use the Adjusted Datetime in Visuals:

    • This is necessary only if you must display fixed time zones, not browser-based local time zones.

Advantages of This Approach:

  1. Simplifies the Model:

    • No need to store multiple adjusted datetime values or calculate them dynamically.
    • No resource strain from real-time DAX adjustments.
  2. Automatic Time Zone Detection:

    • End-users see times based on their browser settings without extra configuration.
  3. Scalability:

    • Suitable for large datasets and multi-tenant scenarios.

Let me know if you'd like detailed guidance on any of these steps!

Hi @SacheeTh ,
Can you send the link to where you got these screenshots?

DonBot_0-1734715637050.png

 

Hi @SacheeTh  and @lbendlin ,

Thank you for the very detailed response. It has been extremely helpful, but I believe I am stuck. In your response you mention that I need to enable my browser's time zone conversion.

 

I have taken the following steps with my import model.  

I convert my EST value to UTC:

DateTimeZone.SwitchZone(DateTimeZone.From([#"Creation DateTime - Timezone"]), 0)

I then create a new column and have it converted to local time

DateTimeZone.ToLocal(DateTimeZone.From([UTC]))

I upload that to the service and check. It looks like it converts the local value back to EST just fine. So I change the time zone on my laptop to PST and try again. The time zone stays as EST in the service. I log out of the service, close the browser, and come back in. Still EST. Am I missing something? I even logged in via incognito in Chrome and still had EST for Local. Let me know what I missed...

Below is what I'm seeing.  My laptop's timezone is set to PST.  

My Testing Table (Import).

let
    Source = Sql.Database(Server, Database, [Query="SELECT  [Company_ID]#(lf)      ,[Ticket_ID]#(lf)      ,[Create_DateTime]#(lf)#(tab)  , ([Create_DateTime] AT TIME ZONE 'Eastern Standard Time') AT TIME ZONE 'UTC' AS UTCDateTime#(lf)  FROM  [Analysis].[FACT]#(lf)#(lf)", MultiSubnetFailover=true]),
    #"Duplicated Column" = Table.DuplicateColumn(Source, "UTCDateTime", "UTCDateTime - Copy"),
    #"Changed Type" = Table.TransformColumnTypes(#"Duplicated Column",{{"UTCDateTime - Copy", type datetimezone}}),
    #"Added Custom" = Table.AddColumn(#"Changed Type", "UTC_Local", each DateTimeZone.ToLocal(DateTimeZone.From([#"UTCDateTime - Copy"]))),
    #"Added Custom1" = Table.AddColumn(#"Added Custom", "EST", each DateTimeZone.SwitchZone([UTCDateTime], -4)),
    #"Added Custom2" = Table.AddColumn(#"Added Custom1", "PST", each DateTimeZone.SwitchZone([UTCDateTime], -7))
in
    #"Added Custom2"

What I'm seeing (I expect UTC_Local to be PST not EST).

DonBot_0-1734461956990.png

 

Hi @Don-Bot ,

If you need the UTC_Local column to reflect the local time zone of the device or a specific time zone, here are your options:

Option 1: Explicit Time Zone Conversion

Instead of relying on DateTimeZone.ToLocal, explicitly set the desired time zone conversion using DateTimeZone.SwitchZone. For example:

 

#"Added Custom2" = Table.AddColumn(#"Added Custom1", "PST", each DateTimeZone.SwitchZone([UTCDateTime], -7))

I added the (-7) to switch to PST. This ensures that the PST conversion is consistent regardless of where the report is run.

// I often use this method to get the Local refresh time on service

SacheeTh_0-1734504672417.png

 

Otherwise, you can parameterize the offset as follows:

 

Option 2: Parameterize Time Zone Offsets

Create a parameter for the desired time zone offset and use it in DateTimeZone.SwitchZone. This allows you to dynamically adjust the time zone without modifying the code.

Steps:

  1. Create a parameter in Power Query for the offset (e.g., TimeZoneOffset).
  2. Use the parameter in your SwitchZone formula:
#"Added Custom3" = Table.AddColumn(#"Added Custom2", "Dynamic_Local", each DateTimeZone.SwitchZone([UTCDateTime], TimeZoneOffset))

 

Thank you again @SacheeTh ,
Let me know if I'm understanding you correctly, but from what I see, the only way to set up different time zones in my model is to pre-configure them with extra columns such as "CST, PST, EST"?

 

I've been able to get the parameter approach to work, but the problem is that one model can have users across different time zones, so I can't hard-code a parameter into it. The closest thing I could probably do is add the columns for each time zone, since it doesn't appear to auto-detect or change time zones with any of the methods I've tried above.

Hi @Don-Bot ,
Yes, your understanding is correct. Unfortunately, as far as I know, Power BI does not natively auto-detect or dynamically adjust time zones for individual users in the Power BI Service on the report end. If your model has users across multiple time zones, pre-configuring time zones with extra columns (e.g., "CST", "PST", "EST") is one of the most straightforward approaches.
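
For example, the pre-configured columns could be simple calculated columns on the fact table. This is a sketch only; it assumes a UTC datetime column is already loaded as Ticket[UTCDateTime], and fixed offsets like these ignore daylight saving time:

EST DateTime = Ticket[UTCDateTime] - 5 / 24    // UTC minus 5 hours
CST DateTime = Ticket[UTCDateTime] - 6 / 24
PST DateTime = Ticket[UTCDateTime] - 8 / 24

The same columns can equally be added in Power Query, as you already did with the EST and PST columns in your test query.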

That's the only thing that comes to mind right now, other than user-wise RLS or OLS.

Implement Row-Level Security (RLS)

If each user's time zone can be determined from their identity (e.g., region in a database), you can implement Row-Level Security to filter or adjust time zones dynamically based on user roles.
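
A rough sketch of that idea, with the table, column, and measure names as assumptions rather than anything from your model:

// Hypothetical lookup table UserTimeZones with columns [UserEmail] and [Offset].
// RLS role filter (DAX) on UserTimeZones:
[UserEmail] = USERPRINCIPALNAME()

// Measure that reads the signed-in user's offset once RLS has filtered the table:
UserTimezoneOffset = SELECTEDVALUE ( UserTimeZones[Offset], 0 )

// Adjusted time using that offset (assumes a UTC datetime column on the fact table):
AdjustedDateTimeForUser = MAX ( Ticket[UTCDateTime] ) + [UserTimezoneOffset] / 24

In an "app owns data" embed, USERPRINCIPALNAME() returns the effective identity username supplied in the embed token, so the application can drive the time zone the same way it drives RLS.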

Ah, on second thought, you can set up DAX for that too; I know you have also come across this for sure. Anyway, I'll add the steps here. 😄

Use DAX for Dynamic Calculations

If you need user-specific flexibility, consider calculating offsets in DAX. While this won't detect the user's system time zone, you can provide a slicer or control for users to select their time zone.

  1. Create a Time Zone Table:
    • A simple table mapping time zones to their offsets:

      Time Zone | Offset
      EST       | -5
      CST       | -6
      MST       | -7
      PST       | -8
  2. Create a Slicer:
    • Add a slicer to the report for users to select their time zone.
  3. Create a DAX Measure:
    • Use the selected offset to calculate the adjusted time:

 

AdjustedTime =
VAR SelectedOffset = SELECTEDVALUE ( TimeZone[Offset], 0 )
RETURN
    MAX ( Ticket[UTCDateTime] ) + SelectedOffset / 24    // assumes the UTC datetime column loaded into the fact table

 

  • Display this adjusted time in visuals.
