Hi all,
I have been looking into a datetime conversion solution in Power BI reports, for both DirectQuery and Import storage modes.
I'm embedding these reports into my web application with the Embed for Your Customers solution.
My current method [DAX calculation in a measure]:
1. Source: SQL Server.
2. The web application handles 30 time zones (seven abbreviations). Users can change their time zone, and the application screen shows datetimes converted to that zone. It's a feature the client wanted, and I'm trying to bring it to the report level as well.
3. With that in mind, I created a measure that calculates the DST and non-DST ranges for each year from the datetime column passed to it, along with a DST_FLAG, since some regions don't observe DST but fall under the same abbreviation (e.g., Pacific/Honolulu is HST with no DST, while America/Adak observes HST/HDT).
4. I then added this measure to table visuals along with the other information I want to show the user. The measure converts the datetime to the user's time zone; I pass the zone information through a disconnected 'Dim_User_TimeZone' table in DirectQuery mode with RLS (e.g., [USER_ID] = USERNAME()) under a role named 'USER'. Note: if a user changes their time zone in the web application, I can fetch the latest zone information, since this table is in DirectQuery.
5. From the application, generateToken passes the currently logged-in user ID with the role name 'USER'; through the time zone table this gives me the current user's zone name, abbreviation, and dst_flag.
6. Below is the measure I used for both DirectQuery and Import mode reports.
CallDate =
VAR SelectedDate = MIN ( Fact_Call_Track[CALL_DATE] )
RETURN
    IF (
        ISBLANK ( SelectedDate ),
        BLANK (),
        VAR TimeZone = MIN ( Dim_User_TimeZone[Abbreviation] )
        VAR DSTFlag = MIN ( Dim_User_TimeZone[DST_FLAG] )
        // US DST boundaries: second Sunday of March to first Sunday of November, 2:00 AM
        VAR Year_ = YEAR ( SelectedDate )
        VAR MarchFirst = DATE ( Year_, 3, 1 )
        VAR FirstSundayInMarch = MarchFirst + MOD ( 7 - WEEKDAY ( MarchFirst, 2 ), 7 )
        VAR DSTStart = FirstSundayInMarch + 7 + TIME ( 2, 0, 0 ) // +7 days gives the second Sunday of March
        VAR NovFirst = DATE ( Year_, 11, 1 )
        VAR FirstSundayInNov = NovFirst + MOD ( 7 - WEEKDAY ( NovFirst, 2 ), 7 )
        VAR DSTEnd = FirstSundayInNov + TIME ( 2, 0, 0 )
        VAR IsInDST = SelectedDate >= DSTStart && SelectedDate < DSTEnd
        VAR TotalOffsetMinutes =
            SWITCH (
                TimeZone,
                "CST", -360,
                "EST", -300,
                "HST", -600,
                "AKST", -540,
                "PST", -480,
                "MST", -420,
                "IST", 330,
                0
            ) + IF ( DSTFlag = 1 && IsInDST, 60, 0 ) // add one hour inside DST
        VAR AdjustedDate = SelectedDate + ( TotalOffsetMinutes / 1440.0 ) // 1440 minutes per day
        RETURN
            AdjustedDate
    )
The schema of 'Dim_User_TimeZone' looks like this:

| USER_ID | TIME_ZONE_NAME | Abbreviation | DST_FLAG |
|---|---|---|---|
| test_user_1014 | Asia/Kolkata | IST | 0 |
| test_user_2177 | America/Adak | HST | 1 |
| test_user_4459 | America/Phoenix | MST | 0 |
| test_user_547 | Pacific/Honolulu | HST | 0 |
| test_user_7135 | America/Chicago | CST | 1 |
| test_user_924489 | America/Los_Angeles | PST | 1 |
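For reference, the RLS filter on this table is just the DAX filter expression on the 'USER' role already mentioned in step 4; shown here only to make it concrete:

// DAX filter expression on Dim_User_TimeZone for the 'USER' role;
// USERNAME() resolves to the user ID passed in the embed token's effective identity
[USER_ID] = USERNAME()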
The problem:
Since it is a measure, I can't use it in slicers or on the X/Y axis of charts for trend analysis, nor for some crucial datetime-difference calculations relative to the user's time zone, such as day and hour buckets for displaying counts (e.g., showing the count of tasks per hour bucket: 'Overdue', <12 hr, <24 hr, >72 hr).
Second method [parameterized dynamic query]:
1. Created a TZ table with columns ZoneName and TimeZoneId in Power Query, using the Enter Data option. This table contains both IANA database zone names and Windows registry names.
2. Created a parameter named 'Zonename' with the default value 'Asia/Kolkata'.
3. Filtered the TZ table's ZoneName column with that parameter.
4. Applied the changes and, at the data model level, used the Bind to Parameter option on the ZoneName column of the TZ table.
5. Note: TZ is a disconnected table with approx. 460 records mapping IANA names to Windows registry names.
The TZ table looks like this:

| ZoneName | TimeZoneId |
|---|---|
| Etc/GMT+12 | Dateline Standard Time |
| Etc/GMT+11 | UTC-11 |
| America/Adak | Aleutian Standard Time |
| Pacific/Honolulu | Hawaiian Standard Time |
| Pacific/Marquesas | Marquesas Standard Time |
| America/Anchorage | Alaskan Standard Time |
| America/Los_Angeles | Pacific Standard Time |
| America/Boise | Mountain Standard Time |
6. In the web application config payload, I pass a basic filter using the client API (see below):
let config = {
    type: 'report',
    tokenType: tokenType == '0' ? models.TokenType.Aad : models.TokenType.Embed,
    accessToken: accessToken,
    embedUrl: embedUrl,
    id: embedReportId,
    permissions: permissions,
    settings: {
        panes: {
            filters: { visible: true },
            pageNavigation: { visible: true }
        },
        bars: {
            statusBar: { visible: true }
        }
    },
    filters: [
        {
            $schema: "http://powerbi.com/product/schema#basic",
            target: {
                table: "TZ",
                column: "ZoneName"
            },
            operator: "In",
            values: ["America/Adak"]
        }
    ]
};
7. Then I created a DirectQuery table for the datetime fields I want to convert, with the query format below.
Table name: "Converted_Datetime"
let
    Source = Sql.Database(Server, Database, [Query="
        SELECT
            CTRK.CALLTRK_IDN
            ,CTRK.CALL_DT AT TIME ZONE 'UTC' AT TIME ZONE '" & List.First(Table.Column(TZ, "TimeZoneId")) & "' AS CALL_DATE
            ,CTRK.FOLLOW_UP_DT AT TIME ZONE 'UTC' AT TIME ZONE '" & List.First(Table.Column(TZ, "TimeZoneId")) & "' AS NEXT_FOLLOWUP_DATE
        FROM
            CALLTRK CTRK WITH (NOLOCK)
        WHERE
            CTRK.ENTITY_ACTIVE = 'Y'
        ", CreateNavigationProperties=true])
in
    Source
8. This handles the conversion at the SQL level using the AT TIME ZONE function. I then build a relationship between the imported fact table and Converted_Datetime on the CALLTRK_IDN column, with filters flowing from Converted_Datetime to "Fact_Call_Track"; this turns the Import mode report into a mixed (composite) model.
9. For DirectQuery reports, I can pull the zone ID from TZ directly by concatenating the expression (AT TIME ZONE 'UTC' AT TIME ZONE '" & List.First(Table.Column(TZ, "TimeZoneId")) & "') into the respective table's SQL query.
10. The datetime is now converted to the user's time zone at the column level, so I can use it in slicers and in line/bar charts for trend analysis, as well as for datetime-difference calculations relative to the user's time zone.
It may look a bit complex, but I have implemented this approach and tested it successfully with concurrent users, as well as with the same user switching between different timezones. In all cases, the datetime column is correctly converted to the user’s selected timezone.
Note: one of the key requirements is to avoid an on-demand dataset refresh when a user updates their time zone in the web application for an Import mode report. With the measure-based approach, the converted datetime is produced at visual rendering time (for table visuals), so no refresh is required. The second method above is the alternative I explored to meet the same requirement.
I’m looking for guidance or best practices to handle this scenario more cleanly or in a recommended way.
Hello @Nagaraj_D
Try these:
- Store all datetimes in UTC in the source and model.
- Do NOT convert time zones in Power Query for multi-user embedded reports.
- Use DAX-based conversion at visual/query time, using:
  - a disconnected Dim_User_TimeZone table
  - RLS or the embed token (per user)
  - DST logic in measures
- Accept that measures can't be used on slicers/axes.
Hi — you’ve done a solid job exploring the two main patterns. The key thing to know is that per-user time zone conversion (especially with users changing time zones on the fly) can’t be modeled as a true “date column” in Import mode without some trade-off.
Here’s the clean guidance / best practice most of us recommend in this scenario:
1) Keep the model in UTC, do per-user conversion at query/visual time (DAX), not in Power Query
Your measure approach is the correct direction for “no refresh required” and for per-user behavior.
Yes, the limitation is real: a measure can’t be used as a slicer field or axis.
2) Don’t use Dynamic M Parameters for per-user time zone in a shared dataset
Dynamic M parameters are great for global parameterization, but they’re not meant for “each viewer gets their own parameter value”.
If the parameter value changes, it effectively changes the query behavior for the dataset/report, not isolated per viewer (so it’s risky in multi-user embedding).
3) Recommended workaround for slicers/axis: use a “local time dimension” + TREATAS pattern
Instead of trying to put the converted datetime itself on the axis:
Create a disconnected Date/Hour (or DateTime) dimension to use on slicers/axis (Day, Hour, Buckets, etc.)
In measures, compute the user’s offset/DST and then map UTC facts into the selected “local” buckets using TREATAS / virtual columns.
This is the typical approach used for "dynamic timezone by slicer/user" scenarios; a rough sketch follows.
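To make the pattern concrete, here is a minimal DAX sketch. It assumes a disconnected Dim_LocalDate calendar table and an [Offset Minutes] helper measure that wraps the abbreviation/DST logic from your CallDate measure (both names are hypothetical). TREATAS is the cleanest fit when the fact carries a whole-date key column; with a raw UTC datetime column, a FILTER over the shifted values expresses the same idea more directly:

// Disconnected calendar to use on slicers/axes (hypothetical name and range)
Dim_LocalDate = CALENDAR ( DATE ( 2024, 1, 1 ), DATE ( 2026, 12, 31 ) )

// Count fact rows whose LOCAL day falls on the selected axis dates,
// while the fact table itself stays in UTC
Calls by Local Date =
VAR OffsetMinutes = [Offset Minutes] // per-user offset incl. DST (hypothetical helper)
VAR SelectedLocalDays =
    SELECTCOLUMNS ( VALUES ( Dim_LocalDate[Date] ), "Day", INT ( Dim_LocalDate[Date] ) )
RETURN
    COUNTROWS (
        FILTER (
            Fact_Call_Track,
            // shift the UTC timestamp into the user's local time,
            // then check whether its local day is among the selected dates
            VAR LocalDT = Fact_Call_Track[CALL_DATE] + OffsetMinutes / 1440.0
            RETURN INT ( LocalDT ) IN SelectedLocalDays
        )
    )

The same shape works for hour-of-day or bucket axes: put the disconnected values on the axis and resolve membership inside the measure.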
4) For your “hour bucket / overdue / <12hr / <24hr” scenarios
These should remain measures anyway:
Compute “now” in user local time (using the selected timezone offset)
Compute the difference in minutes/hours against the UTC timestamp adjusted to local
Return counts by bucket (the bucket table can be a small disconnected table used on the axis; see the sketch below)
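A rough sketch of those three steps, assuming a small disconnected bucket table built with DATATABLE and the same hypothetical [Offset Minutes] helper (names and thresholds are illustrative):

// Disconnected bucket dimension for the axis (hypothetical)
Age Buckets =
DATATABLE (
    "Bucket", STRING,
    "MinHr", INTEGER,
    "MaxHr", INTEGER,
    {
        { "<12hr", 0, 12 },
        { "<24hr", 12, 24 },
        { "<72hr", 24, 72 },
        { "Overdue", 72, 999999 }
    }
)

// Count tasks whose age, measured in the user's local time, falls in the bucket(s) on the axis
Tasks in Bucket =
VAR OffsetMinutes = [Offset Minutes]
VAR NowLocal = UTCNOW () + OffsetMinutes / 1440.0 // "now" in user local time
VAR MinHr = MIN ( 'Age Buckets'[MinHr] )
VAR MaxHr = MAX ( 'Age Buckets'[MaxHr] )
RETURN
    COUNTROWS (
        FILTER (
            Fact_Call_Track,
            // UTC timestamp adjusted to local, then the age in hours against local "now"
            VAR LocalDT = Fact_Call_Track[CALL_DATE] + OffsetMinutes / 1440.0
            VAR AgeHours = ( NowLocal - LocalDT ) * 24
            RETURN AgeHours >= MinHr && AgeHours < MaxHr
        )
    )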
Bottom line
If your #1 requirement is no refresh + per-user timezone that can change anytime, then DAX-at-render-time + disconnected axis tables is the most correct/maintainable approach.
Your Dynamic M / AT TIME ZONE approach is great for single-user or globally-set timezone, but it’s not the recommended pattern for multi-user embedded reports because parameterization isn’t truly per-user.
Hi @mohit_sakhare ,
Thanks for the detailed information.
I tested the shared dataset using the dynamic parameter approach in an embed scenario and confirmed that the DateTime columns are converted correctly based on the user’s time zone.
I will also try the approach you mentioned using a disconnected table and will share my observations afterward.
Regarding the measure-based approach: the first issue is that I had hardcoded the time zone abbreviations, and I need a more dynamic way of doing this (a rough sketch of what I'm considering follows).
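For example, assuming I add a hypothetical BASE_OFFSET_MINUTES column to Dim_User_TimeZone (e.g., 330 for IST, -360 for CST):

// Hypothetical helper: read the user's base UTC offset from the table
// instead of hardcoding abbreviations in a SWITCH
Base Offset Minutes =
MIN ( Dim_User_TimeZone[BASE_OFFSET_MINUTES] )

The CallDate measure would then use [Base Offset Minutes] plus the DST adjustment in place of the SWITCH block.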
One more thing I forgot to mention earlier: I encountered a row count mismatch in a table visual.
For example, this is the table I expect:

| mbridn | name | addr | caller | calldate [UTC] |
|---|---|---|---|---|
| 101 | ABC | addr1 | caller1 | 2025-10-01 10:30:00 AM |
| 101 | ABC | addr1 | caller1 | 2025-10-02 10:30:00 AM |
| 101 | ABC | addr1 | caller1 | 2025-10-03 10:30:00 AM |
On the main page I show the count as 3; the user drills through from this count to the detail report page, but sees a blank table visual.
If we use the measure for calldate with time zone IST:

| idn | name | addr | caller | calldate [measure] |
|---|---|---|---|---|

Since all column values are the same for that member, the table visual shows up blank; if I instead bring in the UTC column, I can see all three rows.
Currently I have added the distinct idn column to the detail report page's table visual and shrunk it so users can't see it, although it still appears in exports. I'd appreciate it if the community could share thoughts or suggestions on this observation with the measure-based approach: is this behavior expected, and are there recommended ways to handle it more effectively?
Thanks in advance!