https://learn.microsoft.com/en-us/power-bi/admin/service-admin-premium-multi-geo
To speed up performance for users in other parts of the globe, I see one can assign each Premium capacity a different region.
But since each workspace gets one region, does that mean people end up with duplicated workspaces? e.g. an 'East Asia' version of a workspace, then an identical 'South Europe' version? That seems messy, no?
Can a dataset or report be 'duplicated' across workspaces in that case?
Hi @mmace1
If multi-geo is used just for speed, then yes, you would have to duplicate the data.
If you are using multi-geo for compliance reasons, then you would only have the data relevant to the region it is in.
I have customers accessing data from around the world in a single region, and the time delay is often so small that it is not worth the extra cost of having a report run 0.5-1 second faster.
@GilbertQ If I'm understanding this correctly, the semantic models and data for reports stored on a Premium/Fabric capacity located in a region different from the tenant's home region should reside in the capacity's region, is that correct? I can't seem to find a definitive answer to this anywhere.
Thanks!
Gotcha, thanks!
Kind of a moot follow-up, but @GilbertQ is there a good way to duplicate the data among workspaces? Or is it just a 'publish this one dataset to all 3 workspaces, then refresh them all individually' situation?
Hi @mmace1
You could use some automation to publish it to the 3 workspaces, or if it is a one-off, just publish the dataset to each of the 3 workspaces and then, as you said, refresh them via scheduled refresh.
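
For the automation route, here is a minimal sketch against the Power BI REST API from Python. The Imports, Datasets, and Refreshes endpoints are the documented ones, but the access token, workspace GUIDs, file path, and dataset name below are placeholders, so treat this as a starting point rather than a drop-in script:

```python
import time
import requests

# --- All values below are placeholders you would supply yourself ---
ACCESS_TOKEN = "<AAD token with Dataset.ReadWrite.All scope>"
PBIX_PATH = "sales.pbix"          # local PBIX file to publish (hypothetical)
DATASET_NAME = "Sales"            # display name used in every workspace
WORKSPACE_IDS = [                 # one workspace per multi-geo capacity
    "<east-asia-workspace-guid>",
    "<south-europe-workspace-guid>",
    "<home-region-workspace-guid>",
]

BASE = "https://api.powerbi.com/v1.0/myorg"
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

for ws_id in WORKSPACE_IDS:
    # Publish the same PBIX into each workspace; CreateOrOverwrite
    # replaces any existing dataset with the same display name.
    with open(PBIX_PATH, "rb") as f:
        resp = requests.post(
            f"{BASE}/groups/{ws_id}/imports",
            headers=HEADERS,
            params={"datasetDisplayName": DATASET_NAME,
                    "nameConflict": "CreateOrOverwrite"},
            files={"file": f},
        )
    resp.raise_for_status()
    import_id = resp.json()["id"]

    # Imports run asynchronously: poll until publishing finishes.
    while True:
        state = requests.get(f"{BASE}/groups/{ws_id}/imports/{import_id}",
                             headers=HEADERS).json()["importState"]
        if state != "Publishing":
            break
        time.sleep(5)

    # Find the imported dataset and queue a refresh in this workspace.
    datasets = requests.get(f"{BASE}/groups/{ws_id}/datasets",
                            headers=HEADERS).json()["value"]
    ds = next(d for d in datasets if d["name"] == DATASET_NAME)
    requests.post(f"{BASE}/groups/{ws_id}/datasets/{ds['id']}/refreshes",
                  headers=HEADERS).raise_for_status()
    print(f"Workspace {ws_id}: import {state}, refresh queued")
```

Each regional copy still refreshes independently, so you would keep a scheduled refresh configured per workspace as suggested above.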