https://learn.microsoft.com/en-us/power-bi/admin/service-admin-premium-multi-geo
To speed up performance for users in other parts of the globe, I see that each Premium capacity can be assigned a different region.
But since each workspace gets one region, does that mean people end up with duplicated workspaces? e.g. an 'East Asia' version of a workspace, then an identical 'South Europe' version? That seems messy, no?
Can a dataset or report be 'duplicated' across workspaces in that case, or how is this handled?
Hi @mmace1
If multi-geo is just used for speed, then yes, you would have to duplicate the data.
If you are using multi-geo for compliance reasons, then you would only have the data relevant to the region it is in.
I have customers accessing data from around the world in a single region, and the time delay is often so small that it is not worth the extra cost of having a report run 0.5-1 second faster.
@GilbertQ If I'm understanding this correctly, the semantic models and report data stored on a Premium/Fabric capacity located in a region different from the tenant's home region should reside in the capacity's region, is that correct? I can't seem to find a definitive answer to this anywhere.
Thanks!
Gotcha, thanks!
Kind of a moot follow-up, but @GilbertQ is there a good way to duplicate the data among workspaces? Or is it just a 'publish this one dataset to all 3 workspaces, then refresh them all individually' situation?
Hi @mmace1
You could use some automation to publish it to the 3 workspaces, or if it is a one-off, just publish the dataset to each of the 3 workspaces and then, as you said, refresh them via the scheduled refresh.
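If you do want to automate it, here is a minimal sketch of one way to do it with the Power BI REST API (the Imports and Datasets endpoints), publishing the same .pbix to each regional workspace and then kicking off an on-demand refresh. The workspace IDs, file name, and access token below are placeholders for your own setup, and token acquisition (e.g. via MSAL with a service principal) is assumed to happen elsewhere:

```python
import time
import requests

API = "https://api.powerbi.com/v1.0/myorg"
ACCESS_TOKEN = "<aad-access-token>"   # assumption: obtained elsewhere, e.g. with MSAL
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Placeholder workspace IDs for the regional copies (e.g. East Asia, South Europe, home region).
WORKSPACE_IDS = ["<workspace-id-1>", "<workspace-id-2>", "<workspace-id-3>"]
PBIX_PATH = "Sales.pbix"              # hypothetical local .pbix file
DATASET_NAME = "Sales"

for ws_id in WORKSPACE_IDS:
    # Publish the .pbix, overwriting any existing dataset with the same name.
    with open(PBIX_PATH, "rb") as f:
        resp = requests.post(
            f"{API}/groups/{ws_id}/imports",
            params={"datasetDisplayName": DATASET_NAME, "nameConflict": "CreateOrOverwrite"},
            headers=HEADERS,
            files={"file": f},
        )
    resp.raise_for_status()
    import_id = resp.json()["id"]

    # The import runs asynchronously, so wait for it to finish before refreshing.
    while True:
        state = requests.get(f"{API}/groups/{ws_id}/imports/{import_id}", headers=HEADERS).json()
        if state["importState"] in ("Succeeded", "Failed"):
            break
        time.sleep(5)

    # Kick off an on-demand refresh of the newly published dataset.
    datasets = requests.get(f"{API}/groups/{ws_id}/datasets", headers=HEADERS).json()["value"]
    dataset_id = next(d["id"] for d in datasets if d["name"] == DATASET_NAME)
    requests.post(f"{API}/groups/{ws_id}/datasets/{dataset_id}/refreshes", headers=HEADERS).raise_for_status()
    print(f"Published and refreshing '{DATASET_NAME}' in workspace {ws_id}")
```

In practice you would run something like this from a pipeline or scheduled job whenever the model changes, and let the normal scheduled refresh in each workspace handle the day-to-day data updates.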