I have a source workspace that contains a master semantic model and reports. I want to automate deploying them to multiple destination (tenant) workspaces and updating the data-source schema in each one.
What is the ideal approach to achieve this?
Options Considered
Import PBIX files using the Import API:
POST https://api.powerbi.com/v1.0/myorg/groups/{groupId}/imports?datasetDisplayName={datasetDisplayName}

Then update the data sources using:

POST https://api.powerbi.com/v1.0/myorg/groups/{groupId}/datasets/{datasetId}/Default.UpdateDatasources

(Hypothetical; open to better suggestions.)
Import PBIT files using the same Import API and then update schema similarly.
Copy semantic model and reports directly from the source folder to destination workspaces and change schema in the semantic model (unsure about the API collection for this).
What is the best approach to implement this? Please provide recommended APIs and steps.
Thank you for contacting the Microsoft Fabric Community Forum.
As mentioned by @GilbertQ, the recommended and industry-standard approach is to use a parameterized PBIT deployment together with the Update Parameters API. This is also the Microsoft-endorsed method for managing multi-workspace or multi-tenant Power BI environments.
Create a master Power BI Template (.pbit) that includes your model relationships, measures, and visuals, along with parameters for key data-source elements such as Server, Database, Schema, or Tenant ID. These parameters allow each workspace to dynamically connect to the correct tenant data.
Once the template is ready, use the Power BI REST APIs to automate deployment by importing the .pbit file into each workspace. After that, update the parameter values for each tenant as shown in the snapshot in the linked Microsoft documentation.
This method fully supports Delta Lake and Microsoft Fabric Mirror connections. You can update parameters like SchemaName for each workspace so it connects to the appropriate tenant schema within the mirrored database.
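To make the flow above concrete, here is a minimal Python sketch (standard library only) of the two REST calls involved: the Imports API URL for uploading the .pbit, and the Datasets - Update Parameters In Group call for pointing each workspace at its tenant schema. The group ID, dataset ID, parameter name, and access token are placeholders, not values from this thread.

```python
import json
import urllib.parse
import urllib.request

POWER_BI_BASE = "https://api.powerbi.com/v1.0/myorg"


def import_pbit_url(group_id: str, dataset_display_name: str) -> str:
    """Build the URL for the Imports API (Post Import In Group)."""
    query = urllib.parse.urlencode({"datasetDisplayName": dataset_display_name})
    return f"{POWER_BI_BASE}/groups/{group_id}/imports?{query}"


def update_parameters_body(parameters: dict) -> bytes:
    """Build the JSON body for Datasets - Update Parameters In Group."""
    details = [{"name": name, "newValue": value}
               for name, value in parameters.items()]
    return json.dumps({"updateDetails": details}).encode("utf-8")


def update_parameters(group_id: str, dataset_id: str,
                      parameters: dict, token: str):
    """POST the parameter changes for one tenant workspace (network call)."""
    url = (f"{POWER_BI_BASE}/groups/{group_id}/datasets/{dataset_id}"
           "/Default.UpdateParameters")
    req = urllib.request.Request(
        url,
        data=update_parameters_body(parameters),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    return urllib.request.urlopen(req)  # raises on HTTP errors
```

For each tenant workspace you would first upload the .pbit (a multipart upload to the Imports API URL), wait for the import to finish, then call `update_parameters` with something like `{"SchemaName": "tenant_a"}` so the model connects to that tenant's schema.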
 
In short, parameterized PBIT deployment with automated parameter updates is the most scalable, flexible, and Microsoft-supported solution for multi-tenant semantic model management in Power BI and Fabric.
 
I hope this information is helpful. If you have any further questions, please let us know and we can assist you further.
Regards,
Microsoft Fabric Community Support Team.
 
Hi @ShijuKuchelan 
I wanted to check whether you have had a chance to review the information provided. Has your issue been resolved? If not, please share more details so we can assist you further.
Thank You.
Thank you so much, @GilbertQ . What is the Power BI or industry standard in terms of importing reports and the semantic model?
Create a parameter that is used for your data sources — does this work for a Delta Lake connection?
If I want to experiment with the remaining options, could you provide more details?
That approach could work well: import the PBIX file into the different workspaces rather than updating the data source directly. If you create a parameter for your data sources, you can then update that parameter programmatically in each of your workspaces to point to the data source you require. Here is the API to use: Datasets - Update Parameters In Group - REST API (Power BI REST APIs) | Microsoft Learn
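As a sketch of that per-workspace parameter update, the loop below builds one Update Parameters In Group request per destination workspace. The workspace GUIDs, dataset GUIDs, and schema values are hypothetical placeholders; the `SchemaName` parameter name is just an example of a parameter you might define in the model.

```python
import json

# Hypothetical mapping: destination workspace -> (dataset id, tenant schema).
TENANTS = {
    "workspace-guid-a": ("dataset-guid-a", "tenant_a"),
    "workspace-guid-b": ("dataset-guid-b", "tenant_b"),
}


def requests_for_tenants(tenants: dict) -> list:
    """Build (url, json_body) pairs for Datasets - Update Parameters In Group."""
    calls = []
    for group_id, (dataset_id, schema) in tenants.items():
        url = (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
               f"/datasets/{dataset_id}/Default.UpdateParameters")
        body = {"updateDetails": [{"name": "SchemaName", "newValue": schema}]}
        calls.append((url, json.dumps(body)))
    return calls
```

Each pair would then be POSTed with an `Authorization: Bearer <token>` header; keeping the URL/body construction separate from the HTTP call makes the per-tenant values easy to review before deployment.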