SuperFiets_
Helper I

Deploying a master semantic model to multiple (100+) workspaces and editing a parameter

Hello everyone,

 

In a Power BI workspace with Git integration set up, we have a "master" dataset that we update constantly and that is also used as the basis for client-specific datasets (datasets, not reports!). This master dataset has a parameter called 'clientName' that is unique for every client.

 

Our goal is to do the following:

1. Deploy the master dataset to multiple (more than 100) workspaces.

2. Set the 'clientName' parameter accordingly when deploying the master dataset to each workspace (clientName parameter = workspace name).

 

Deployment pipelines would have been a solution for this exact issue, but for some reason a workspace can only be assigned to one pipeline... Otherwise we would simply have automated creating 100 deployment pipelines from master -> client. But once the master workspace is used in one pipeline, it is (as far as I can see) not possible to reuse it in another.

 

Could someone help me figure out how to achieve this? I have a few options, but I need some guidance on which solution is best and actually workable, since I do not have much experience with any of the solutions I found:

 

1. Use a PowerShell script that clones the master dataset to each workspace and changes the parameter with the Power BI REST API.

2. Use Azure DevOps pipelines; I'm not really familiar with how to write these pipelines in Azure DevOps.

3. Use a C# script in Tabular Editor 2 to deploy the model. Since I do not have much experience with C#, this is kind of hard to grasp.
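For context on option 1, the parameter change can be done with the Power BI REST "Datasets - Update Parameters In Group" endpoint. A minimal sketch, assuming the dataset has already been cloned into the target workspace (the IDs and the workspace name below are placeholders):

```powershell
# Sketch of option 1's parameter step. IDs are placeholders.
Connect-PowerBIServiceAccount

$workspaceId = "<target-workspace-id>"
$datasetId   = "<cloned-dataset-id>"

# Request body for the Update Parameters In Group endpoint
$body = @{
    updateDetails = @(
        @{ name = "clientName"; newValue = "<workspace name>" }
    )
} | ConvertTo-Json -Depth 3

Invoke-PowerBIRestMethod -Method Post `
    -Url "groups/$workspaceId/datasets/$datasetId/Default.UpdateParameters" `
    -Body $body
```

Note that a changed parameter value only takes effect in the data after the dataset's next refresh.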

 

Has anyone faced a similar issue? Any help would be so much appreciated!

1 ACCEPTED SOLUTION
v-jianpeng-msft
Community Support

Thank you Greg_Deckler 

Hi, @SuperFiets_ 

We can use PowerShell commands to deploy the semantic model into multiple workspaces, and, as you described, after deployment you can use the Power BI REST API to change the parameters.
In the following, I'll walk you through deploying a PBIX file into multiple workspaces.

First, install the Power BI PowerShell module:

Install-Module -Name MicrosoftPowerBIMgmt

Then use the following commands to deploy your master semantic model into multiple workspaces:

Connect-PowerBIServiceAccount
$Path = "C:\Users\xxxx\Desktop\Protest.pbix"
$targetWorkspaceIds = @("xxxx-xxx-xxx-xxx","xxxx-xxx-xxx-xxx","xxxx-xxx-xxx-xxx")
foreach ($targetWorkspaceId in $targetWorkspaceIds) {
    # Publish the PBIX (report plus its dataset) to the target workspace
    New-PowerBIReport -Path $Path -Name "Testaaa" -WorkspaceId $targetWorkspaceId
}

The result returned in PowerShell is as follows:

[Screenshot: PowerShell output of the New-PowerBIReport calls]

Check whether they were deployed successfully in Power BI:

[Screenshot: the report deployed to each target workspace]

You can learn about these PowerShell commands from the following links:

New-PowerBIReport (MicrosoftPowerBIMgmt.Reports) | Microsoft Learn

GitHub - microsoft/powerbi-powershell: PowerShell community for Microsoft PowerBI. Here you will fin...

Since you have more than 100 workspaces, you can use the following PowerShell command to list all workspaces, collect the IDs of the ones you need to deploy to, and then run the loop above against that list:

Get-PowerBIWorkspace -All
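Putting the two pieces together, here is a sketch that lists all workspaces, filters them by name, and deploys in one pass (the "office*" pattern is just an example; adjust it to your naming convention):

```powershell
Connect-PowerBIServiceAccount
$Path = "C:\Users\xxxx\Desktop\Protest.pbix"

# Collect every workspace whose name matches the pattern
$targets = Get-PowerBIWorkspace -All | Where-Object { $_.Name -like "office*" }

foreach ($ws in $targets) {
    # Deploy the PBIX into each matching workspace
    New-PowerBIReport -Path $Path -Name "Testaaa" -WorkspaceId $ws.Id
}
```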

 

 

Best Regards

Jianpeng Li

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

View solution in original post

5 REPLIES 5

Hi jianpeng,

 

Thank you for your answer!

After a lot of errors, googling, and GPT'ing, I ended up with a script that works perfectly fine in PowerShell.
In essence it is exactly what you described.

 

My script does the following while using the Power BI REST API:

1. Exports the master dataset into a memory stream, so it doesn't have to be written to a local folder.

2. Imports the master dataset into a parameterized set of workspaces; all workspaces whose names start with "officeXXX" are selected.

3. Waits for the import to complete and saves the dataset ID.

4. Using the saved dataset ID, changes a parameter in the dataset dynamically based on the workspace name (workspace name = dataset parameter value).
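Steps 3 and 4 could be sketched roughly as follows, using the documented Imports and Update Parameters REST endpoints. This is a hypothetical outline, not the poster's actual script; `$workspaceId`, `$workspaceName`, and `$importId` are assumed to exist from the earlier import step:

```powershell
# Step 3: poll the import until it finishes, then capture the dataset ID
do {
    Start-Sleep -Seconds 5
    $import = Invoke-PowerBIRestMethod -Method Get `
        -Url "groups/$workspaceId/imports/$importId" | ConvertFrom-Json
} while ($import.importState -eq "Publishing")

$datasetId = $import.datasets[0].id

# Step 4: set the parameter to the workspace name
$body = @{
    updateDetails = @(
        @{ name = "clientName"; newValue = $workspaceName }
    )
} | ConvertTo-Json -Depth 3

Invoke-PowerBIRestMethod -Method Post `
    -Url "groups/$workspaceId/datasets/$datasetId/Default.UpdateParameters" `
    -Body $body
```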

 

The last thing I tried was taking over the dataset, but that required OAuth authentication with one of our personal accounts. A better solution we found is a "refresh button" in our main hub (Power Apps) that dynamically refreshes a dataset via a custom Power Automate flow, which the service principal app performs.

 

The final PowerShell script will be committed to a Git repo in Azure DevOps, which will run the .ps1 file in a pipeline.

Hi, @SuperFiets_ 

You are on the right path. As you mentioned, we can combine PowerShell with other platforms to automate flows.

 

Best Regards

Jianpeng Li

Greg_Deckler
Super User

@SuperFiets_ I think all of those methods would probably work, but you may be setting yourself up for a maintenance nightmare. Have you investigated row-level security (RLS) options? That would allow you to have a single semantic model that supports all of your customers, with each customer only able to see their own data.




Unfortunately RLS isn't what we are looking for; we need datasets completely filtered to a specific client, because eventually each dataset will be connected to a set of client-specific templates. Long story, but this is why we are currently looking to get the master dataset deployment working, so we don't have to do updates one by one.

Do you have any experience or tips on how to get any of these methods working? I found a way to clone reports from the master workspace to other workspaces, but unfortunately the dataset won't be cloned.
