Hello everyone,
In a Power BI workspace with Git integration set up, I have a "master" dataset that we update constantly and that also serves as the basis for different client datasets (datasets, not reports!). This master dataset has a parameter called 'clientName' that is unique for every client.
Our goal is to do the following:
1. Deploy the master dataset to multiple (more than 100) workspaces.
2. Change the 'clientName' accordingly when deploying the master dataset per workspace (clientName parameter = workspace name).
Deployment pipelines would have been an obvious solution for this exact issue, but for some reason a workspace can only be assigned to a pipeline once... Otherwise we would have simply automated creating 100 deployment pipelines from master -> client. But once the master workspace is used, it is (as far as I can see) not possible to reuse it in another deployment pipeline.
Could someone help me figure out how to achieve this? I have a few options, but I need some guidance on which solution is best and workable, since I don't have much experience with any of the options I found.
1. Use a PowerShell script that clones the master dataset for each workspace and changes the parameter with the Power BI REST API.
2. Use Azure DevOps pipelines, though I'm not really familiar with how to code these pipelines in Azure DevOps.
3. Use a C# script in Tabular Editor 2 to deploy the model. But since I do not have much experience with C#, this is kind of hard to grasp.
Has anyone faced a similar issue? Any help would be so much appreciated!
Thank you Greg_Deckler
Hi, @SuperFiets_
We can use PowerShell commands to deploy the semantic model into multiple workspaces, and, as you described, after deployment you can use the Power BI REST API to change the parameters.
In the following, I'll walk you through deploying a PBIX file into multiple workspaces:
First, we need to install the Power BI PowerShell module:
Install-Module -Name MicrosoftPowerBIMgmt
Use the following command to deploy your Main Semantic Model into multiple workspaces:
Connect-PowerBIServiceAccount
$Path = "C:\Users\xxxx\Desktop\Protest.pbix"
$targetWorkspaceIds = ("xxxx-xxx-xxx-xxx","xxxx-xxx-xxx-xxx","xxxx-xxx-xxx-xxx")
foreach($targetWorkspaceId in $targetWorkspaceIds){
    New-PowerBIReport -Path $Path -Name "Testaaa" -WorkspaceId $targetWorkspaceId
}
You can learn more about these PowerShell commands at the following link:
New-PowerBIReport (MicrosoftPowerBIMgmt.Reports) | Microsoft Learn
Since you have more than 100 workspaces, you can use the following PowerShell command to list all workspaces, collect the IDs of the ones that need the deployment, and then run the command above against each of them.
Get-PowerBIWorkspace -All
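For example, the listing and deployment steps can be combined into one script. This is just a sketch: the "office" name prefix and the report name "Testaaa" are placeholders you would replace with your own values, and it assumes you are already signed in.

```powershell
# Sign in (opens an interactive login prompt)
Connect-PowerBIServiceAccount

$Path = "C:\Users\xxxx\Desktop\Protest.pbix"

# List all workspaces and keep only those whose name starts with the
# client prefix ("office" here is a placeholder for your own prefix)
$targetWorkspaces = Get-PowerBIWorkspace -All |
    Where-Object { $_.Name -like "office*" }

# Deploy the master PBIX into each matching workspace
foreach ($workspace in $targetWorkspaces) {
    New-PowerBIReport -Path $Path -Name "Testaaa" -WorkspaceId $workspace.Id
}
```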
Best Regards
Jianpeng Li
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Hi jianpeng,
Thank you for your answer!
After a lot of errors, Googling, and GPT'ing, I got myself a script that works perfectly fine in PowerShell.
But in essence it is exactly how you described.
My script does the following using the Power BI REST API:
1. Exports the master dataset into a memory stream, so it doesn't have to be saved to a local folder.
2. Imports the master dataset into a parameterized set of workspaces; all workspaces starting with "officeXXX" are selected.
3. Waits for the import to complete and saves the dataset ID.
4. Uses the saved dataset ID to change a parameter in the dataset dynamically based on the workspace name (workspace name = dataset parameter value).
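For anyone landing here later, step 4 can be sketched with the documented Datasets `Default.UpdateParameters` REST endpoint. This is a rough sketch, not my full script: it assumes an active session from `Connect-PowerBIServiceAccount`, and the variables `$workspace` (with `.Id` and `.Name`) and `$datasetId` are placeholders for values captured during the import steps.

```powershell
# Build the UpdateParameters request body: set the 'clientName'
# parameter to the name of the workspace the dataset lives in
$body = @{
    updateDetails = @(
        @{ name = "clientName"; newValue = $workspace.Name }
    )
} | ConvertTo-Json -Depth 3

# Call the Datasets - Update Parameters In Group endpoint.
# Note: the change only takes effect after the dataset is refreshed.
Invoke-PowerBIRestMethod `
    -Url "groups/$($workspace.Id)/datasets/$datasetId/Default.UpdateParameters" `
    -Method Post -Body $body
```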
The last thing I tried was taking over the dataset, but that required OAuth authentication with one of our personal accounts. A better solution we found is implementing a "refresh button" in our main Hub (PowerApps) that dynamically refreshes a dataset through a custom Power Automate flow, which the service principal app will perform.
The final PowerShell script will be committed to a Git repo in Azure DevOps, which will run the .ps1 file in a pipeline.
Hi, @SuperFiets_
You are on the right path. As you mentioned, we can combine PowerShell with other platforms to automate flows.
Best Regards
Jianpeng Li
@SuperFiets_ I think that all of those methods would probably work but I think you are probably setting yourself up for a nightmare in terms of maintenance. Have you investigated row level security options? That would allow you to have a single semantic model that supports all of your customers and your customers would only be able to see their data.
Unfortunately, RLS isn't what we are looking for; we need datasets completely filtered to a specific client, because eventually each dataset will be connected to a set of client-specific templates. Long story, but this is why we are currently looking to get the master dataset deployment working, so we don't have to do updates one by one.
Do you have any experience or tips on how to get any of the methods working? I found a way to clone reports from the master workspace to other workspaces, but unfortunately the dataset won't be cloned.