

As an ISV building solutions on top of Microsoft's business platforms, our organization needs a well-defined approach to deploying Power BI reports and datasets as part of our continuous delivery systems within Azure DevOps delivery pipelines.

To properly repoint our Power BI reports to a newly deployed dataset, we need the PbiServiceModelId value, which appears to be a unique numeric ID assigned to a dataset. The public Power BI REST API does not return this value, only the dataset's GUID.


Using various tracing tools, I found I could retrieve PbiServiceModelId values through an undocumented API. In the commercial cloud, once I have a valid Power BI context, I can call:


and it returns the ID values (along with other metadata) for every dataset my user has access to.


Unfortunately, we need this to be part of an automated pipeline. For all of our other pipeline work, we use a service principal to call the REST API. The above REST call does not work with a service principal; it returns the error "API is not accessible for application". It only works with an actual Azure user account.


So three questions:

1) Is there a different (more acceptable) way to properly repoint our Power BI reports to a newly deployed dataset?

2) If not, is there a better way to get the PbiServiceModelId for a dataset?

3) If there is no better way, is there a way to make the (undocumented) REST call using a service principal account?


As a final note, we need this to work in both commercial and Gov cloud. I have the URL for the undocumented API in both.

Status: New
Community Support

Hi @psyang,

You may try these ways to get the PbiServiceModelId:


a. Create another Power BI report referencing the dataset you want the ID of, save it, extract it, and look in the Connections file.

b. Use the Power BI REST API (e.g. Get Dataset By ID in Group, in my case) - note that this requires authentication, since the call is not made from your authenticated browser session.

c. Using Chrome or any other developer-friendly browser, open the report list and the Developer Tools, and look in the network responses (in particular, a response from https://*); check the models property for a list of the datasets available in your current workspace.



To get faster and better technical support for this issue, you may also create a support ticket directly.




Frequent Visitor

I can confirm that a) and c) work, but they are not useful in an automated pipeline environment.

I can confirm that b) does not work. The Power BI REST API does not return the model ID. In fact, "Get Dataset By ID in Group" does not exist, only "Get Dataset In Group", which only returns the dataset's GUID, not the model ID.
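For reference, this is roughly what the documented endpoint returns when called via the Power BI cmdlets - a sketch, with the workspace and dataset GUIDs as placeholders; the JSON response contains the dataset's GUID and other metadata, but no numeric model ID field:

Connect-PowerBIServiceAccount
$workspaceId = "[workspace GUID]"
$datasetId = "[dataset GUID]"
# "Get Dataset In Group": the response has id (a GUID), name, configuredBy, etc.
Invoke-PowerBIRestMethod -Url "groups/$workspaceId/datasets/$datasetId" -Method Get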


I have logged a call with Microsoft, but so far their official response ranges from "probably not" to "no" to my three questions.

Not applicable

For anyone else running into this issue for internal organizational use, you can rebind the reports to the new dataset without mucking about with the internals of the PBI report content.



Scenario:

  • You want to promote a dataset and one or more reports from Workspace A (DEV) to Workspace B (QA)
  • You have the dataset and reports in PBIX files

Assumptions:

  • Both workspaces are on the same Power BI tenant
  • The service principal you are using to promote the report has access to both workspaces

Script to promote and rebind the dataset:

# Sign in as the service principal
$tenantId = "[redacted]"
$servicePrincipalId = "[redacted]"
$servicePrincipalKey = "[redacted]"
$servicePrincipalKeySecureString = ConvertTo-SecureString $servicePrincipalKey -AsPlainText -Force
$cred = New-Object -TypeName System.Management.Automation.PSCredential($servicePrincipalId, $servicePrincipalKeySecureString)
Connect-PowerBIServiceAccount -ServicePrincipal -Credential $cred -Tenant $tenantId

# Upload the dataset PBIX and the report PBIX to the QA workspace
$workspaceId = "[QA workspace GUID]"
New-PowerBIReport -WorkspaceId $workspaceId -ConflictAction CreateOrOverwrite -Path "C:\Temp\PBI_Deploy\Dataset.pbix"
$report = New-PowerBIReport -WorkspaceId $workspaceId -ConflictAction CreateOrOverwrite -Path "C:\Temp\PBI_Deploy\Report.pbix"

# Look up the newly uploaded dataset and rebind the report to it
$dataset = Get-PowerBIDataset -Name "Dataset" -WorkspaceId $workspaceId
$body = "{ ""datasetId"": ""$($dataset.Id)"" }"
$rebindUrl = "groups/$workspaceId/reports/$($report.Id)/Rebind"
Invoke-PowerBIRestMethod -Url $rebindUrl -Method Post -Body $body -ContentType 'application/json'



  • The New-PowerBIReport cmdlet uploads the PBIX files that contain the dataset and the bound report, respectively
  • At this stage (and you can confirm this by checking data lineage), the new report in the QA workspace is still linked to the DEV workspace
  • The dataset in the new workspace is read to get its dataset ID (Get-PowerBIDataset)
  • Invoke-PowerBIRestMethod calls the Rebind Report In Group REST API to point the new report to the new dataset
  • Once that is complete, you can use data lineage to check that this worked as expected
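If you'd rather confirm the rebind from the pipeline itself instead of eyeballing data lineage, a sketch (assuming the same $workspaceId, $report and $dataset variables from the script above) is to read the report back and compare its DatasetId:

# Read the report back and check which dataset it is now bound to
$rebound = Get-PowerBIReport -WorkspaceId $workspaceId -Id $report.Id
if ($rebound.DatasetId -eq $dataset.Id) {
    Write-Host "Report is now bound to the QA dataset"
}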
Frequent Visitor

Thanks for the update. Our setup is a little different, unfortunately, and doesn't allow us to use the rebind API: we began with an on-prem solution, so our reports are stored as PBIX files in our source repo.


We have since added a cloud solution, but both the cloud and on-prem solutions are sourced from the same repo. This means the PBIX file is a local file using an SSAS connection, not a file in a Power BI workspace using a dataset. Under this scenario, we cannot use the rebind API: we would first have to upload the file to a workspace, which requires a valid dataset to already be set.


We do have a solution, but it currently involves using some undocumented features. Our only real alternative is to maintain two versions of each report - one for on-prem stored in our repo, and one for cloud stored in a workspace. This is what we were trying to avoid.

Not applicable

Hey PSYang, understood! We have the same problem with some cross-tenant reports, where we can't upload the report from a PBIX because the embedded PBIX uses a dataset in a different tenant. To solve that, we also use the undocumented approach: a PowerShell script rewrites the connection string in the report file to point it to the new dataset before deployment.
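For anyone curious, the rewrite itself doesn't need anything undocumented beyond the field values: a .pbix is a ZIP archive, and Connections is a JSON part inside it. A minimal sketch - the file path and the exact layout of the Connections part are assumptions, and $newModelId / $newDatabaseGuid must be obtained separately as discussed in this thread:

Add-Type -AssemblyName System.IO.Compression.FileSystem
$pbixPath = "C:\Temp\PBI_Deploy\Report.pbix"   # assumed path
$zip = [System.IO.Compression.ZipFile]::Open($pbixPath, "Update")

# Read the Connections part out of the archive
$entry = $zip.GetEntry("Connections")
$reader = New-Object System.IO.StreamReader($entry.Open())
$json = $reader.ReadToEnd() | ConvertFrom-Json
$reader.Dispose()

# Field names are the ones mentioned in this thread; the surrounding structure is an assumption
$json.Connections[0].PbiServiceModelId    = $newModelId
$json.Connections[0].PbiModelDatabaseName = $newDatabaseGuid

# Replace the entry with the updated JSON
$entry.Delete()
$writer = New-Object System.IO.StreamWriter($zip.CreateEntry("Connections").Open())
$writer.Write(($json | ConvertTo-Json -Depth 10))
$writer.Dispose()
$zip.Dispose()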


I voted for your idea to give us an API to get the model ID. I think the simplest fix would be for Microsoft to allow us to upload reports with missing datasets, and then let us use the dataset rebind API to point them to the correct dataset in the local workspace. Reports can already be deployed with missing credentials, so this would be consistent with the current workflow (deploy the report, then fix dataset problems in the Power BI portal afterwards).

New Member

@psyang Hi, I came across your post while trying to solve the same problem. We're using Azure DevOps to auto-deploy PBIX files from an Azure Repo. We tried to split the dataset out into its own file and discovered it's not easy to point a report to a different dataset. Did you just automate altering the Connections file? How were you able to automate looking up the ‘Initial Catalog’, ‘PbiServiceModelId’, and ‘PbiModelDatabaseName’?

Frequent Visitor

@eits79 We found we only needed to get PbiServiceModelId and PbiModelDatabaseName (Initial Catalog is the same as PbiModelDatabaseName).


PbiModelDatabaseName is a GUID. When we upload our model (stored in a .bim file) to the workspace as a dataset using TOM, the TOM.Database object representing the dataset has an ID property, which is the PbiModelDatabaseName GUID.
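A sketch of that lookup step, assuming TOM (the Microsoft.AnalysisServices.Tabular library) is available locally, the workspace's XMLA endpoint is enabled, and the service principal has XMLA access - the assembly path, workspace name, and dataset name below are placeholders:

Add-Type -Path "C:\Libs\Microsoft.AnalysisServices.Tabular.dll"   # assumed location of the TOM assembly
$server = New-Object Microsoft.AnalysisServices.Tabular.Server

# Connect to the workspace's XMLA endpoint as a service principal
$server.Connect("DataSource=powerbi://api.powerbi.com/v1.0/myorg/QA;User ID=app:$appId@$tenantId;Password=$appSecret")

$db = $server.Databases.GetByName("Dataset")
$pbiModelDatabaseName = $db.ID   # this ID property is the PbiModelDatabaseName GUID
$server.Disconnect()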


The PbiServiceModelId is trickier. We ended up finding and using an undocumented REST API that returns the PbiServiceModelId for every dataset in a workspace. I don't feel comfortable sharing details of the API because it is undocumented, but here's how we discovered it.


We found that if we set PbiServiceModelId to 1 in the report, then opened the report in Power BI Desktop and saved it, the report was somehow updated with the correct PbiServiceModelId value. Using that as a clue, we ran Fiddler to see what REST API calls Power BI Desktop made while opening the report. We captured the REST API URL and, by playing around with the headers a little, were able to reliably replicate the call. The API needs to be run by a Power BI user account, and it returns the PbiServiceModelId for every report within the workspaces that the running user has access to.


Ideally, instead of an actual user, we wanted to execute the API using a service principal, which would let us automate the whole process without specifying an actual user's credentials. This unfortunately does not work.


We raised a support call with MS and had several discussions with MS engineers. They acknowledged the API, said it would likely not go away (but you never know!), and said it cannot (and will not) be callable by a service principal. So for now we need to specify a user's credentials during this process.


That is where we are now. If the API goes away, we will be forced to manage our cloud solution's reports in a workspace instead of in our repo.

New Member

@psyang Thank you, I really appreciate the response. This was very helpful. I was able to automate changing the referenced shared dataset in the Azure DevOps release pipeline with PowerShell. I did have one occurrence where I ended up with two reports with the same name in the same workspace. I can't reproduce it, so I might have had an old manually published report in there.


Do you know the significance of RemoteArtifacts.DatasetId and RemoteArtifacts.ReportId in the Connections file? It seems to work even if I use the same values for different reports and datasets.