Hello everyone!
We use the Power BI REST API, called from a Python script, to deploy reports to workspaces, with a service principal as the authorization method. After deployment the semantic model must be refreshed, but this step fails with an error about insufficient credentials.
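To give an idea, the refresh step of our script looks roughly like this (simplified sketch; the IDs and service principal details below are placeholders, not our real values):

import msal
import requests

TENANT_ID = "<tenant-id>"                      # placeholders, not our real values
CLIENT_ID = "<service-principal-client-id>"
CLIENT_SECRET = "<service-principal-secret>"
WORKSPACE_ID = "<workspace-id>"
DATASET_ID = "<dataset-id>"

# Acquire an app-only token for the Power BI REST API with the service principal
app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(
    scopes=["https://analysis.windows.net/powerbi/api/.default"]
)

# Trigger a refresh of the deployed semantic model
refresh_url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/datasets/{DATASET_ID}/refreshes"
)
resp = requests.post(refresh_url, headers={"Authorization": f"Bearer {token['access_token']}"})
resp.raise_for_status()  # this is the step that fails for us with a credentials error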
What we tried: the Power BI REST API has several methods for changing a dataset,
for example https://learn.microsoft.com/en-us/rest/api/power-bi/datasets/update-parameters-in-group or https://learn.microsoft.com/en-us/rest/api/power-bi/datasets/update-datasources-in-group
(the latter method is not suitable for updating Azure Blob credentials). But the information in these methods' documentation is not enough: there is nothing about the list and format of options for changing credentials.
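For illustration, Update Parameters In Group only accepts parameter name/value pairs like in the sketch below (the parameter name and value here are placeholders), and we could not find anything similar for credentials:

import requests

ACCESS_TOKEN = "<service principal token, acquired as above>"
WORKSPACE_ID = "<workspace-id>"
DATASET_ID = "<dataset-id>"

# Update Parameters In Group: the body only takes name/newValue pairs for dataset parameters
url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/datasets/{DATASET_ID}/Default.UpdateParameters"
)
body = {"updateDetails": [{"name": "StorageAccountName", "newValue": "ourblob"}]}

resp = requests.post(url, json=body, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
resp.raise_for_status()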
So the question is: how can we update the credentials for the Azure Blob data source after deployment, using service principal authorization?
Which direction should we look in, or are there any examples of changing credentials this way?
Thank you for your attention!
Hi @samuelstinger ,
You could use Anonymous authentication and hard-code your credentials in your M query / Power Query, but I don't recommend it; we don't do this at our organisation.
I hope it will be helpful.
Thanks,
Sai Teja
Hi @SaiTejaTalasila!
Thanks for your reply!
Could you give a small example of how to do this for the Azure Blobs case?
For your information, this is the code we use to connect to the data:
let
    // connect to the storage account and pick the "deploytest" container
    Source = AzureStorage.Blobs("ourblob"),
    deploytest1 = Source{[Name="deploytest"]}[Data],
    // drop hidden blobs first, while the Attributes column is still available
    #"Filtered Hidden Files1" = Table.SelectRows(deploytest1, each [Attributes]?[Hidden]? <> true),
    #"Removed Columns" = Table.RemoveColumns(#"Filtered Hidden Files1", {"Extension", "Date accessed", "Date modified", "Date created", "Attributes", "Folder Path"}),
    // keep only the blobs whose name contains the client id parameter
    #"Filtered Rows1" = Table.SelectRows(#"Removed Columns", each Text.Contains([Name], _CLIENT_ID_)),
    #"Removed Columns1" = Table.RemoveColumns(#"Filtered Rows1", {"Name"}),
    // parse each remaining file with the generated "Transform File" helper query
    #"Invoke Custom Function1" = Table.AddColumn(#"Removed Columns1", "Transform File", each #"Transform File"([Content])),
    #"Removed Other Columns1" = Table.SelectColumns(#"Invoke Custom Function1", {"Transform File"}),
    #"Expanded Table Column1" = Table.ExpandTableColumn(#"Removed Other Columns1", "Transform File", Table.ColumnNames(#"Transform File"(#"Sample File"))),
    #"Changed Type" = Table.TransformColumnTypes(#"Expanded Table Column1", {{"Column1", type text}}),
    #"Promoted Headers" = Table.PromoteHeaders(#"Changed Type", [PromoteAllScalars=true]),
    #"Changed Type1" = Table.TransformColumnTypes(#"Promoted Headers", {{"data", Int64.Type}})
in
    #"Changed Type1"
I don't quite understand how this should be done.
Thanks!
Hi @samuelstinger ,
It will look something like this:
let
    StorageAccountName = "yourstorageaccountname",
    AccountKey = "youraccountkey",
    Source = AzureStorage.Blobs("https://" & StorageAccountName & ".blob.core.windows.net/", [ApiKeyName="AccountKey", ApiKeyValue=AccountKey]),
    deploytest1 = Source{[Name="deploytest"]}[Data],
    #"Removed Columns" = Table.RemoveColumns(deploytest1, {"Extension", "Date accessed", "Date modified", "Date created", "Attributes", "Folder Path"})
    // ...continue with the rest of your steps as before
in
    #"Removed Columns"
Thanks,
Sai Teja
Oh, @SaiTejaTalasila, thanks for your reply!
Now I tried to apply your approach and got an error:
The MS documentation says the following:
And I suppose it is not possible...
Next I tried to use a SAS key and got another error.
I was surprised and don't know what else to try...
Any ideas how to get out of this?