As a Power BI administrator, it's always difficult to monitor the on-premises gateways within our organization, especially when the number of gateways has been growing rapidly. Today I'm going to explain how you can effectively administer and monitor those gateways in Power BI itself, along with lots of tips and tricks.
The Visio diagram below shows the architecture of our entire gateway monitoring system.
The very first step is to have a service account with a Pro license, because we are going to call some APIs. Instead of using your personal account, you should use a dedicated service account with a Pro license.
Once you have the service account, add it to all of your gateways as an administrator.
Tip: Instead of adding the service account directly as a gateway admin, create a Distribution List (group) and add the service account to the DL, for ease of management.
In order to call the APIs we also need an Application ID (Client ID), and for that you need the API permissions shown below.
Now we have the service account and the Application ID; with these we are ready to generate the bearer token and call the API.
We are going to automate the generation of the bearer token used to call the API. The best way to do this is PowerShell. We could also use a Power BI M query to generate the bearer token, but in that case we would have to put the service account username and password inside the report, which is insecure because the credentials would be exposed within the report.
Below is the PowerShell script I've used to generate the bearer token and save it to a text file.
Function Get-AADToken {
    Param(
        [parameter(Mandatory = $true)][string]$Username,
        [parameter(Mandatory = $true)][string]$Password,
        [parameter(Mandatory = $true)][guid]$ClientId,
        [parameter(Mandatory = $true)][string]$path,
        [parameter(Mandatory = $true)][string]$fileName
    )

    # Force TLS 1.2 - older machines default to an old TLS version and the call fails
    [Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

    $authorityUrl = "https://login.microsoftonline.com/common/oauth2/authorize"
    $SecurePassword = $Password | ConvertTo-SecureString -AsPlainText -Force

    # Load the Active Directory Authentication Library (ADAL) client DLL
    $typePath = "C:\Jay\works\powershell\Microsoft.IdentityModel.Clients.ActiveDirectory.dll"
    Add-Type -Path $typePath
    Write-Verbose "Loaded the Microsoft.IdentityModel.Clients.ActiveDirectory.dll"
    Write-Verbose "Using authority: $authorityUrl"

    # Acquire a token for the Power BI REST API using the service account credentials
    $authContext = New-Object -TypeName Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext -ArgumentList ($authorityUrl)
    $credential  = New-Object -TypeName Microsoft.IdentityModel.Clients.ActiveDirectory.UserCredential -ArgumentList ($UserName, $SecurePassword)
    Write-Verbose "Trying to acquire token for resource: https://analysis.windows.net/powerbi/api"
    $authResult = $authContext.AcquireToken("https://analysis.windows.net/powerbi/api", $clientId, $credential)
    Write-Verbose "Authentication result retrieved for: $($authResult.UserInfo.DisplayableId)"

    # Write the access token to a text file so the Power BI report can pick it up
    New-Item -path $path -Name $fileName -Value $authResult.AccessToken -ItemType file -Force
    return "Successfully written to the file"
}
Tip: I've used the command [Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12, which forces the session to use TLS 1.2. If you don't include it, the script will fail on some older machines that still default to an old TLS version.
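To see what your session currently negotiates, and to force TLS 1.2 before calling the API, you can run something like this:

# Show which protocols the current PowerShell session will use for web requests
[Net.ServicePointManager]::SecurityProtocol

# Force TLS 1.2 for all subsequent requests in this session
# (put this near the top of any script that calls the Power BI REST API)
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12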
You can find the full code along with the DLL here.
The path to the DLL is hardcoded in the above script; change it to match your own location.
You can find the DLL in the attachments.
Create a new task in Task Scheduler to run the above PowerShell script every hour, since the bearer token expires after one hour.
In the Action, choose PowerShell, and for the arguments give a command like the one below:
-executionpolicy bypass -command "& {. C:\Jay\works\powershell\PowerBI.ps1; Get-AADToken -Username 'xxxx@xx.com' -Password 'xxxyyyzzz' -ClientId 'xxxx11-xx11-x1-x12-xxxx' -path 'C:\Jay\works\pbi\Gateway_Report' -fileName 'Bearer.txt'}" -WindowStyle Hidden
In this command you supply your own username, password, Client ID, output path and file name.
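If you prefer to create the task from PowerShell rather than the Task Scheduler UI, here is a rough sketch using the ScheduledTasks cmdlets. The task name and the ten-year repetition window are my own choices, and the paths and credentials are the same placeholders as above:

# Sketch: register a task that refreshes the bearer token every hour.
# Replace the placeholder paths, credentials and Client ID with your own values.
$action = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument '-ExecutionPolicy Bypass -WindowStyle Hidden -Command "& { . C:\Jay\works\powershell\PowerBI.ps1; Get-AADToken -Username ''xxxx@xx.com'' -Password ''xxxyyyzzz'' -ClientId ''xxxx11-xx11-x1-x12-xxxx'' -path ''C:\Jay\works\pbi\Gateway_Report'' -fileName ''Bearer.txt'' }"'

# Repeat every hour, effectively indefinitely (capped at ~10 years here)
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) `
    -RepetitionInterval (New-TimeSpan -Hours 1) `
    -RepetitionDuration (New-TimeSpan -Days 3650)

Register-ScheduledTask -TaskName "PBI Gateway Bearer Token Refresh" -Action $action -Trigger $trigger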
Now we are all set to create our report in Power BI.
This is the most interesting and important part, where we are going to call the API directly in our Power BI report. Let's get started with a blank query.
Most of our work will be in the Advanced Editor, where we will define our custom M queries.
Trick: Here I'm going to show you a special API which you won't find in the Power BI REST API documentation. The API is:
https://api.powerbi.com/v2.0/myorg/me/gatewayclusters?$expand=memberGateways&$skip=0
This API gives you each main gateway cluster and its corresponding member gateways.
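Before wiring this into the report, you can sanity-check the token and the endpoint from PowerShell. A minimal sketch, assuming the Bearer.txt file produced by the scheduled script above:

# Read the bearer token saved by the scheduled PowerShell script
$token   = (Get-Content -Path "C:\Jay\works\pbi\Gateway_Report\Bearer.txt" -Raw).Trim()
$headers = @{ Authorization = "Bearer $token" }

# Undocumented endpoint: returns each gateway cluster and its member gateways
$url = 'https://api.powerbi.com/v2.0/myorg/me/gatewayclusters?$expand=memberGateways&$skip=0'
$clusters = (Invoke-RestMethod -Uri $url -Headers $headers -Method Get).value

# Quick look at what came back
$clusters | Select-Object id, name | Format-Table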
Now let's continue in the M query editor in Power BI Desktop. Before writing the code, let's lay out the steps needed to achieve our goal:
1. We need to call the Gateway API
2. In order to call the API we need the bearer token, which we already have in a text file (Bearer.txt) thanks to the PowerShell script automated through Task Scheduler
The code for this is:
let
    TokenType = "Bearer",
    Token = Text.FromBinary(File.Contents("C:\Jay\works\pbi\Gateway_Report\Bearer.txt")),
    Source = Json.Document(Web.Contents("https://api.powerbi.com/v2.0/myorg/me/gatewayclusters?$expand=memberGateways&$skip=0", [Headers=[Authorization=TokenType & " " & Token]])),
    value = Source[value],
    #"Converted to Table" = Table.FromList(value, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"id", "name", "dataSourceIds", "type", "options", "memberGateways"}, {"id", "name", "dataSourceIds", "type", "options", "memberGateways"}),
    #"Expanded options" = Table.ExpandRecordColumn(#"Expanded Column1", "options", {"CloudDatasourceRefresh", "CustomConnectors"}, {"CloudDatasourceRefresh", "CustomConnectors"})
in
    #"Expanded options"
Isn't that cool!
Now we have the full list of gateway clusters. But wait... with this list we only have the gateway details; we still can't get the current gatewayStatus we are looking for!
Trick: If you look at the Power BI Gateways API documentation, it wrongly states that this API returns the status, but in reality it does not give you the status of the gateway. I've had this confirmed officially on the Power BI community forum.
In order to get the current gateway status we need to call another API, Gateways - Get Gateway. Unfortunately this call has to be made for each and every gateway separately, not as a bulk operation.
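If you want to see this behaviour outside Power BI first, here is a quick PowerShell sketch. It reuses the $clusters and $headers variables from the earlier snippet and just prints each gateway's status:

# Call Gateways - Get Gateway once per gateway id to read gatewayStatus
foreach ($cluster in $clusters) {
    $gw = Invoke-RestMethod -Uri ("https://api.powerbi.com/v1.0/myorg/gateways/" + $cluster.id) -Headers $headers -Method Get
    # gatewayStatus only appears in this per-gateway response, not in the list APIs
    "{0} : {1}" -f $gw.name, $gw.gatewayStatus
}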
Let's continue our coding. The code below calls the Get Gateway API for every gateway id we obtained from the previous result:
#"Add Column" = Table.AddColumn(#"Expanded options", "DataFromURLsColumn", each Json.Document(Web.Contents("https://api.powerbi.com/v1.0/myorg/",[RelativePath= "gateways/" & [id] , Headers=[Authorization=TokenType & " " & Token]])))
Putting it all together, our code now looks like this:
let
    TokenType = "Bearer",
    Token = Text.FromBinary(File.Contents("C:\Jay\works\pbi\Gateway_Report\Bearer.txt")),
    Source = Json.Document(Web.Contents("https://api.powerbi.com/v2.0/myorg/me/gatewayclusters?$expand=memberGateways&$skip=0", [Headers=[Authorization=TokenType & " " & Token]])),
    value = Source[value],
    #"Converted to Table" = Table.FromList(value, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"id", "name", "dataSourceIds", "type", "options", "memberGateways"}, {"id", "name", "dataSourceIds", "type", "options", "memberGateways"}),
    #"Expanded options" = Table.ExpandRecordColumn(#"Expanded Column1", "options", {"CloudDatasourceRefresh", "CustomConnectors"}, {"CloudDatasourceRefresh", "CustomConnectors"}),
    #"Add Column" = Table.AddColumn(#"Expanded options", "DataFromURLsColumn", each Json.Document(Web.Contents("https://api.powerbi.com/v1.0/myorg/", [RelativePath = "gateways/" & [id], Headers=[Authorization=TokenType & " " & Token]])))
in
    #"Add Column"
Tip: In order to identify the gateway environment (Production / non-Production), you can specify it in the gatewayDepartment field while configuring the gateway itself; we will use this later to create filters in our report.
Now, after doing some cleanup in the modeling with the code below:
let
    TokenType = "Bearer",
    Token = Text.FromBinary(File.Contents("C:\Jay\works\pbi\Gateway_Report\Bearer.txt")),
    Source = Json.Document(Web.Contents("https://api.powerbi.com/v2.0/myorg/me/gatewayclusters?$expand=memberGateways&$skip=0", [Headers=[Authorization=TokenType & " " & Token]])),
    value = Source[value],
    #"Converted to Table" = Table.FromList(value, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"id", "name", "options", "memberGateways"}, {"id", "name", "options", "memberGateways"}),
    #"Expanded options" = Table.ExpandRecordColumn(#"Expanded Column1", "options", {"CloudDatasourceRefresh", "CustomConnectors"}, {"CloudDatasourceRefresh", "CustomConnectors"}),
    #"Add Column" = Table.AddColumn(#"Expanded options", "DataFromURLsColumn", each Json.Document(Web.Contents("https://api.powerbi.com/v1.0/myorg/", [RelativePath = "gateways/" & [id], Headers=[Authorization=TokenType & " " & Token]]))),
    #"Expanded DataFromURLsColumn" = Table.ExpandRecordColumn(#"Add Column", "DataFromURLsColumn", {"gatewayAnnotation", "gatewayStatus"}, {"gatewayAnnotation", "gatewayStatus"}),
    #"Parsed JSON" = Table.TransformColumns(#"Expanded DataFromURLsColumn", {{"gatewayAnnotation", Json.Document}}),
    #"Expanded gatewayAnnotation" = Table.ExpandRecordColumn(#"Parsed JSON", "gatewayAnnotation", {"gatewayDepartment"}, {"gatewayDepartment"}),
    #"Replaced Value" = Table.ReplaceValue(#"Expanded gatewayAnnotation", null, "", Replacer.ReplaceValue, {"gatewayDepartment"}),
    #"Added Conditional Column" = Table.AddColumn(#"Replaced Value", "ImageURL", each if [gatewayStatus] = "NotReachable" then "https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcSWRknHmZCepToQHUUSlVIiNzZMEX3ALW38FYzdON9P6USruQX-BA" else "https://www.justenergy.ie/portals/1/Images/Help%20and%20Support/GreenStar_ServiceStatus_Final-03.jpg?ver=2017-09-01-144013-867"),
    #"Replaced Value1" = Table.ReplaceValue(#"Added Conditional Column", "", "NON PRODUCTION", Replacer.ReplaceValue, {"gatewayDepartment"}),
    #"Renamed Columns" = Table.RenameColumns(#"Replaced Value1", {{"gatewayDepartment", "Environment"}}),
    #"Changed Type" = Table.TransformColumnTypes(#"Renamed Columns", {{"CustomConnectors", type logical}, {"CloudDatasourceRefresh", type logical}})
in
    #"Changed Type"
Now I'm going to group the queries based on the data. I've named this query ClusterGateways because this information is at the cluster level. Then I've created another query, referencing ClusterGateways, named Gateways, because this one will tell us how many nested/child gateways exist within each main cluster.
Now let me do some modeling in the Gateways query with the code below.
let
    Source = ClusterGateways,
    #"Removed Columns" = Table.RemoveColumns(Source, {"gatewayStatus", "Environment", "CustomConnectors", "CloudDatasourceRefresh"}),
    #"Renamed Columns" = Table.RenameColumns(#"Removed Columns", {{"name", "Clustername"}, {"id", "Clusterid"}}),
    #"Expanded memberGateways" = Table.ExpandListColumn(#"Renamed Columns", "memberGateways"),
    #"Expanded memberGateways1" = Table.ExpandRecordColumn(#"Expanded memberGateways", "memberGateways", {"name", "version", "annotation"}, {"name", "version", "annotation"}),
    #"Parsed JSON" = Table.TransformColumns(#"Expanded memberGateways1", {{"annotation", Json.Document}}),
    #"Expanded annotation" = Table.ExpandRecordColumn(#"Parsed JSON", "annotation", {"gatewayMachine"}, {"gatewayMachine"}),
    #"Uppercased Text" = Table.TransformColumns(#"Expanded annotation", {{"gatewayMachine", Text.Upper, type text}})
in
    #"Uppercased Text"
Now we have completed the modeling part. We'll cover visualization, publishing, and configuring alerts in the flow in the next part... Stay tuned for the second part!