Hi,
Our team currently extracts the audit logs from the Power BI Admin Portal manually.
Is there a way to automate this so that all the users who accessed the reports are extracted on a daily basis?
Can someone suggest a better solution in which the audit logs are extracted automatically on a regular basis?
Hello @Satyasai,
To fully automate this, you should create an app user (service principal). Here is a guide on how to do it:
https://www.sqlbi.com/articles/creating-a-service-principal-account-for-power-bi-api
When you have created the service principal, you will have a tenant ID, an app ID and a secret.
Then you can use this script to extract yesterday's log:
# Power BI Activity Events
# Parameters
$TenantId = "put your tenant id here"
$AppId    = "put your app id here"    # Service Principal (application) ID
$Secret   = "put your secret here"    # Secret from the Service Principal

# Connect with the Service Principal
$password = ConvertTo-SecureString $Secret -AsPlainText -Force
$Creds    = New-Object PSCredential $AppId, $password
Connect-PowerBIServiceAccount -ServicePrincipal -Credential $Creds -Tenant $TenantId
$headers = Get-PowerBIAccessToken    # only needed if you also want to call the REST API directly

# Loop over yesterday only (1..1) and write the day's events to a JSON file
1..1 |
foreach {
    $Date      = (((Get-Date).Date).AddDays(-$_))
    $StartDate = (Get-Date -Date ($Date) -Format yyyy-MM-ddTHH:mm:ss)
    $EndDate   = (Get-Date -Date ((($Date).AddDays(1)).AddMilliseconds(-1)) -Format yyyy-MM-ddTHH:mm:ss)

    Get-PowerBIActivityEvent -StartDateTime $StartDate -EndDateTime $EndDate -ResultType JsonString |
        Out-File -FilePath "C:\Marius\PowerBI_AuditLog_$(Get-Date -Date $Date -Format yyyyMMdd).json"
}
If you want to take out the last 7 days, you only need to change the loop range from 1..1 to 1..7:
# Power BI Activity Events
# Parameters
$TenantId = "put your tenant id here"
$AppId    = "put your app id here"    # Service Principal (application) ID
$Secret   = "put your secret here"    # Secret from the Service Principal

# Connect with the Service Principal
$password = ConvertTo-SecureString $Secret -AsPlainText -Force
$Creds    = New-Object PSCredential $AppId, $password
Connect-PowerBIServiceAccount -ServicePrincipal -Credential $Creds -Tenant $TenantId
$headers = Get-PowerBIAccessToken    # only needed if you also want to call the REST API directly

# Loop over the last 7 days (1..7), one JSON file per day
1..7 |
foreach {
    $Date      = (((Get-Date).Date).AddDays(-$_))
    $StartDate = (Get-Date -Date ($Date) -Format yyyy-MM-ddTHH:mm:ss)
    $EndDate   = (Get-Date -Date ((($Date).AddDays(1)).AddMilliseconds(-1)) -Format yyyy-MM-ddTHH:mm:ss)

    Get-PowerBIActivityEvent -StartDateTime $StartDate -EndDateTime $EndDate -ResultType JsonString |
        Out-File -FilePath "C:\Marius\PowerBI_AuditLog_$(Get-Date -Date $Date -Format yyyyMMdd).json"
}
I think the limit is 30 days.
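To have either script run unattended every day, you could register it as a Windows scheduled task. A minimal sketch, assuming the script above is saved to a file (the path, task name and start time below are placeholders):

# Sketch only: registers a daily task at 02:00 that runs the saved export script.
# The script path and task name are placeholders - adjust to your environment.
$action  = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\Marius\Export-PowerBIActivityLog.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName "PowerBI Activity Log Export" -Action $action -Trigger $trigger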
If you don't want to use a service principal, you need to log in every time you fetch the files. In that case, use Connect-PowerBIServiceAccount interactively, as @m-colbert mentioned in the earlier posts.
Here you can read how to convert JSON to CSV if that is important for you:
https://www.techcartnow.com/powershell-script-to-convert-complex-nested-json-to-csv-file-format
BR
Marius
I am thinking of including the "write back" code to SQL within the same PowerShell script, like this, but I think I'm missing something with the syntax. I'm not sure of the best way to handle all of the processing in PowerShell.
$activities |
    Select-Object @{Label='UpdtDt';  Expression={ $_.CreationTime.replace("T", " ").replace("Z", " ") }},
                  @{Label='ShareDt'; Expression={ $_.CreationTime.replace("T", " ").replace("Z", " ") }},
                  UserID, ItemName, CapacityName, WorkspaceName, Operation, SharingInformation |
    Write-SqlTableData -ServerInstance "serverhostname\MS7SQL1D,49001" -DatabaseName "Audit_DB" -SchemaName "dbo" -TableName "PowerBI_Activity_Log_Master"
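For what it's worth, a hedged sketch of a variant that is sometimes needed with Write-SqlTableData (SqlServer module): the destination table has to exist with matching columns unless -Force is supplied, and a nested object such as SharingInformation usually needs to be flattened to a string before it can map to a column. Whether this is what is missing above is an assumption, not something confirmed here:

# Sketch only, reusing the connection details from the post above.
# -Force lets Write-SqlTableData create the table if it does not exist;
# SharingInformation is flattened to a JSON string so it fits in a single column.
$activities |
    Select-Object @{Label='UpdtDt'; Expression={ $_.CreationTime.Replace("T", " ").Replace("Z", "") }},
                  UserId, ItemName, CapacityName, WorkspaceName, Operation,
                  @{Label='SharingInformation'; Expression={ $_.SharingInformation | ConvertTo-Json -Compress }} |
    Write-SqlTableData -ServerInstance "serverhostname\MS7SQL1D,49001" -DatabaseName "Audit_DB" `
                       -SchemaName "dbo" -TableName "PowerBI_Activity_Log_Master" -Force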
Hi @mariussve1
Our requirement is to extract the audit logs using PowerShell scripting, so the extracted CSV can be placed in a network directory.
Can you please help with the PowerShell scripting to extract the audit logs?
I struggled a bit because the given link handles nested JSON content, while my JSON content has a flat structure, so I ended up with the lines below. I see there are variants where some need to explicitly use the .results property and some don't.
$pathToJsonFile = "c:\Users\Xmu\output.json"
$pathToOutputFile = "c:\Users\Xmu\POWERBIUSAGE.csv"
((Get-Content -Path $pathToJsonFile -Raw) | ConvertFrom-Json) | ConvertTo-Csv -NoTypeInformation | Set-Content $pathToOutputFile
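A small hedged sketch that covers both variants mentioned above; whether your file wraps the events in a results property is an assumption you can check by opening the file:

# Sketch: handle both shapes - a plain array of events, or an object with a 'results' wrapper.
$pathToJsonFile   = "c:\Users\Xmu\output.json"
$pathToOutputFile = "c:\Users\Xmu\POWERBIUSAGE.csv"

$parsed = (Get-Content -Path $pathToJsonFile -Raw) | ConvertFrom-Json
$rows   = if ($parsed.PSObject.Properties.Name -contains 'results') { $parsed.results } else { $parsed }

$rows | ConvertTo-Csv -NoTypeInformation | Set-Content $pathToOutputFile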
Hello Marius,
I ran the same script you provided with the service principal client ID, client secret and tenant ID,
but it still gives the following error:
Get-PowerBIActivityEvent : Login first with Login-PowerBIServiceAccount
At line:20 char:1
+ Get-PowerBIActivityEvent -StartDateTime $StartDate -EndDateTime $EndD ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Get-PowerBIActivityEvent], Exception
+ FullyQualifiedErrorId : System.Exception,Microsoft.PowerBI.Commands.Admin.GetPowerBIActivityEvent
Could you please help me with this?
Dear @mariussve1,
Could you please let me know how to filter a particular workspace using a PowerShell script?
Regards,
Satya
Hi @mariussve1
This is a very thorough response, kudos!
Quick question: how do you store secrets so they are not in plain text in the script? Is there a way to obtain them from Azure Key Vault from PowerShell?
Also, do you currently push data to SQL through PowerShell, or do you save the files and ingest them with SSIS or ADF?
Thanks!
Hi again @Satyasai ,
Yes, you can store the secret in Azure Key Vault (AKV):
$secret = Get-AzKeyVaultSecret -VaultName "<your-unique-keyvault-name>" -Name "ExamplePassword" -AsPlainText
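Hooking that into the connection from the earlier script might look like this; the vault and secret names are placeholders, and it assumes the Az.KeyVault module plus an authenticated Az context (Connect-AzAccount or a managed identity):

# Sketch only: vault name, secret name, $AppId and $TenantId are placeholders.
$Secret   = Get-AzKeyVaultSecret -VaultName "<your-unique-keyvault-name>" -Name "PowerBIAppSecret" -AsPlainText
$password = ConvertTo-SecureString $Secret -AsPlainText -Force
$Creds    = New-Object PSCredential $AppId, $password
Connect-PowerBIServiceAccount -ServicePrincipal -Credential $Creds -Tenant $TenantId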
We are currently using ADF, which has a Copy Data activity. It can receive JSON-formatted responses and store them as rows in an Azure SQL database.
But you can of course use a PowerShell script to save the files, and then use ADF, SSIS or even Power Automate to load the files into a SQL database table.
Br
Marius
The link I provided has the PowerShell script you will need. It loops through as many days of activity as you want to get. The cmdlet/API can only return a single day at a time. If you want a simple script to get a single day's worth of activity, you can use the following.
Connect-PowerBIServiceAccount

$logDate   = '2022-07-05'
$startDate = $logDate + 'T00:00:00.000'
$endDate   = $logDate + 'T23:59:59.999'
$outFile   = "D:\Power BI Event Log\PowerBIAuditLog" + $logDate + ".json"

Get-PowerBIActivityEvent -StartDateTime $startDate -EndDateTime $endDate | Out-File $outFile

Disconnect-PowerBIServiceAccount
This will prompt you for a login, but you can replace that with a PowerShell credential object to avoid the prompt. It saves the output to a JSON file. The user must be a Power BI admin, or you need to enable admin API calls in the tenant settings for other users.
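A minimal sketch of that credential variant (the account name and password are placeholders; note that -Credential with a user account does not work if MFA is enforced, in which case a service principal as described elsewhere in the thread is the way to go):

# Sketch only: a stored credential object avoids the interactive prompt.
$user     = "audit-reader@contoso.com"    # hypothetical admin account
$password = ConvertTo-SecureString "<password>" -AsPlainText -Force
$cred     = New-Object System.Management.Automation.PSCredential ($user, $password)
Connect-PowerBIServiceAccount -Credential $cred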
Hope this helps.
Hi @Satyasai
This can be done several ways, but I found PowerShell to be the easiest method: drop the daily event log to a JSON file that can be read into a Power BI report, or push it into SQL and then into Power BI.
Access the Power BI activity log - Power BI | Microsoft Docs
Hope this helps!
We are using this REST API:
https://docs.microsoft.com/en-us/rest/api/power-bi/admin/get-activity-events
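For reference, a hedged sketch of calling that endpoint directly from PowerShell, assuming you are already connected as in the scripts above; the response property names follow the linked documentation, and the paging details may need adjusting:

# Sketch: pull one day of activity events from the REST endpoint and follow the continuation URI.
$headers = Get-PowerBIAccessToken    # Authorization header from the current session
$day     = (Get-Date).Date.AddDays(-1)
$start   = $day.ToString("yyyy-MM-ddTHH:mm:ss")
$end     = $day.AddDays(1).AddSeconds(-1).ToString("yyyy-MM-ddTHH:mm:ss")

$url    = "https://api.powerbi.com/v1.0/myorg/admin/activityevents?startDateTime='$start'&endDateTime='$end'"
$events = @()
do {
    $response = Invoke-RestMethod -Uri $url -Headers $headers -Method Get
    $events  += $response.activityEventEntities
    $url      = $response.continuationUri    # null/empty on the last page
} while ($url)

Write-Host "$($events.Count) events for $($day.ToString('yyyy-MM-dd'))"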
You can make a flow in Power Automate or Azure Data Factory and extract the data on a daily basis.
We push the data into our SQL database and have a Power BI report built on this table.
It works really nicely.
Br
Marius
Hi Marius,
I'm curious how you are updating your SQL table with this API result set. We've been appending to a CSV file, but now we're thinking of switching to a SQL Server table. I have the script to get activities based on when we last updated, but I need to update the SQL table somehow instead of appending to the CSV. Please let me know.
for ($a = 0; $a -lt $days_since; $a++) {
    $dt = "{0:yyyy-MM-dd}" -f (Get-Date).AddDays(-$days_since + $a)
    #$activities = Get-PowerBIActivityEvent -StartDateTime '2021-12-13T00:00:00' -EndDateTime '2021-12-13T23:59:59' -ActivityType 'ShareReport' | ConvertFrom-Json
    $st = $dt + "T00:00:00"
    $et = $dt + "T23:59:59"

    $activities = Get-PowerBIActivityEvent -StartDateTime $st -EndDateTime $et -ActivityType 'ShareReport' | ConvertFrom-Json
    $activities | Select-Object UserId, ItemName, CapacityName, WorkspaceName, SharingAction, SharingInformation |
        Where-Object { ($_.WorkspaceName -like 'PersonalWorkspace*') -or ($_.WorkspaceName -like '*[Production]') }
    #$activities

    $rowMax = $activities.Count
    Write-Host "$($dt) has $($rowMax) rows" -ForegroundColor Yellow

    for ($i = 0; $i -lt $rowMax; $i++) {
        if ($rowMax -gt 0) {
            Write-Host "$($activities[$i].UserId) shared : $($activities[$i].ItemName) via $($activities[$i].SharingAction) from $($activities[$i].WorkspaceName) workspace on $($activities[$i].CreationTime)"
            #$activities
            #"$($activities[$i].CreationTime.Replace('T',' ').Replace('Z','')),$($activities[$i].UserId),$($activities[$i].ItemName),$($activities[$i].CapacityName),$($activities[$i].WorkSpaceName),$($activities[$i].SharingAction),$(($activities[$i].SharingInformation) | % {$_.RecipientEmail + $_.ObjectId +'[' + $_.ResharePermission +']'})" | Out-File -FilePath $output_file -Append -Encoding ASCII
            "$($today),$($activities[$i].CreationTime),$($activities[$i].UserId),$($activities[$i].ItemName),$($activities[$i].CapacityName),$($activities[$i].WorkSpaceName),$($activities[$i].SharingAction),$(($activities[$i].SharingInformation) | % {$_.RecipientEmail + $_.ObjectId + '[' + $_.ResharePermission + ']'})" | Out-File -FilePath $output_file -Append -Encoding ASCII
        }
    }
}
I take the full JSON response and pass it to a stored procedure as a varchar(max), then merge the data into tables in SQL. If you overlap times a bit, it's OK because each event has a unique Id you can use to skip duplicates. Using the JSON functions in SQL, it is fairly easy to extract what you want.
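A hedged sketch of that hand-off from the PowerShell side; the stored procedure name, parameter and connection string are hypothetical, and the actual merge logic (for example OPENJSON plus a MERGE keyed on the unique Id) lives inside the procedure:

# Sketch only: procedure name, parameter and connection string are hypothetical.
$json = Get-PowerBIActivityEvent -StartDateTime $startDate -EndDateTime $endDate -ResultType JsonString

$conn = New-Object System.Data.SqlClient.SqlConnection("Server=serverhostname;Database=Audit_DB;Integrated Security=True")
$cmd  = $conn.CreateCommand()
$cmd.CommandType = [System.Data.CommandType]::StoredProcedure
$cmd.CommandText = "dbo.usp_MergePowerBIActivity"              # hypothetical stored procedure
$null = $cmd.Parameters.AddWithValue("@ActivityJson", $json)   # merged server-side via OPENJSON

$conn.Open()
$null = $cmd.ExecuteNonQuery()
$conn.Close()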