Hi, does anyone know of a way to run Power BI analytics on the audit log automatically?
Currently I have to export the CSV file manually and import it into Power BI to run analytics on it.
Would appreciate any advice.
I have a SQL job that runs this PowerShell daily. I'm no PowerShell expert; I basically put together some pieces from Adam Saxton (https://guyinacube.com/about/) and Microsoft, so I'm sure it could be cleaned up. I then bring the files into Power BI...
#Set-ExecutionPolicy Unrestricted -Scope CurrentUser -Force

# Option 1 - This can be used to be prompted for credentials
#$UserCredential = Get-Credential
#Connect-MsolService -Credential $UserCredential

# Option 2 - If you really want to automate the script, you will
# want to hard code the credentials to log into Azure AD.
$User = "email@domain.com"
$PWord = ConvertTo-SecureString -String "SuperSecretPassword" -AsPlainText -Force
$UserCredential = New-Object -TypeName "System.Management.Automation.PSCredential" -ArgumentList $User, $PWord
Connect-MsolService -Credential $UserCredential

# Create the session to Exchange Online
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $UserCredential -Authentication Basic -AllowRedirection

# Import the Exchange Online commands
Import-PSSession $Session

# Controls number of days to retrieve from audit log; customize as required
$Number_Days = 1

# Date control variables
$End_Date = (Get-Date).ToString("yyyy-MM-dd")
$Start_Date = (Get-Date).AddDays(-$Number_Days).ToString("yyyy-MM-dd")
$File_Date = (Get-Date).ToString('yyyy-MM-dd_hh_mm_ss')

# Output folder and file; customize as required
$Output_Folder = "d:\DataImports\PowerBIAuditLogs"
$Csv_File = "$Output_Folder\Audit_Log-$File_Date.csv"

# Check for output folder; create if required
#IF (!(Test-Path $Output_Folder -PathType Container)) {
#    New-Item -ItemType Directory -Force -Path $Output_Folder
#}

# Get Audit Log data for Power BI activities
$Result = Search-UnifiedAuditLog -StartDate $Start_Date -EndDate $End_Date -RecordType PowerBI -ResultSize 5000 | Export-Csv $Csv_File

# Site and document library name; customize as required; no sub-folders
#$SPO_Site = "Your SharePoint Online site URL here"
#$SPO_Folder = "Your Document Library name here"

# Connect to SharePoint / OneDrive for Business
#Connect-PnPOnline -Url $SPO_Site -Credential $UserCredential

# Upload csv file
#Add-PnPFile -Path $Csv_File -Folder $SPO_Folder

Exit
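One caveat with the script above: `Search-UnifiedAuditLog` returns at most 5,000 rows per call, so on a busy tenant a single day's export can be silently truncated. As far as I know, the cmdlet's paging parameters can be used to loop until the log is exhausted; a rough sketch reusing the script's own variables:

```powershell
# Paging sketch: loop with a session ID so more than 5,000 rows can be pulled.
# Assumes the same Exchange Online session and variables as the script above.
$All = @()
$SessionId = [guid]::NewGuid().ToString()
do {
    $Batch = Search-UnifiedAuditLog -StartDate $Start_Date -EndDate $End_Date `
                                    -RecordType PowerBI -ResultSize 5000 `
                                    -SessionId $SessionId `
                                    -SessionCommand ReturnLargeSet
    $All += $Batch
} while ($Batch)   # the server returns an empty batch when nothing is left

$All | Export-Csv $Csv_File -NoTypeInformation
```

Note that `ReturnLargeSet` can return rows out of order, so sort on `CreationDate` afterwards if ordering matters.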
Hi Caseycook,
The script you shared logs in using a personal account.
Is there any way we can connect using an Azure service principal instead?
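Not with `Search-UnifiedAuditLog` itself, as far as I know, since Exchange Online audit search needs a user sign-in. But the newer Power BI activity log, exposed through the MicrosoftPowerBIMgmt module, does support service principals, provided the "Allow service principals to use read-only Power BI admin APIs" tenant setting is enabled. A rough sketch (the tenant ID, app ID, and secret below are placeholders):

```powershell
# Requires: Install-Module MicrosoftPowerBIMgmt
# Tenant ID, app ID, and secret are placeholders; store the real
# secret somewhere safer than plain text (e.g. Azure Key Vault).
$TenantId  = "00000000-0000-0000-0000-000000000000"
$AppId     = "00000000-0000-0000-0000-000000000000"
$AppSecret = ConvertTo-SecureString "app-secret-here" -AsPlainText -Force
$Credential = New-Object System.Management.Automation.PSCredential($AppId, $AppSecret)

Connect-PowerBIServiceAccount -ServicePrincipal -Credential $Credential -Tenant $TenantId

# Activity events are retrieved one UTC day at a time
$Start = (Get-Date).Date.AddDays(-1).ToString("yyyy-MM-ddTHH:mm:ss")
$End   = (Get-Date).Date.AddSeconds(-1).ToString("yyyy-MM-ddTHH:mm:ss")

Get-PowerBIActivityEvent -StartDateTime $Start -EndDateTime $End |
    Out-File "d:\DataImports\PowerBIAuditLogs\Activity_Log.json"
```

The output is JSON rather than CSV, but Power BI can consume that directly.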
Hi, Thanks for your detailed explanation.
Do you know if there is information about favorited reports/dashboards in this audit log?
I would like to know which reports/dashboards users have marked as favorites.
Hi @Anonymous as far as I am aware this is not currently possible.
I would like to access the audit log details related to Power BI services (dataset details, count of datasets, number of users, active user details, etc.) using PowerShell scripts. While exploring, I found that the way to achieve this is through Exchange Online, but I don't have privileges to execute Exchange Online cmdlets. Can you please help me: is there any alternate way to achieve this?
By default, Audit Logs require Tenant Admin privileges.
But there is a work around we use.
In the Exchange Admin Center, you can create an admin role group called "Audit Log Viewers" or something similar.
For the role group's roles, assign the "View-Only Audit Logs" role.
Then add the role group to the users who should get access. By doing this, a non-Tenant Admin can access the audit logs.
Of course there are various security concerns to keep in mind, but it is a technical solution to get around the limitation of needing Tenant Admin permissions to see the audit logs.
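If you prefer scripting it, the same role group can be created from an Exchange Online PowerShell session; the group name and member addresses below are just examples:

```powershell
# Run from an Exchange Online PowerShell session (e.g. after
# Connect-ExchangeOnline from the ExchangeOnlineManagement module).
# Group name and member addresses are examples; substitute your own.

New-RoleGroup -Name "Audit Log Viewers" `
              -Roles "View-Only Audit Logs" `
              -Members "analyst@domain.com"

# Add further members later as needed
Add-RoleGroupMember -Identity "Audit Log Viewers" -Member "analyst2@domain.com"
```

Someone with Tenant Admin (or role management) rights still has to run this once; after that, the group members can read the audit log themselves.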
Thank you for your quick reply. Would you possibly be able to point me to a reference document that lets me know what is available inside the dataset? Or if you know, is there some powershell command I can use to iterate through all of it so that I can then go back and select the items I am interested in?
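In case it helps, one way to see what fields are actually available, short of a reference document, is to pull a sample of records and list the distinct property names inside the `AuditData` JSON payload. A sketch, assuming an Exchange Online session where `Search-UnifiedAuditLog` is available:

```powershell
# Pull a small sample of Power BI audit records from the last day
$Records = Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-1) `
                                  -EndDate (Get-Date) `
                                  -RecordType PowerBI -ResultSize 100

# AuditData is a JSON string on each record; expand it and list
# every distinct property name seen across the sample
$Records |
    ForEach-Object { $_.AuditData | ConvertFrom-Json } |
    ForEach-Object { $_.PSObject.Properties.Name } |
    Sort-Object -Unique
```

Different activity types carry different fields, so the bigger and more varied the sample, the more complete the list.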
I too use this solution, but I have come across a few bits of information it appears to be lacking.
Specifically, I am trying to access some of the items available in the "Report Usage Metrics" feature, such as "Distribution Method", "Platforms", and "Report page". Has anyone found a way to access this information programmatically?
@jbnawrocki did you ever get an answer on this? It seems it's still not possible to extract Report page information from the audit logs. And with the Usage Metrics currently unreliable I'm looking for alternatives. Thanks.
Hi @GilbertQ
Thank you for your reply and solution.
At the moment, is that the only solution?
Is there any simpler or more convenient way than running PowerShell and dumping files into a shared folder?
Regards
BDA2
Works like a charm! Does anyone know how to store the output of the script in an Azure Storage account? It would be cool to schedule the PowerShell script with an Azure Automation runbook and store the files in a storage account.
regards,
Stefan
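For the storage part, uploading the CSV from the same script can be sketched with the Az.Storage module; the account name, key, and container below are placeholders:

```powershell
# Requires: Install-Module Az.Storage
# Storage account name, key, and container are placeholders.
$StorageAccount = "mystorageaccount"
$StorageKey     = "storage-account-key-here"
$Container      = "auditlogs"

$Context = New-AzStorageContext -StorageAccountName $StorageAccount `
                                -StorageAccountKey $StorageKey

# Upload the CSV produced earlier in the script ($Csv_File)
Set-AzStorageBlobContent -File $Csv_File `
                         -Container $Container `
                         -Blob (Split-Path $Csv_File -Leaf) `
                         -Context $Context
```

In an Azure Automation runbook you would typically replace the account key with the runbook's managed identity (`Connect-AzAccount -Identity` and a context created with `-UseConnectedAccount`), so no secret sits in the script.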
@Stefkus Did you ever figure out how to do this? I have managed to get the PowerShell script working with Task Scheduler, but I think using an Azure Automation runbook would be a much nicer solution.