Hi,
My organisation desperately needs some oversight of Power BI usage metrics. I've managed to get a PowerShell script working with Task Scheduler to export the daily usage to CSV, as per the solution on this blog: http://angryanalyticsblog.azurewebsites.net/index.php/2018/02/16/power-bi-audit-log-analytics-soluti...
Due to export limits (fewer than 5,000 rows per query and only about 90 days of retention), we will need to build this up over time with the daily export.
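For reference, the heart of that daily export looks roughly like the sketch below. This is not the blog's exact script, just a simplified outline assuming the ExchangeOnlineManagement module, an account with audit-log read access, and a placeholder output folder:

```powershell
# Simplified daily export sketch (not the blog's exact script).
# Assumes the ExchangeOnlineManagement module and audit-log read rights;
# the account name and output folder are placeholders.
Import-Module ExchangeOnlineManagement
Connect-ExchangeOnline -UserPrincipalName admin@contoso.com

$day = (Get-Date).Date.AddDays(-1)   # export yesterday's activity

# Search-UnifiedAuditLog caps results at 5,000 rows per call and only
# retains ~90 days of history, hence the scheduled daily export.
$records = Search-UnifiedAuditLog -StartDate $day -EndDate $day.AddDays(1) `
    -RecordType PowerBIAudit -ResultSize 5000

$records |
    Select-Object CreationDate, UserIds, Operations, AuditData |
    Export-Csv -Path ("C:\PBIAudit\PowerBIAudit_{0:yyyyMMdd}.csv" -f $day) -NoTypeInformation

Disconnect-ExchangeOnline -Confirm:$false
```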
We also have a SQL Server data warehouse hosted on Azure, so I was wondering if anyone had successfully used Azure Automation runbooks to load these CSVs into the warehouse, as that feels like a more robust solution. Any ideas or suggestions? I've searched the PBI community on audit logs and it seems no one has mentioned this approach.
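To make the idea concrete, here is the rough shape of the runbook I have in mind. Everything in it is a placeholder sketch (the storage account, container, server, table, and credential-asset names are invented), assuming the daily CSVs get dropped into a blob container and the runbook's managed identity can read it:

```powershell
# Rough runbook sketch: pick up each day's CSV from blob storage and
# bulk-insert it into a staging table in the Azure SQL data warehouse.
# All names (storage account, container, server, table, credential
# asset) are placeholders.
Connect-AzAccount -Identity | Out-Null   # runbook's managed identity

$cred = Get-AutomationPSCredential -Name 'SqlLoaderCredential'
$ctx  = New-AzStorageContext -StorageAccountName 'pbiauditlogs' -UseConnectedAccount

$connString = "Server=tcp:yourserver.database.windows.net,1433;Database=YourDW;" +
              "User ID=$($cred.UserName);Password=$($cred.GetNetworkCredential().Password);Encrypt=True"

foreach ($blob in Get-AzStorageBlob -Container 'incoming' -Context $ctx) {
    $local = Join-Path $env:TEMP $blob.Name
    Get-AzStorageBlobContent -Blob $blob.Name -Container 'incoming' `
        -Destination $local -Context $ctx -Force | Out-Null

    # Load the CSV into an in-memory DataTable for SqlBulkCopy.
    $rows  = Import-Csv $local
    $table = New-Object System.Data.DataTable
    $rows[0].PSObject.Properties.Name | ForEach-Object { [void]$table.Columns.Add($_) }
    foreach ($r in $rows) { [void]$table.Rows.Add(@($r.PSObject.Properties.Value)) }

    $bulk = New-Object System.Data.SqlClient.SqlBulkCopy($connString)
    $bulk.DestinationTableName = 'dbo.PowerBIAuditStaging'
    $bulk.WriteToServer($table)
    $bulk.Close()
}
```

The thinking behind a staging table is to keep the runbook dumb: dedupe and merge into the proper warehouse tables can then happen in T-SQL on the database side.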
Cheers
Since these are CSV files, it is more appropriate to push them into Dataflows. Putting them into a database just adds more complexity.
So do you mean storing the CSVs on a shared drive somewhere and connecting to that drive via a dataflow?
Yes, pretty much. Or, if you don't care about history, you can do the API calls directly in the dataflow's Power Query (rather than with PowerShell).
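For anyone landing here later: the request a dataflow would make is presumably against the Activity Events admin endpoint. Keeping to PowerShell for illustration, the equivalent call via the MicrosoftPowerBIMgmt cmdlets is sketched below (the dates are illustrative; in Power Query you would issue the same GET with Web.Contents and an organizational account):

```powershell
# Sketch of the same REST call a dataflow query would make.
# Assumes the MicrosoftPowerBIMgmt module and a Power BI admin account;
# the dates are illustrative. The endpoint spans one UTC day per call.
Import-Module MicrosoftPowerBIMgmt
Connect-PowerBIServiceAccount

$url = "admin/activityevents" +
       "?startDateTime='2024-01-01T00:00:00'" +
       "&endDateTime='2024-01-01T23:59:59'"

# Returns a JSON payload of activity events for that day.
# (Busy days are paged; follow continuationUri in the response.)
Invoke-PowerBIRestMethod -Url $url -Method Get |
    ConvertFrom-Json |
    Select-Object -ExpandProperty activityEventEntities
```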