Hello,
Has anyone managed to set up the Power BI Scanner API using PySpark in a Fabric notebook?
Thanks in advance,
Charline
What are you ultimately trying to achieve?
Hi @lbendlin,
My aim is to collect the information reported by the Scanner API every day in order to monitor Power BI activities.
I'm already using the Scanner API with a Data Pipeline and Dataflow Gen2, but the Dataflow Gen2 takes too long to refresh, so I want to test a Fabric notebook instead, although I'm not very comfortable with Python.
In the notebook, I'd like to call the various Scanner API endpoints and then land the results as Delta tables.
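A minimal sketch of that flow from a notebook might look like the following. It assumes you already have an admin-scoped access token (e.g. for a service principal obtained via MSAL) and a list of workspace IDs to scan; the token, workspace IDs, and the `scanner_raw` table name are placeholders, not values from this thread.

```python
import json
import time
import requests

# Power BI admin Scanner API base endpoint.
BASE = "https://api.powerbi.com/v1.0/myorg/admin/workspaces"

def scan_workspaces(token, workspace_ids, poll_seconds=5):
    """Run one Scanner API cycle: getInfo -> scanStatus -> scanResult."""
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    # 1. Trigger a scan for the chosen workspaces.
    resp = requests.post(
        f"{BASE}/getInfo?datasetSchema=True&datasetExpressions=True",
        headers=headers,
        json={"workspaces": workspace_ids},
    )
    resp.raise_for_status()
    scan_id = resp.json()["id"]
    # 2. Poll the scan status until it finishes.
    while True:
        status = requests.get(
            f"{BASE}/scanStatus/{scan_id}", headers=headers
        ).json()
        if status["status"] in ("Succeeded", "Failed"):
            break
        time.sleep(poll_seconds)
    # 3. Fetch the scan result as JSON.
    return requests.get(f"{BASE}/scanResult/{scan_id}", headers=headers).json()

# In a Fabric notebook, you could then land the JSON as a Delta table
# (spark is the built-in session; "scanner_raw" is a hypothetical table name):
# result = scan_workspaces(token, workspace_ids)
# df = spark.read.json(sc.parallelize([json.dumps(result)]))
# df.write.format("delta").mode("overwrite").saveAsTable("scanner_raw")
```

Note that `getInfo` accepts up to a limited batch of workspace IDs per call, so a full-tenant daily scan would loop over batches (typically driven by `modifiedWorkspaces` to scan only what changed).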
The usual process is to run the Scanner API calls from PowerShell and store the resulting JSON files somewhere they can be ingested from. There's not really a point in doing this from within Power BI/Fabric.
I think it makes perfect sense to do this from within Fabric. Why not centralize the whole workflow in Fabric, rather than collecting the data with one technology (PowerShell) and processing it in another (Fabric)?