Hi, does anyone know if there is a way to use the Python semantic link package to extract report usage from all reports that are in a workspace, or connected to a certain shared dataset, in the Power BI Service?
Or is there any other good alternative to do this easily?
Thank you both. I will try and see what works.
One thing I've been doing is using the Fabric Capacity Metrics App, which installs a semantic model (and a report) containing the usage metrics for the previous 14(?) days.
https://learn.microsoft.com/en-us/fabric/enterprise/metrics-app-install?tabs=1st
You can then connect to that semantic model with semantic_link_labs (aka sempy) and extract data from its tables:
df = spark.createDataFrame(fabric.read_table(dataset = datasetID, table = table, workspace = workspaceID))
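A slightly fuller sketch of the same idea (the workspace ID, dataset name and table name below are placeholders, not the Metrics App's real names):

import sempy.fabric as fabric

workspaceID = "<metrics-app-workspace-id>"       # placeholder
datasetID = "<capacity-metrics-semantic-model>"  # placeholder

# See which tables the semantic model exposes before picking one to read
print(fabric.list_tables(dataset=datasetID, workspace=workspaceID))

# Read one table into pandas, then convert to Spark (e.g. to write to a lakehouse)
pdf = fabric.read_table(dataset=datasetID, table="<table-name>", workspace=workspaceID)
df = spark.createDataFrame(pdf)  # 'spark' is the notebook's built-in SparkSession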
Hi spencer_sa,
I got the data from the tables you advised and loaded it into my lakehouse.
The next step is to understand how I can get the report usage from that data. Should these be specific items? I guess it would only work if the reports are also in a workspace on a Fabric capacity?
Ahh, I see you're after getting the report usage.
I've succeeded in doing this with the following:
import sempy.fabric as fabric
import json

# Query the admin Activity Events API for report views in a given time window
method = 'GET'
url = 'https://api.powerbi.com/v1.0/myorg/admin/activityevents'
# The API expects the datetime values to be wrapped in single quotes
start_time = "'2024-12-11T00:55:00.000Z'"
end_time = "'2024-12-11T12:00:00.000Z'"
activity_filter = "$filter=Activity eq 'viewreport'"
query = '&'.join(['startDateTime=' + start_time, 'endDateTime=' + end_time, activity_filter])
full_url = url + '?' + query

# PowerBIRestClient authenticates as the user running the notebook
client = fabric.PowerBIRestClient()
response = client.request(method, full_url)
print(json.dumps(json.loads(response.content), indent=2))
@SnoekLaurens, I don't think the semantic model that powers the Report Usage metrics is directly accessible right now. Another way I can think of is the REST API Get Activity Events (it requires the admin role, though). It is not as detailed as the Report Usage metrics; for example, getting page-level information is not possible.
But calling a Power BI REST API is much easier in a Spark notebook inside Fabric, as you can get the token using mssparkutils.
import requests
from pyspark.sql import SparkSession
import json
from notebookutils import mssparkutils as msu

def pbi_rest_api(method, uri, payload=None):
    """
    Makes a REST API call to Power BI. The token is automatically generated from
    mssparkutils based on the user id running the notebook.
    Args:
        method (str): The HTTP method ('GET' or 'POST').
        uri (str): The later part of the REST API URL.
        payload (dict): The payload, if it is a POST request.
    Returns:
        Response object: The response from the REST API.
    """
    endpoint = "https://api.powerbi.com/v1.0/myorg"
    url = f"{endpoint}/{uri}"
    headers = {
        "Authorization": "Bearer " + msu.credentials.getToken("pbi"),
        "Content-Type": "application/json"
    }
    session = requests.Session()
    try:
        response = session.request(method, url, headers=headers, json=payload, timeout=120)
        response.raise_for_status()
        return response.json()
    except Exception as e:
        print(f"Exception raised {e}")
What do I need to add to get the results shown correctly in a dataframe?
My results don't seem to be the report views; it only shows me the continuationToken.
Already found it 🙂
This works for me:
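Roughly: keep following continuationUri until the service reports the last result set, then build a dataframe from the collected activityEventEntities. A minimal sketch of that idea (field and variable names are assumptions based on the API response, not necessarily the exact code used):

import json
import pandas as pd
import sempy.fabric as fabric

client = fabric.PowerBIRestClient()
url = ("https://api.powerbi.com/v1.0/myorg/admin/activityevents"
       "?startDateTime='2024-12-11T00:55:00.000Z'"
       "&endDateTime='2024-12-11T12:00:00.000Z'"
       "&$filter=Activity eq 'viewreport'")

events = []
while url:
    body = json.loads(client.request('GET', url).content)
    events.extend(body.get('activityEventEntities', []))
    # The API pages its results: follow continuationUri until lastResultSet is true
    url = None if body.get('lastResultSet') else body.get('continuationUri')

pdf = pd.DataFrame(events)       # one row per ViewReport event
df = spark.createDataFrame(pdf)  # Spark dataframe, e.g. for writing to a lakehouse
display(df)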