Hi,
I have a Fabric subscription and am a Global Admin.
I am trying to get a list of activity events via the API 'https://api.powerbi.com/v1.0/myorg/admin/activityevents?endDateTime='2024-09-03T00:00:00'&startDateT...'', but it returns 400 Bad Request as below, even though there doesn't seem to be any syntax error or missing permission.
I've also tried with a service principal that has all permissions, including Tenant.ReadWrite.All.
Has anyone had the same experience or found a solution for this?
P.S. If this is not the right category to post in, my bad 😥
Thanks
The Entra/AAD application for the service principal shouldn't require any delegated permissions in Azure; all the necessary permissions are granted in the Fabric admin portal (the tenant setting that allows service principals to use read-only admin APIs).
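One other thing to check before changing permissions: the Activity Events API requires startDateTime and endDateTime to fall within the same UTC day, with both values wrapped in single quotes, and a window that crosses a day boundary is a common cause of a 400 Bad Request. A minimal sketch of a valid request URL (the date is just an example):

# A valid single-day window: both timestamps on the same UTC day, single-quoted
url = ("https://api.powerbi.com/v1.0/myorg/admin/activityevents"
       "?startDateTime='2024-09-03T00:00:00'"
       "&endDateTime='2024-09-03T23:59:59'")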
Since you are in Fabric, consider pulling data from the Activity Events API in a notebook. Here's a code sample that works in my environment (run in a Fabric notebook with a default lakehouse attached):
# auth
# Imports for the whole sample
import logging
import os
from datetime import date, timedelta

import msal
import pandas as pd
import requests
from pyspark.sql.functions import col, year, month, dayofmonth

config = {
    "authority": "https://login.microsoftonline.com/{YOUR_TENANT_ID_HERE}",
    "client_id": "{YOUR_CLIENT_ID_HERE}",
    "scope": ["https://analysis.windows.net/powerbi/api/.default"],
    "secret": "{YOUR_CLIENT_SECRET_HERE}",
    "endpoint": "https://api.powerbi.com/v1.0/myorg/admin/activityevents",
}

app = msal.ConfidentialClientApplication(
    config["client_id"],
    authority=config["authority"],
    client_credential=config["secret"],
)

# Try the token cache first, then fall back to the client-credentials flow
result = app.acquire_token_silent(config["scope"], account=None)
if not result:
    logging.info("No suitable token exists in cache. Get a new one from AAD.")
    result = app.acquire_token_for_client(scopes=config["scope"])

if "access_token" in result:
    logging.info("Got the Token")
    print("Got the Token")
else:
    print(result.get("error"))
    print(result.get("error_description"))
    print(result.get("correlation_id"))
# get and save events, one UTC day per request
start_date = date(2024, 9, 1)
end_date = date(2024, 9, 2)
delta = timedelta(days=1)

while start_date <= end_date:
    activityDate = start_date.strftime("%Y-%m-%d")
    # Start and end must fall on the same UTC day, single-quoted
    url = (
        config["endpoint"]
        + "?startDateTime='" + activityDate + "T00:00:00'"
        + "&endDateTime='" + activityDate + "T23:59:59'"
    )
    access_token = result["access_token"]
    header = {"Content-Type": "application/json", "Authorization": f"Bearer {access_token}"}

    # First page of results for the day
    response = requests.get(url=url, headers=header)
    response_obj = response.json()
    activity_events = response_obj["activityEventEntities"]
    continuation_uri = response_obj["continuationUri"]
    continuation_token = response_obj["continuationToken"]

    # Follow the continuation URI until the token comes back null
    cont_count = 1
    while continuation_token is not None:
        response = requests.get(continuation_uri, headers=header)
        response_obj = response.json()
        activity_events.extend(response_obj["activityEventEntities"])
        continuation_uri = response_obj["continuationUri"]
        continuation_token = response_obj["continuationToken"]
        cont_count += 1
    print(f"Took {cont_count} tries to exhaust continuation token for {len(activity_events)} events.")

    # Cast object and float64 columns to strings so Spark can infer a stable schema
    df = pd.DataFrame(activity_events)
    object_cols = [c for c, c_type in df.dtypes.items() if c_type == "object"]
    df[object_cols] = df[object_cols].astype(str)
    float64_cols = [c for c, c_type in df.dtypes.items() if c_type == "float64"]
    df[float64_cols] = df[float64_cols].astype(str)
    sdf = spark.createDataFrame(df)  # `spark` is predefined in Fabric notebooks

    # Write to the default lakehouse, partitioned by the event's creation date
    if not os.path.exists("/lakehouse/default/Files/activity_events/in"):
        os.makedirs("/lakehouse/default/Files/activity_events/in")
    sdf = sdf.withColumn("Year", year(col("CreationTime")))
    sdf = sdf.withColumn("Month", month(col("CreationTime")))
    sdf = sdf.withColumn("Day", dayofmonth(col("CreationTime")))
    sdf.write.mode("append").option("mergeSchema", "true").format("parquet") \
        .partitionBy("Year", "Month", "Day").save("Files/activity_events/in")
    print(f"Output files for {activityDate}")

    start_date += delta
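Once a run finishes, the partitioned parquet output can be read back in a later cell; a minimal sketch, assuming the same default lakehouse is still attached:

# Read the partitioned output back and count events per day
events = spark.read.format("parquet").load("Files/activity_events/in")
events.groupBy("Year", "Month", "Day").count().orderBy("Year", "Month", "Day").show()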
It resolved the problem perfectly.
Thanks a lot!