Hi,
I have a Fabric subscription and am a Global Admin.
I am trying to get a list of activity events via the API 'https://api.powerbi.com/v1.0/myorg/admin/activityevents?endDateTime='2024-09-03T00:00:00'&startDateT...', but it returns 400 Bad Request, even though there seems to be no syntax error or missing permission.
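For context, here's roughly the shape of the request from Python (a sketch using the requests library; the token and the start date are illustrative placeholders, not my exact values):

import requests

# sketch of the call; token and start date are illustrative placeholders
token = "<ACCESS_TOKEN>"  # real token elided
url = ("https://api.powerbi.com/v1.0/myorg/admin/activityevents"
       "?startDateTime='2024-09-02T00:00:00'"  # placeholder start
       "&endDateTime='2024-09-03T00:00:00'")
response = requests.get(url, headers={"Authorization": f"Bearer {token}"})
print(response.status_code, response.text)  # 400 Bad Request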
I've also tried with a service principal that has all permissions, including Tenant.ReadWrite.All.
Has anyone had the same experience or found a solution for this?
P.S. If this is not the right category to post in, my bad 😥
Thanks
Solved!
The Entra/AAD application for the service principal shouldn't require any delegated permissions in Azure; all the necessary permissions are granted in the Fabric admin portal.
Since you are in Fabric, consider pulling the data from the Activity Events API in a notebook. Here's a code sample that works in my environment:
# imports (assumed available in a Fabric notebook environment)
import logging
import os

import msal
import pandas as pd
import requests
from datetime import date, timedelta
from pyspark.sql.functions import col, year, month, dayofmonth

# auth: acquire an app-only token for the Power BI REST API via MSAL
config = {
    "authority": "https://login.microsoftonline.com/{YOUR_TENANT_ID_HERE}",
    "client_id": "{YOUR_CLIENT_ID_HERE}",
    "scope": ["https://analysis.windows.net/powerbi/api/.default"],
    "secret": "{YOUR_CLIENT_SECRET_HERE}",
    "endpoint": "https://api.powerbi.com/v1.0/myorg/admin/activityevents"
}

app = msal.ConfidentialClientApplication(
    config["client_id"], authority=config["authority"],
    client_credential=config["secret"],
)

# try the token cache first, then fall back to a fresh client-credentials token
result = app.acquire_token_silent(config["scope"], account=None)
if not result:
    logging.info("No suitable token exists in cache. Get a new one from AAD.")
    result = app.acquire_token_for_client(scopes=config["scope"])

if "access_token" in result:
    logging.info("Got the Token")
    print("Got the Token")
else:
    print(result.get("error"))
    print(result.get("error_description"))
    print(result.get("correlation_id"))
# get and save events, one UTC day per request (the API requires
# startDateTime and endDateTime to fall within the same UTC day)
start_date = date(2024, 9, 1)
end_date = date(2024, 9, 2)
delta = timedelta(days=1)

while start_date <= end_date:
    activityDate = start_date.strftime("%Y-%m-%d")
    url = ("https://api.powerbi.com/v1.0/myorg/admin/activityevents"
           f"?startDateTime='{activityDate}T00:00:00'"
           f"&endDateTime='{activityDate}T23:59:59'")
    access_token = result['access_token']
    header = {'Content-Type': 'application/json', 'Authorization': f'Bearer {access_token}'}

    # first page of results
    response = requests.get(url=url, headers=header)
    response_obj = response.json()
    activity_events = response_obj["activityEventEntities"]
    continuation_uri = response_obj["continuationUri"]
    continuation_token = response_obj["continuationToken"]

    # follow the continuation token until every page has been fetched
    cont_count = 1
    while continuation_token is not None:
        response = requests.get(continuation_uri, headers=header)
        response_obj = response.json()
        activity_events.extend(response_obj["activityEventEntities"])
        continuation_uri = response_obj["continuationUri"]
        continuation_token = response_obj["continuationToken"]
        cont_count += 1
    print(f"Took {cont_count} tries to exhaust continuation token for {len(activity_events)} events.")

    # cast object and float64 columns to strings so Spark can infer a stable schema
    df = pd.DataFrame(activity_events)
    object_cols = [c for c, t in df.dtypes.items() if t == "object"]
    df[object_cols] = df[object_cols].astype(str)
    float64_cols = [c for c, t in df.dtypes.items() if t == "float64"]
    df[float64_cols] = df[float64_cols].astype(str)
    sdf = spark.createDataFrame(df)

    # write to the default lakehouse, partitioned by the event's creation date
    if not os.path.exists("/lakehouse/default/Files/activity_events/in"):
        os.makedirs("/lakehouse/default/Files/activity_events/in")
    sdf = sdf.withColumn('Year', year(col("CreationTime")))
    sdf = sdf.withColumn('Month', month(col("CreationTime")))
    sdf = sdf.withColumn('Day', dayofmonth(col("CreationTime")))
    sdf.write.mode("append").option("mergeSchema", "true").format("parquet").partitionBy("Year", "Month", "Day").save("Files/activity_events/in")
    print(f"Output files for {activityDate}")

    start_date += delta
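Once the loop finishes, the events can be read back from the lakehouse for analysis, for example (a sketch, assuming the same default lakehouse and save path as above):

# read the partitioned parquet output back as a Spark DataFrame
events = spark.read.parquet("Files/activity_events/in")
# the Year/Month/Day partition columns make date filtering cheap
events.filter((events.Year == 2024) & (events.Month == 9)).show(5)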
It resolved the problem perfectly. Thanks a lot!