jwryu
Advocate II

Getting a list of ActivityEvents in a Fabric environment via API

Hi, 

I have a Fabric subscription and am a Global Admin.

I am trying to get a list of ActivityEvents via the API 'https://api.powerbi.com/v1.0/myorg/admin/activityevents?endDateTime='2024-09-03T00:00:00'&startDateT...', but it returns 400 Bad Request as shown below, even though there appears to be no syntax error or missing permission.

[Screenshot: 400 Bad Request response from the activityevents endpoint]
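For reference, a minimal sketch of the kind of call I'm making (the date values and token below are placeholders, not my actual values):

import requests

# Placeholder bearer token for the Power BI REST API
access_token = "{ACCESS_TOKEN}"

# Per the API docs, startDateTime and endDateTime must be supplied together,
# wrapped in single quotes, and fall within the same UTC day
url = (
    "https://api.powerbi.com/v1.0/myorg/admin/activityevents"
    "?startDateTime='2024-09-03T00:00:00'"
    "&endDateTime='2024-09-03T23:59:59'"
)

response = requests.get(url, headers={"Authorization": f"Bearer {access_token}"})
print(response.status_code)  # this is where I see 400 Bad Request
print(response.text)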


I've also tried with a service principal that has all permissions, including Tenant.ReadWrite.All.

Has anyone had the same experience or found a solution for this?

 

P.S. If this is not the right category to post in, my bad 😥

 

Thanks

1 ACCEPTED SOLUTION
andrewb279
Advocate I

The Entra/AAD application for the service principal shouldn't require any delegated permissions in Azure; all the necessary permissions are granted in the Fabric admin portal. The link below might help:

 

https://learn.microsoft.com/en-us/power-bi/developer/embedded/embed-service-principal#step-3---enabl...

 

Since you are in Fabric, consider pulling data from the Activity Events API in a notebook. Here's a code sample that works in my environment:

 

# auth

import logging
import os
from datetime import date, timedelta

import msal
import pandas as pd
import requests
from pyspark.sql.functions import col, year, month, dayofmonth

config = {
    "authority": "https://login.microsoftonline.com/{YOUR_TENANT_ID_HERE}",
    "client_id": "{YOUR_CLIENT_ID_HERE}",
    "scope": ["https://analysis.windows.net/powerbi/api/.default"],
    "secret": "{YOUR_CLIENT_SECRET_HERE}",
    "endpoint": "https://api.powerbi.com/v1.0/myorg/admin/activityevents"
}

# Confidential client authenticating with the service principal's secret
app = msal.ConfidentialClientApplication(
    config["client_id"],
    authority=config["authority"],
    client_credential=config["secret"],
)

# Try the token cache first, then fall back to a fresh client-credentials token
result = app.acquire_token_silent(config["scope"], account=None)

if not result:
    logging.info("No suitable token exists in cache. Get a new one from AAD.")
    result = app.acquire_token_for_client(scopes=config["scope"])

if "access_token" in result:
    logging.info("Got the Token")
    print("Got the Token")
else:
    print(result.get("error"))
    print(result.get("error_description"))
    print(result.get("correlation_id"))
    raise RuntimeError("Token acquisition failed; stopping before the API calls below.")

# get and save events

start_date = date(2024, 9, 1)
end_date = date(2024, 9, 2)

delta = timedelta(days=1)
while start_date <= end_date:
    # The API only accepts date ranges within a single UTC day, so query one day at a time
    activityDate = start_date.strftime("%Y-%m-%d")

    url = (
        config["endpoint"]
        + "?startDateTime='" + activityDate + "T00:00:00'"
        + "&endDateTime='" + activityDate + "T23:59:59'"
    )

    access_token = result["access_token"]
    header = {"Content-Type": "application/json", "Authorization": f"Bearer {access_token}"}
    response = requests.get(url=url, headers=header)

    response_obj = response.json()
    event_entities = response_obj["activityEventEntities"]
    continuation_uri = response_obj["continuationUri"]
    continuation_token = response_obj["continuationToken"]
    activity_events = event_entities
    cont_count = 1

    # Follow the continuation URI until the token comes back as None
    while continuation_token is not None:
        response = requests.get(continuation_uri, headers=header)
        response_obj = response.json()
        event_entities = response_obj["activityEventEntities"]
        continuation_uri = response_obj["continuationUri"]
        continuation_token = response_obj["continuationToken"]

        activity_events.extend(event_entities)
        cont_count += 1

    print(f"Took {cont_count} tries to exhaust continuation token for {len(activity_events)} events.")

    # Cast object and float columns to strings so Spark can infer a stable schema
    df = pd.DataFrame(activity_events)
    object_cols = [c for c, col_type in df.dtypes.items() if col_type == "object"]
    df[object_cols] = df[object_cols].astype(str)

    float64_cols = [c for c, col_type in df.dtypes.items() if col_type == "float64"]
    df[float64_cols] = df[float64_cols].astype(str)

    # 'spark' is the session pre-defined in a Fabric notebook
    sdf = spark.createDataFrame(df)

    if not os.path.exists("/lakehouse/default/Files/activity_events/in"):
        os.makedirs("/lakehouse/default/Files/activity_events/in")

    # Partition the output by the event's creation date
    sdf = sdf.withColumn("Year", year(col("CreationTime")))
    sdf = sdf.withColumn("Month", month(col("CreationTime")))
    sdf = sdf.withColumn("Day", dayofmonth(col("CreationTime")))
    sdf.write.mode("append").option("mergeSchema", "true").format("parquet").partitionBy("Year", "Month", "Day").save("Files/activity_events/in")
    print(f"Output files for {activityDate}")
    start_date += delta
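
Once the loop completes, a quick way to sanity-check the output (a minimal sketch, assuming the same default lakehouse path used above) is to read the partitioned files back and count events per day:

# Read the partitioned parquet output back from the lakehouse Files area
events = spark.read.format("parquet").load("Files/activity_events/in")

# Event counts per partition give a quick view of daily activity volume
events.groupBy("Year", "Month", "Day").count().orderBy("Year", "Month", "Day").show()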

 


2 REPLIES

It resolved the problem perfectly.

 

Thanks a lot!

 
