Hi Community,
This is a follow-up to my previous post: Trigger Alerts Notification for successful/failed ... - Microsoft Fabric Community
I was able to log the status using the API and pass it to my Outlook activity. However, when I call the Query Activity Runs API (https://learn.microsoft.com/en-us/fabric/data-factory/pipeline-rest-api#query-activity-runs) to capture the current status while the pipeline is running, I get an error. When I run the notebook on its own, the status comes back as Inactive; when I call the same notebook from my pipeline, I get the error.
Please find the attached screenshot:
Notebook Run:
Pipeline Run:
Is there a way to capture the logs in real time? And once the pipeline succeeds, can I capture the pipeline status along with its child pipelines and send an email notification, something like below:
Any thoughts would be really appreciated!
Thanks in Advance!
Hrishi
Hi @Hrishi_K_M_2000,
Yes, it is possible to capture logs in real time and send an email notification once the pipeline succeeds, including details of the main pipeline and its child pipelines:
Steps to Capture Logs and Send Email Notifications:
You can use the Query Activity Runs API to track real-time execution status. If you are seeing Inactive for your notebook, it might not have started yet, or there might be an issue with the pipeline execution.
Try calling the API like this in your notebook or using a REST client:
import requests
import json

# Placeholders below must be replaced with your own values
base_url = "https://api.fabric.microsoft.com/datafactory/"
pipeline_name = "YourPipelineName"
resource_group = "YourResourceGroup"
subscription_id = "YourSubscriptionID"
factory_name = "YourFactoryName"

# Construct API URL
url = f"{base_url}subscriptions/{subscription_id}/resourceGroups/{resource_group}/providers/Microsoft.DataFactory/factories/{factory_name}/queryActivityRuns?api-version=2018-06-01"

headers = {
    "Authorization": "Bearer YOUR_ACCESS_TOKEN",
    "Content-Type": "application/json"
}

# Define request payload: filter to the pipeline, newest run first
payload = {
    "filter": f"PipelineName eq '{pipeline_name}'",
    "orderBy": "RunStart desc"
}

# Make API request
response = requests.post(url, headers=headers, data=json.dumps(payload))

# Print results
if response.status_code == 200:
    print("Activity Runs Data:", response.json())
else:
    print("Error:", response.text)
Automate Email Trigger: Use Webhooks or Azure Logic Apps to trigger an email when the pipeline completes.
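As a minimal sketch of the webhook option: a Logic App with a "When a HTTP request is received" trigger exposes a URL you can POST to from a notebook. The URL and payload fields below are hypothetical placeholders; your Logic App defines its own request schema.

```python
import json
import urllib.request

# Hypothetical placeholder: replace with the URL your Logic App trigger generates.
WEBHOOK_URL = "https://example.com/your-logic-app-trigger"

def build_notification(pipeline_name, status):
    # Body that the Logic App parses and turns into an email.
    return {"pipeline": pipeline_name, "status": status}

def notify(pipeline_name, status):
    # POST the JSON payload to the Logic App trigger URL.
    body = json.dumps(build_notification(pipeline_name, status)).encode("utf-8")
    req = urllib.request.Request(
        WEBHOOK_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return resp.status
```

You would call `notify(pipeline_name, "Succeeded")` from the last cell of the notebook, after the status check above confirms the run completed.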
Thanks @v-prasare ,
I also tried another workaround, since I don't want to automate using Logic Apps or webhooks. If you are looking for a Fabric-only solution, you can try the below:
I created my flow this way:
You need to create 1 pipeline and 2 notebooks:
Notebook 1 - calls another pipeline using the Fabric APIs.
Notebook 2 - captures the latest pipeline log.
Pipeline 1 - a pipeline with 3 activities: Wait > Notebook 2 > Outlook activity.
When the master pipeline's schedule fires, Notebook 1 runs and calls Pipeline 1. Pipeline 1 first waits (so the run status has time to reflect in the Monitor hub, which lags slightly), then Notebook 2 captures the log, and the Outlook activity sends the email notification.
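As a rough sketch of what Notebook 1 can do, the snippet below triggers a pipeline through the Fabric job scheduler's on-demand run endpoint. The workspace ID, pipeline item ID, and token acquisition are placeholders, and you should verify the endpoint shape against the current Fabric REST API docs for your tenant.

```python
import urllib.request

FABRIC_BASE = "https://api.fabric.microsoft.com/v1"

def run_pipeline_url(workspace_id, pipeline_item_id):
    # On-demand job endpoint for a pipeline item (Fabric job scheduler API).
    return (f"{FABRIC_BASE}/workspaces/{workspace_id}"
            f"/items/{pipeline_item_id}/jobs/instances?jobType=Pipeline")

def trigger_pipeline(workspace_id, pipeline_item_id, token):
    # POST with an empty body starts the run; the job instance URL is
    # typically returned in the Location response header.
    req = urllib.request.Request(
        run_pipeline_url(workspace_id, pipeline_item_id),
        data=b"",
        headers={"Authorization": f"Bearer {token}"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return resp.headers.get("Location")
```

Notebook 2 can then poll the run status (as in the earlier snippet) once the Wait activity has given the Monitor hub time to register the run.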
Something like this:
I get the notification something like this:
Thanks,
Hrishi