
ramankr48
Helper II

How to log the data which is triggering an alert in Data Activator

Logging and tracking of alarm events for post-analysis and system improvements

I'm working with Microsoft Fabric's Data Activator to monitor real-time device temperatures. My goal is to implement a solution that does the following whenever a temperature exceeds a predefined threshold:

  1. Trigger an Alert:
    Send an email notification to the relevant stakeholder when the threshold is crossed.

  2. Log Alarm Data:
    Write the event into two separate Lakehouse tables:

    • alarm_event_log: stores the actual telemetry data that caused the alarm.

    • alarm_metadata: stores metadata about the alarm (timestamp, severity etc.).

  3. Execute Logging via Notebook:
Use the "Action" section of the Data Activator rule to trigger a Fabric notebook that logs this data into the Lakehouse. (In the Action section I am using a Fabric item, which is a notebook.)

What I need help with:

  • How can I retrieve the data from Data Activator in the triggered notebook?
    Specifically, how do we access the relevant data (e.g., device ID, temperature reading, time of alert) that triggered the rule, and then pass or retrieve it within the notebook to log it properly?

  • Is there any way to retrieve the particular record from Data Activator that caused the alert? If so, how should it be implemented? I am planning to write a PySpark script for it, but I'm a little confused about the logic.
  • What are the best practices or recommended methods for passing parameters from Data Activator to a Fabric notebook?

  • Has anyone implemented a similar pattern?
    Any sample implementation or guidance on structuring this kind of workflow for reliability and scalability would be very helpful.

Thanks in advance for your support!

4 REPLIES
v-tejrama
Community Support

Hi @ramankr48 ,

Thanks for reaching out to the Microsoft Fabric community forum.

 

Set up Data Activator Rule

In Microsoft Fabric > Data Activator, create a rule with a condition:
e.g., temperature > 70

  • In the Action section:
    • Select Notebook as the Fabric item.
    • Add parameters to send with the alert. Example:
      {
        "deviceId": "{{deviceId}}",
        "temperature": "{{temperature}}",
        "timestamp": "{{timestamp}}",
        "severity": "High",
        "ruleName": "TempThreshold"
      }

In Your Notebook (PySpark)

You'll now receive these parameters via the notebook's parameters dictionary. Here's how to access them and log the event:

 

# Access parameters passed from Data Activator
device_id = parameters.get("deviceId")
temperature = float(parameters.get("temperature"))
event_time = parameters.get("timestamp")
severity = parameters.get("severity")
rule_name = parameters.get("ruleName")

 

Create DataFrames for Logging

 

alarm_event_log (raw event)

# In a Fabric notebook the SparkSession is already available as `spark`
from datetime import datetime

 

# One-row DataFrame holding the raw telemetry that fired the alert
event_log_df = spark.createDataFrame([{
    "device_id": device_id,
    "temperature": temperature,
    "event_time": event_time
}])

 

 

alarm_metadata (logging info)

# One-row DataFrame holding the alarm metadata for post-analysis
metadata_df = spark.createDataFrame([{
    "device_id": device_id,
    "event_time": event_time,
    "logged_time": datetime.now().isoformat(),
    "severity": severity,
    "rule_name": rule_name
}])

 

Write to Lakehouse Tables

Make sure the Lakehouse tables alarm_event_log and alarm_metadata already exist, or create them with the appropriate schema first (see the sketch after the append example below).

 

# Append to the Lakehouse tables
event_log_df.write.mode("append").saveAsTable("YourLakehouse.alarm_event_log")
metadata_df.write.mode("append").saveAsTable("YourLakehouse.alarm_metadata")
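
If the tables don't exist yet, one option is to create them up front with an explicit schema so the appends never fail on a missing table. A minimal sketch, assuming the notebook runs against the default attached Lakehouse (qualify the names, e.g. YourLakehouse.alarm_event_log, if you write to a specific one); the column types simply mirror the DataFrames built above:

# Create the log tables if they are missing (Delta is the default table format in Fabric)
spark.sql("""
    CREATE TABLE IF NOT EXISTS alarm_event_log (
        device_id   STRING,
        temperature DOUBLE,
        event_time  STRING
    )
""")

spark.sql("""
    CREATE TABLE IF NOT EXISTS alarm_metadata (
        device_id   STRING,
        event_time  STRING,
        logged_time STRING,
        severity    STRING,
        rule_name   STRING
    )
""")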

 

If the response has addressed your query, please Accept it as a solution and give a 'Kudos' so other members can easily find it.


Best Regards,
Tejaswi.
Community Support

ramankr48
Helper II

So when you choose a Fabric item in the Action part, there is an option for input fields.

(screenshots of the Action section's input fields)

 

So in the input fields, do I need to add this part

{
  "deviceId": "{{deviceId}}",
  "temperature": "{{temperature}}",
  "timestamp": "{{timestamp}}",
  "severity": "High",
  "ruleName": "TempThreshold"
}

 

in the same format, or will just the field names also work?


 

Hi @ramankr48 ,

Thanks for reaching out to the Microsoft Fabric community forum.

 

You don't need to add the full JSON structure in the Input Fields section.

 

Just type in the names of the fields you want to pass, like:

 

deviceId
temperature
timestamp
severity
ruleName

 

That's enough; Data Activator will automatically fill in the values from the event that triggered the alert.

 

Then in your notebook, you can access those values using the parameters.get("fieldName") method, like this:

 

device_id = parameters.get("deviceId")

 

So there is no need to write the whole JSON there; just the field names will work fine.
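
One defensive pattern worth adding (a sketch, assuming the values arrive in a parameters dictionary as shown above; the fallback values here are hypothetical and only for manual test runs): supply defaults via .get() so the notebook also runs when you execute it by hand rather than via an alert.

# Fall back to hypothetical test values when the notebook is run manually;
# Activator-supplied values take precedence when the rule fires.
params = parameters if "parameters" in globals() else {}

device_id   = params.get("deviceId", "test-device")
temperature = float(params.get("temperature", 0.0))
event_time  = params.get("timestamp", "1970-01-01T00:00:00")
severity    = params.get("severity", "Low")
rule_name   = params.get("ruleName", "ManualTest")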

 

If the response has addressed your query, please Accept it as a solution and give a 'Kudos' so other members can easily find it.


Best Regards,
Tejaswi.
Community Support
