Hi @ramankr48,
Thanks for reaching out to the Microsoft Fabric community forum.
Set up Data Activator Rule
In Microsoft Fabric > Data Activator, create a rule with a condition:
e.g., temperature > 70
- In the Action section:
- Select Notebook as the Fabric item.
- Add parameters to send with the alert. Example:
{
  "deviceId": "{{deviceId}}",
  "temperature": "{{temperature}}",
  "timestamp": "{{timestamp}}",
  "severity": "High",
  "ruleName": "TempThreshold"
}
In Your Notebook (PySpark)
You'll now receive these parameters via the notebook's parameters dictionary. Here's how to access them and log the event:
# Access parameters passed from Data Activator
device_id = parameters.get("deviceId")
temperature = float(parameters.get("temperature"))
event_time = parameters.get("timestamp")
severity = parameters.get("severity")
rule_name = parameters.get("ruleName")
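Because the rule sends every value as a string, and a misconfigured rule could omit a key, you may want to fail fast instead of letting None slip through. Here is a minimal validation sketch that replaces the direct .get() calls above (require_param is a hypothetical helper, not a Fabric API):

def require_param(params, key):
    # Hypothetical helper: raise immediately if the rule did not send a key
    value = params.get(key)
    if value is None:
        raise ValueError(f"Missing required parameter: {key}")
    return value

device_id = require_param(parameters, "deviceId")
try:
    # Values arrive as strings, so coerce temperature explicitly
    temperature = float(require_param(parameters, "temperature"))
except ValueError as err:
    raise ValueError("temperature must be numeric") from err
event_time = require_param(parameters, "timestamp")
severity = require_param(parameters, "severity")
rule_name = require_param(parameters, "ruleName")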
Create DataFrames for Logging
alarm_event_log (raw event)
from datetime import datetime

# spark is already available in a Fabric notebook session
event_log_df = spark.createDataFrame([{
    "device_id": device_id,
    "temperature": temperature,
    "event_time": event_time
}])
alarm_metadata (logging info)
# Capture when the alarm was logged alongside the rule context
metadata_df = spark.createDataFrame([{
    "device_id": device_id,
    "event_time": event_time,
    "logged_time": datetime.now().isoformat(),
    "severity": severity,
    "rule_name": rule_name
}])
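Schema inference from a list of dicts can vary between runs (for example, if a value ever arrives as None). If you want the column types pinned down, you can pass an explicit schema instead; a sketch for the event log, assuming device_id and event_time are strings (the metadata frame can be pinned the same way):

from pyspark.sql.types import StructType, StructField, StringType, DoubleType

# Assumed column types; adjust to match your data model
event_log_schema = StructType([
    StructField("device_id", StringType(), True),
    StructField("temperature", DoubleType(), True),
    StructField("event_time", StringType(), True),
])

event_log_df = spark.createDataFrame(
    [(device_id, temperature, event_time)],
    schema=event_log_schema,
)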
Write to Lakehouse Tables
Make sure the Lakehouse tables alarm_event_log and alarm_metadata already exist, or create them first with a matching schema.
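If the tables don't exist yet, you can create them once with Spark SQL; the column types below are assumptions based on the parameter values above, so adjust them to your model:

spark.sql("""
    CREATE TABLE IF NOT EXISTS YourLakehouse.alarm_event_log (
        device_id STRING,
        temperature DOUBLE,
        event_time STRING
    )
""")

spark.sql("""
    CREATE TABLE IF NOT EXISTS YourLakehouse.alarm_metadata (
        device_id STRING,
        event_time STRING,
        logged_time STRING,
        severity STRING,
        rule_name STRING
    )
""")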
# Append to the Lakehouse tables
event_log_df.write.mode("append").saveAsTable("YourLakehouse.alarm_event_log")
metadata_df.write.mode("append").saveAsTable("YourLakehouse.alarm_metadata")
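To confirm the writes landed, you can read the latest rows back as a quick sanity check:

# Show the most recently appended events
spark.sql("""
    SELECT * FROM YourLakehouse.alarm_event_log
    ORDER BY event_time DESC
    LIMIT 5
""").show()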
If the response has addressed your query, please Accept it as a solution and give a 'Kudos' so other members can easily find it.
Best Regards,
Tejaswi.
Community Support