Most demos of real-time data processing feel... well, simulated. They use fake data, mock pipelines, and staged spikes. I wanted to create something different: a real-world, real-time demo of Microsoft Fabric’s Data Activator, using the actual energy usage of my home.
In this post, I’ll walk through how I used a Raspberry Pi connected to my electrical panel, Home Assistant (an open-source home automation platform), Azure Event Hub, Fabric Eventstream, Eventhouse, and Data Activator to monitor my home's power consumption and trigger real-time alerts.
Here’s the full pipeline:
Home Assistant: Collects real-time data from my Raspberry Pi, including:
sensor.leg_1_kw and sensor.leg_2_kw: report net grid power per leg (can be negative when exporting solar)
sensor.mppt_kw: power from one solar panel (I have 10 identical panels)
Azure Event Hub: Home Assistant publishes these sensor readings as JSON messages.
Fabric Eventstream: Ingests the raw messages from Event Hub.
Fabric Eventhouse: The Eventstream writes directly to an Eventhouse table for further analysis with KQL.
Microsoft Fabric Data Activator: Monitors the Eventhouse table and sends me a Microsoft Teams notification with my consumption.
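For context, each reading arrives from the Event Hub integration as a JSON message carrying the entity's state. The exact payload layout depends on the integration's configuration, so treat the shape below as an assumption; the sketch shows why parsing needs to be defensive (Home Assistant reports states as strings, and sensors can report values like "unavailable"):

```python
import json

# Hypothetical example of the JSON a state change might produce on
# Event Hub; the exact field layout is an assumption, not the
# integration's documented schema.
sample_message = json.dumps({
    "entity_id": "sensor.leg_1_kw",
    "state": "1.42",
    "last_changed": "2025-06-01T12:00:05+00:00",
})

def parse_power_event(raw):
    """Extract the entity id and numeric reading from one message.

    States arrive as strings and may be non-numeric ("unavailable",
    "unknown"), so failed conversions map to None.
    """
    event = json.loads(raw)
    try:
        value = float(event["state"])
    except (ValueError, KeyError, TypeError):
        value = None
    return event.get("entity_id", ""), value

entity, kw = parse_power_event(sample_message)
print(entity, kw)  # sensor.leg_1_kw 1.42
```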
Logic
The raw data is verbose, so I needed a powerful query engine to aggregate over time and calculate fallback values.
Using KQL in Eventhouse, I wrote (with the help of GitHub Copilot) a lightweight query to process the stream:
let WindowSize = 5s;
PowerEvents
| where entity_id in ("sensor.leg_1_kw", "sensor.leg_2_kw", "sensor.mppt_kw")
| extend value = todouble(state), event_time = timestamp
| summarize
    leg1_kw = maxif(value, entity_id == "sensor.leg_1_kw"),
    leg2_kw = maxif(value, entity_id == "sensor.leg_2_kw"),
    mppt_kw = maxif(value, entity_id == "sensor.mppt_kw")
    by bin(event_time, WindowSize)
| extend
    solar_kw = 10 * coalesce(mppt_kw, 0.0),
    grid_net_kw = coalesce(leg1_kw, 0.0) + coalesce(leg2_kw, 0.0)
| extend
    home_consumption_kw = solar_kw + grid_net_kw

This KQL query calculates real-time home power usage regardless of whether I’m importing or exporting energy. It handles edge cases too, like when there's no solar input at night.
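To make the windowing logic concrete, here's how the same aggregation could be sketched in plain Python (illustrative only; the real pipeline runs the KQL in Eventhouse, and the event tuples below are made-up sample data):

```python
from collections import defaultdict
from datetime import datetime

def home_consumption_by_window(events, window_s=5):
    """Group (timestamp, entity_id, kw) readings into fixed windows,
    take the max reading per sensor in each window (like maxif), and
    fall back to 0.0 for missing sensors (like coalesce)."""
    bins = defaultdict(dict)  # window start -> {entity_id: max kw}
    for ts, entity_id, kw in events:
        start = datetime.fromtimestamp(ts.timestamp() // window_s * window_s)
        prev = bins[start].get(entity_id)
        bins[start][entity_id] = kw if prev is None else max(prev, kw)
    out = {}
    for start, readings in bins.items():
        solar_kw = 10 * readings.get("sensor.mppt_kw", 0.0)  # 10 identical panels
        grid_net_kw = (readings.get("sensor.leg_1_kw", 0.0)
                       + readings.get("sensor.leg_2_kw", 0.0))
        out[start] = solar_kw + grid_net_kw
    return out

t = datetime(2025, 6, 1, 12, 0, 2)
events = [
    (t, "sensor.leg_1_kw", -0.8),  # negative: exporting on leg 1
    (t, "sensor.leg_2_kw", 0.3),
    (t, "sensor.mppt_kw", 0.25),   # one monitored panel at 0.25 kW
]
windows = home_consumption_by_window(events)
# solar = 10 * 0.25 = 2.5 kW, grid net = -0.5 kW, so consumption ≈ 2.0 kW
```

Note how a negative grid reading (solar export) still yields the correct consumption figure, which is the whole point of the query.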
Since Fabric Data Activator currently doesn’t support streaming directly from Eventhouse tables, I used the Eventhouse preview alerting feature:
I configured a streaming alert on the home_consumption_kw column in my Eventhouse table.
This alert automatically created a rule in Data Activator.
In Data Activator, I set up a Teams notification to be sent if home_consumption_kw exceeds 3.5 kW.
⚠️Note: This alerting feature is still in preview at the time of writing. Currently, the alerting engine sends all matching rows every 5 minutes, which is useful for basic scenarios.
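Conceptually, the rule Data Activator evaluates is just a threshold filter over incoming rows. A plain-Python sketch of that logic (no code is needed in Data Activator itself, and the sample rows here are made up; only the 3.5 kW threshold and column name come from the rule above):

```python
THRESHOLD_KW = 3.5  # same limit as the Data Activator rule

def rows_to_alert(rows):
    """Return the windows whose consumption crosses the threshold,
    mirroring in spirit what the alerting engine does each cycle:
    scan the new rows and fire a notification per match."""
    return [r for r in rows if r["home_consumption_kw"] > THRESHOLD_KW]

recent = [
    {"event_time": "12:00:00", "home_consumption_kw": 2.1},
    {"event_time": "12:00:05", "home_consumption_kw": 4.2},  # e.g. oven on
]
alerts = rows_to_alert(recent)  # only the 4.2 kW window matches
```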
This isn’t just a fun home hack. It shows how Microsoft Fabric’s tools — from ingestion to real-time intelligence — can be applied to real-world IoT and edge data.
Eventstream makes data onboarding simple.
Eventhouse + KQL gives powerful yet lightweight analytics.
Data Activator brings real-time actions to life without writing code.
If you're using a SunPower PVS6 solar monitoring system, it's possible to use a Raspberry Pi as a bridge to capture production data. I followed the approach described by Nelson Minar in his excellent blog post: My Custom Solar Monitoring System (PVS6).
I also used the Home Assistant Azure Event Hub integration, documented at Azure Event Hub - Home Assistant.