MMacarie
Regular Visitor

Why does ingesting from an IoT Hub bring new event-related data?

Hello, I recently started using Fabric for work and noticed something when ingesting data from my IoT Hub into a KQL database. I set up continuous ingestion with an Eventstream and, even though it works, I see some extra data coming in as well. My IoT Hub is managed in the Azure Portal; the data is sent in JSON format, and I ingest the JSON body as a string into a single field (plain raw data). Besides the desired data, I see some fields with IoT Hub information and, most importantly, two time fields that refer to an Event. I do not use an Event Hub, but I'm wondering whether there may be an Event Hub behind this that supports the continuous ingestion and somehow signals when new data arrives. The times in those fields differ slightly from the enqueue time (at the level of milliseconds).

Could there be an Event Hub behind the IoT Hub? And if yes, can I somehow access it directly?

I work in the biomedical field, and we are looking to integrate Fabric into one of our projects. For now, I am interested in a low-cost solution, as the telemetry is sent in large volumes and the ingestion should run continuously. Saving unwanted data could increase storage considerably and, with it, the costs as well.

If anyone has dealt with this before, I would appreciate your help very much. 

1 ACCEPTED SOLUTION
nilendraFabric
Solution Supplier

Hello @MMacarie 

 

Azure IoT Hub uses an Event Hubs-compatible endpoint for telemetry ingestion and routing. Here’s a detailed explanation:

 

Why New Event-Related Data Appears
1. IoT Hub Built on Event Hubs Technology
Azure IoT Hub is built on top of Azure Event Hubs and uses its capabilities for device-to-cloud telemetry. Every IoT Hub includes a built-in Event Hubs-compatible endpoint (`messages/events`), which allows you to consume messages using standard Event Hubs mechanisms. This underlying integration is why you see event-related metadata fields such as `EventEnqueuedUtcTime` and `EventProcessedUtcTime`: the first records when the event was enqueued on that endpoint, and the second records when the stream-processing step (the Eventstream) handled it.


2. System Properties in Ingested Data
When messages are ingested into a KQL database or other analytics tools, system properties from Event Hubs (e.g., enqueue time, partition ID, sequence number) may be included by default. These properties help track when and how events were processed, which is useful for debugging, monitoring, and analytics. The slight time differences you observe (in milliseconds) between these timestamps reflect the time taken for the event to move through the ingestion pipeline.
3. Default Behavior of IoT Hub Routing
By default, all device-to-cloud messages in IoT Hub are routed to the built-in endpoint unless custom routing rules are configured. If no specific filters or transformations are applied during ingestion, the raw message payload and associated metadata are forwarded as-is.

 


Accessing the Underlying Event Hub
Yes, you can directly access the Event Hubs-compatible endpoint of your IoT Hub. This endpoint allows you to consume messages using any Event Hubs-compatible client or service. Here’s how to access it:
1. Navigate to your IoT Hub in the Azure Portal.
2. Go to Built-in endpoints under Hub settings.
3. Copy the Event Hub-compatible endpoint and Event Hub-compatible name.
4. Use these details with an Event Hubs SDK or service (e.g., Azure Functions, Stream Analytics) to consume messages directly, as in the sketch below.
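
For illustration, here is a minimal sketch of reading that endpoint with the azure-eventhub Python SDK (`pip install azure-eventhub`). The connection string and consumer group are placeholders; substitute the values you copied from the Built-in endpoints page, and consider creating a dedicated consumer group on the built-in endpoint so this reader does not compete with your Eventstream.

```python
from azure.eventhub import EventHubConsumerClient

# Placeholders: copy these from IoT Hub > Built-in endpoints in the Azure Portal.
CONNECTION_STR = (
    "Endpoint=sb://<event-hub-compatible-endpoint>/;"
    "SharedAccessKeyName=service;SharedAccessKey=<key>;"
    "EntityPath=<event-hub-compatible-name>"
)
CONSUMER_GROUP = "$Default"  # prefer a dedicated consumer group in practice

def on_event(partition_context, event):
    # The device payload is the event body; the Event Hubs metadata (enqueued
    # time, sequence number, etc.) travels alongside it as system properties.
    print("Body:", event.body_as_str())
    print("Enqueued time:", event.enqueued_time)
    print("System properties:", event.system_properties)
    partition_context.update_checkpoint(event)

client = EventHubConsumerClient.from_connection_string(
    CONNECTION_STR, consumer_group=CONSUMER_GROUP
)
with client:
    # Blocks and invokes on_event for each message arriving on the built-in endpoint.
    client.receive(on_event=on_event, starting_position="-1")
```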
Reducing Unwanted Data
To avoid ingesting unwanted metadata fields that increase storage costs:
• Use the event processing editor in Microsoft Fabric’s Eventstream (for example, a Manage fields or Filter operation) to drop unnecessary fields before the data is saved to the KQL database.
• Configure custom message routing in IoT Hub to transform or filter messages at the source before they reach the built-in endpoint.
• Alternatively, preprocess the data with tools like Azure Stream Analytics or Azure Functions to clean up the payload before ingestion; a rough sketch of such a cleanup step follows this list.
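
As a rough illustration of that last option, here is a hypothetical cleanup step in Python. It is only a sketch: the `clean_record` helper, the metadata field names, and the sample payload are assumptions for illustration, not part of any Fabric or Azure API. The idea is to keep the device payload (plus, optionally, one timestamp) and drop the pipeline metadata before the record reaches the KQL database.

```python
import json

# Metadata fields added by the ingestion pipeline that we do not want to store.
# These names are illustrative; check the actual field names in your stream.
METADATA_FIELDS = {
    "EventEnqueuedUtcTime",
    "EventProcessedUtcTime",
    "PartitionId",
    "IoTHub",  # nested block of IoT Hub routing metadata
}

def clean_record(raw: str) -> str:
    """Return a slimmed-down JSON string containing only the device payload."""
    record = json.loads(raw)
    slim = {k: v for k, v in record.items() if k not in METADATA_FIELDS}
    # Optionally keep a single event timestamp if it is needed downstream.
    if "EventEnqueuedUtcTime" in record:
        slim["eventTime"] = record["EventEnqueuedUtcTime"]
    return json.dumps(slim)

if __name__ == "__main__":
    sample = json.dumps({
        "deviceId": "sensor-01",
        "temperature": 36.7,
        "EventEnqueuedUtcTime": "2025-02-01T10:00:00.123Z",
        "EventProcessedUtcTime": "2025-02-01T10:00:00.456Z",
        "PartitionId": 1,
    })
    print(clean_record(sample))
```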
Cost-Effective Solutions for Large Telemetry Volumes
For high-volume telemetry scenarios:
• Consider using IoT Hub’s Basic Tier, which is cost-effective for device-to-cloud messaging without advanced features like device twins or direct methods.

 

 

Hope these details help.

 

Thanks


2 REPLIES

Thank you for the useful information.
