jhrjdata
Regular Visitor

Architecture & GPS hardware

Hello Fabric community,

I'm building a fleet monitoring platform for trucks/vehicles and need architectural guidance for using Microsoft Fabric Real-Time Intelligence, as well as GPS hardware recommendations.

 

MY SCENARIO:
------------
- GPS trackers sending location data every 10-60 seconds via TCP
- Data includes: GPS coordinates, speed, fuel level, engine RPM, temperature sensors, driver behavior (harsh braking, acceleration)
- Need to support multiple tenants (each customer sees only their own fleet)
- Planning to scale from 50 vehicles to 1,000+ vehicles
- Want real-time dashboards + historical analytics + automated alerts
- Future: add video/dashcam capabilities

 

MY PLANNED ARCHITECTURE:
------------------------
GPS Devices → TCP Server (Traccar/custom) → Azure Event Hub → Fabric Eventstream → Eventhouse → Real-Time Dashboard

QUESTIONS:

GPS Hardware:
-------------

 

1. What GPS tracker brands/models do you recommend that integrate well with custom backends and Microsoft Fabric? I'm considering Teltonika FMC920, Queclink, or Concox - any experience with these?

 

2. Which GPS devices have the best open protocols for parsing data and sending to Azure/Fabric?

 

3. Any recommendations for GPS trackers with OBD-II support to read vehicle diagnostics (fuel, RPM, error codes)?

Microsoft Fabric Architecture:
------------------------------

 

4. Is Eventstream + Eventhouse the right approach for high-frequency GPS telemetry data?

5. What's the recommended way to implement multi-tenancy (row-level security) for different fleet customers?

6. Should I use KQL Database or Lakehouse for storing historical GPS data (months/years)?

7. Any recommendations for handling geofencing alerts in real-time using Activator?

8. What Fabric SKU (F4, F8, F16?) would you recommend starting with for ~100 vehicles sending data every 30 seconds?

I'm based in Mexico and planning to deploy this commercially for fleet owners. Any guidance from people who have built similar IoT/telematics solutions with Fabric would be greatly appreciated! Thanks in advance!

3 ACCEPTED SOLUTIONS
svelde
Most Valuable Professional

Hello @jhrjdata 

 

Welcome to this Fabric community forum.

 

There are many vendors providing quality GPS trackers. TCP-based protocols (like MQTT) are preferred; they provide fast, secure, and reliable communication.

 

Reading data from the OBD-II connector is convenient for customers who need to connect the device themselves, but it probably offers only a subset of the CAN bus data needed, and it is not tamper-proof. Connecting a device directly to the CAN bus (or multiple CAN buses) is more work but makes it possible to 'hide' the tracker. There are ways to read CAN bus data non-intrusively (no need to cut wires or void the warranty).

 

I have tested the Teltonika FMC150 (which uses the same configurator tool; please check the specs yourself). The message interval can be controlled, and the device supports a wide range of vehicles, probably the same as the tracker you mentioned. It has several ways to connect to the cloud, including to Azure IoT Hub over MQTT.
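As a minimal sketch of that direct MQTT path (not vendor documentation — the hub name, device ID, and key below are placeholders), a device or test client authenticates to Azure IoT Hub with a SAS token built like this:

```python
import base64
import hashlib
import hmac
import time
from urllib.parse import quote_plus

def generate_sas_token(resource_uri: str, device_key_b64: str,
                       ttl_seconds: int = 3600) -> str:
    """Build an Azure IoT Hub SAS token for device authentication.

    resource_uri:   e.g. "myhub.azure-devices.net/devices/truck-001" (placeholder)
    device_key_b64: the device's base64-encoded primary key
    """
    expiry = int(time.time()) + ttl_seconds
    # The signed payload is "<url-encoded-resource>\n<expiry>"
    to_sign = f"{quote_plus(resource_uri)}\n{expiry}".encode()
    key = base64.b64decode(device_key_b64)
    signature = base64.b64encode(
        hmac.new(key, to_sign, hashlib.sha256).digest()).decode()
    return (f"SharedAccessSignature sr={quote_plus(resource_uri)}"
            f"&sig={quote_plus(signature)}&se={expiry}")

# An MQTT client (e.g. paho-mqtt) would then connect to <hub>.azure-devices.net:8883,
# use "<hub>.azure-devices.net/<deviceId>/?api-version=2021-04-12" as username and
# this token as password, and publish to "devices/<deviceId>/messages/events/".
```

The same token format works for quick testing with any MQTT client before committing to a specific tracker's firmware configuration.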

 

This means that no 3rd-party cloud gateway is needed, and the Microsoft Fabric Eventstream can connect directly to the Azure IoT Hub source. Optionally, an Azure Event Hub can be put in between, providing the ability to pause the egress, a practical feature while doing maintenance.

 

You are doing IoT and real-time communication, so yes, the Eventstream and Eventhouse are the right tools. Together with real-time dashboards and Fabric Maps, you get insights directly. KQL databases are perfect for storing data over years. The storage is automatically compressed, keeping it cheap without manual effort. Still, years of data, gigabytes, terabytes, or more, can be queried within seconds. One Eventhouse can hold multiple KQL databases.

 

You want to support multiple customers. This raises the question: how can we prevent customers from seeing each other's data? The recommended way for our customers is to ingest the raw messages as one stream but split the messages over multiple customer KQL databases, one for every customer. This makes it easy to manage access rights with Entra ID or external tooling (for example, in Azure Managed Grafana, a connection is created at the database level).

 

An alternative is splitting the stream (e.g., via IoT Hub routing), working with multiple customer tables in one database, or using row-level security. This is possible, but it makes things a bit more complex in our opinion.

 

Be aware that the stream must be able to split out the customer-specific messages. So you need to have enriched the messages with a customer ID somewhere upstream (perhaps in the device, or via an extra Stream Analytics job or Azure Function). Check out the Eventstream SQL transformation and filter transformation if you want to add splitting logic in the Eventstream. Notice that the Eventstream only understands the message body; at this moment, message application properties (as seen in IoT Hub message enrichment) are not supported.
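To illustrate that upstream enrichment step (a sketch only — the device-to-customer mapping and field names are hypothetical, not part of any Fabric API), an Azure Function-style handler could stamp a customer ID into each message body before it reaches the Eventstream:

```python
import json

# Hypothetical device-to-customer mapping; in practice this could come from
# IoT Hub device twin tags or a configuration store.
DEVICE_TO_CUSTOMER = {
    "truck-001": "customer-a",
    "truck-002": "customer-b",
}

def enrich_message(raw_body: bytes, device_id: str) -> bytes:
    """Add a customer_id field to the telemetry body so the Eventstream
    can split the stream per customer using only the message body."""
    msg = json.loads(raw_body)
    msg["customer_id"] = DEVICE_TO_CUSTOMER.get(device_id, "unknown")
    return json.dumps(msg).encode()

# Example: a GPS telemetry message from truck-001
enriched = enrich_message(
    b'{"lat": 19.4326, "lon": -99.1332, "speed_kmh": 62}', "truck-001")
```

Because the ID lands in the body itself, the Eventstream filter transformation can route on it without needing application properties.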

 

Geofencing: we have you covered. The device I tested could do geofencing too, but why not use the KQL database geofencing functions as seen in this blog post? Just combine them with the Activator and you can do some nice geofencing.
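To make the idea concrete, here is a sketch of the same check that KQL's geo_point_in_circle() performs (this mirrors the semantics, it is not the Fabric implementation, and the depot coordinates are made up):

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two WGS84 points, in meters."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def point_in_circle(lat: float, lon: float,
                    center_lat: float, center_lon: float,
                    radius_m: float) -> bool:
    """Is the vehicle inside a circular geofence around a site?
    Same question geo_point_in_circle() answers in KQL."""
    return haversine_m(lat, lon, center_lat, center_lon) <= radius_m

# Made-up depot in Mexico City with a 500 m geofence
DEPOT_LAT, DEPOT_LON = 19.4326, -99.1332
```

In production the check would run as a KQL query over the Eventhouse table, with the Activator firing an alert when a vehicle enters or leaves the circle.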

 

Regarding the SKU, that's not easy to answer. Because Fabric items like the Eventstream and Eventhouse run day and night, adding value 24/7, you need a solid capacity, or better, two capacities: one for ingestion and data storage, the other for user interaction. I recommend spinning up a few capacities (a bit larger than expected) and starting to measure. Why a bit larger? You pay a bit more for a few weeks, but then you can measure without overages or throttling. You can try to calculate Eventstream usage upfront here.

 

Recently, I gave a presentation about IoT location-based data, maps, and geofencing here.

 

--

 

If this answer helps you, a thumbs-up or marking it as the accepted answer is appreciated. All community members with similar questions will benefit. Your contribution is highly appreciated.


svelde
Most Valuable Professional

Hello @jhrjdata 

Thank you for your kind words.

 

Regarding the device being connected to the IoT Hub: if you can execute this IoT Hub integration, you are fine. You will see device messages arrive, at least GPS data if there is a fix. At this moment, it uses SAS tokens of the IoT Hub service. I'm in contact with Teltonika to discuss device keys.

 

Row-level security vs. multiple databases: this choice comes down to flexibility versus the customer's wish to have their data stored (at rest) separately from other customers.

So, this is a viable option:

GPS (FMC920) → Azure IoT Hub (MQTT) → [Optional: Event Hub] → Eventstream → Eventhouse (1 KQL DB per customer)

The maximum number of KQL databases per Eventhouse is 10,000. Splitting the messages over the databases is done in the Eventstream. There seems to be no limit on the number of Eventhouse destinations.

Regarding enrichment, I would add the customer identification at the IoT Hub, as an entry in the device tags, and enrich each message with it. If you choose row-based access control, you could postpone this and do it in the database flow.

 

Regarding the calculation, you can do the math yourself. But be aware that Eventstream, Eventhouse, Activator, etc. all need services running in the background, so these take a part of your capacity by default. I suggest going for an F8, testing for a few weeks, measuring, and seeing how much consumption there is. Do not start too small, or you will encounter throttling and overages, which slows down your development phase. You can then scale down if needed. Bonus tip: put the Eventstream and Eventhouse in a separate capacity so they are not affected by noisy neighbours.

 

Regarding the geofencing, we go for the Activator-on-top-of-Eventhouse approach. This is not a full streaming solution, but it can be executed every few minutes, and that is OK for our use case. An alternative flow is filling a table with geofencing outcomes (via a medallion architecture) and using Eventhouse/ADX CDC to feed a second Eventstream. Depending on the value, this extra architecture can be worthwhile.

 

In the end, you're still flexible in your architecture, so run a few experiments and learn what fits best for you.

 

--

 

Please mark this answer as the accepted answer if it helps you. All community members with similar questions will benefit by doing so. Your contribution is highly appreciated.


svelde
Most Valuable Professional

@jhrjdata ,

I double-checked the Eventstream for application-property support. Only system properties are supported.

So, moving application properties (added via IoT Hub message enrichment) into the body must be done before the message arrives at the Eventstream. This is possible via, e.g., Azure Stream Analytics or Azure Functions.

 

As an alternative, row-based access control is still an option.


4 REPLIES 4

Hi Sander,

Thank you so much for this detailed and insightful response! This is exactly the guidance I was looking for.

I've read both resources you mentioned:
- Your excellent blog post on Geospatial KQL functions for Geofencing - the examples with geo_point_in_circle(), geo_point_in_polygon(), and the buffer functions are exactly what I need for my fleet alerts.
- The Eventstream Pricing article by Anasheh Boisvert - very helpful for capacity planning.

A few follow-up questions:

**1. Direct MQTT to Azure IoT Hub**
This is great news! I was planning to use the Teltonika FMC920 (similar to your FMC150). Can you confirm the FMC920 also supports direct MQTT connection to Azure IoT Hub? I want to avoid unnecessary middleware if possible.

For the MQTT configuration in Teltonika Configurator, do I need to:
- Set up device provisioning in IoT Hub first?
- Use a specific authentication method (SAS token, X.509 certificate)?
- Configure any specific MQTT topics?

**2. Multi-tenancy approach**
Your recommendation of one KQL database per customer is interesting. I was initially planning to use a single database with Row-Level Security.

A few clarifications:
- With separate databases per customer, how do you handle cross-customer analytics (e.g., fleet-wide statistics for my own business insights)?
- Is there a practical limit on the number of KQL databases per Eventhouse?
- For the message enrichment with customer_id, would you recommend doing this in the Teltonika device configuration, or via an Azure Function between IoT Hub and Eventstream?

**3. Architecture confirmation**
Based on your response, would this be the recommended architecture?

GPS (FMC920) → Azure IoT Hub (MQTT) → [Optional: Event Hub] → Eventstream → Eventhouse (1 KQL DB per customer)

**4. Capacity sizing**
Based on the Eventstream pricing article, I calculated my scenario:
- 10-20 vehicles initially, scaling to 500+
- ~30 second reporting interval
- Using IoT Hub source (no connector cost)
- Direct ingestion to Eventhouse (pull destination, no processor cost)
- Estimated ~0.5-1 GB/day initially

This seems similar to Example 1 in the article (0.25 CU/hour). Would an F2 or F4 capacity be sufficient to start? I want some headroom for the Eventhouse queries and real-time dashboards.
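The message-volume side of that estimate can be sanity-checked with simple arithmetic (the payload size is an assumption here — actual Teltonika payloads differ):

```python
def daily_volume(vehicles: int, interval_s: int, payload_bytes: int):
    """Messages per day and approximate MB per day for a fleet
    reporting at a fixed interval."""
    messages = vehicles * (24 * 3600 // interval_s)
    mb = messages * payload_bytes / 1_000_000
    return messages, mb

# 20 vehicles reporting every 30 s with an assumed ~500-byte JSON payload
msgs_small, mb_small = daily_volume(vehicles=20, interval_s=30, payload_bytes=500)

# The 500-vehicle target for comparison
msgs_big, mb_big = daily_volume(vehicles=500, interval_s=30, payload_bytes=500)
```

Under these assumptions the initial fleet produces roughly 57,600 messages (~29 MB) per day, while 500 vehicles produce about 720 MB/day — so the ~0.5-1 GB/day figure looks closer to the scaled-out case than the starting point.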

**5. Geofencing implementation**
Your blog post clarified my approach perfectly. I'm planning to use:
- geo_point_in_polygon() for zone alerts (warehouses, client locations)
- geo_point_in_circle() for proximity alerts
- geo_distance_2points() for calculating distances between vehicles
- Activator rules to trigger notifications

One question: For real-time geofence alerts, do you recommend processing in the Eventstream (filter transformation) or letting all data flow to Eventhouse and using Activator queries there?

Thank you again for taking the time to help. Your blog series on Microsoft Fabric for IoT Developers is incredibly valuable - I've bookmarked the entire series!

Best regards,
Juan Heriberto Rosas Juarez
@jhrjdata
