Hi,
for a critical real-time intelligence scenario I need to use an eventstream with Azure Event Hubs as the source and an eventhouse (KQL database) as the destination.
Achieving very good performance is an absolute requirement.
So, what is the optimal data format for the events?
Does this data format have to be the same at every stage: for the events sent to Azure Event Hubs, the events read from Event Hubs, the events entering the eventstream, and the events delivered from the eventstream to the KQL database?
It seems crucial to take care of each step to obtain optimal performance.
Any help, please? Many thanks
Hi @pmscorca ,
Optimal Data Format:
JSON is often recommended for real-time data streaming scenarios due to its flexibility and ease of use. It is widely supported and can be efficiently parsed and processed by both Azure Event Hubs and KQL databases.
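For illustration, here is a minimal sketch of sending JSON-encoded events to Event Hubs with the azure-eventhub Python SDK. The connection string, hub name, and payload fields are placeholders, not values from this thread:

```python
import json

from azure.eventhub import EventHubProducerClient, EventData

# Placeholder connection details -- substitute your own namespace values.
CONN_STR = "<event-hubs-namespace-connection-string>"
EVENTHUB_NAME = "<event-hub-name>"

producer = EventHubProducerClient.from_connection_string(
    conn_str=CONN_STR, eventhub_name=EVENTHUB_NAME
)

with producer:
    batch = producer.create_batch()
    # A hypothetical telemetry payload, serialized once as JSON.
    event = EventData(json.dumps({"deviceId": "sensor-001", "temperature": 21.5}))
    # Marking the content type lets downstream consumers identify the format.
    event.content_type = "application/json"
    batch.add(event)
    producer.send_batch(batch)
```

Batching events before sending, as above, generally gives better throughput than sending events one at a time.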
Consistency Across Steps:
Yes, maintaining a consistent data format across all stages, from input to Azure Event Hubs, through the eventstream, and finally to the KQL database, is essential for optimal performance. This consistency reduces the overhead of data transformation and ensures smooth data flow.
For more details, please refer to:
Get data from Azure Event Hubs - Microsoft Fabric | Microsoft Learn
Best Regards,
Neeko Tang
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Hi, many thanks for your interesting reply.
In order to have events in JSON or Avro format as input for Azure Event Hubs, do I need to specify a Schema Group in Schema Registry, or can I manage the input events in a no-code manner?
Thanks
Hi @pmscorca ,
If you prefer a no-code approach, you can manage the input events without explicitly defining a schema in the Schema Registry: Event Hubs treats event bodies as opaque bytes, so no schema group is required just to send JSON or Avro payloads (as in the producer sketch in the earlier reply). However, this means you won't have the benefits of schema validation and enforcement provided by the Schema Registry.
By using the Schema Registry, you can ensure that your events adhere to a defined structure, which can help in maintaining data quality and simplifying data processing.
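If you do adopt the Schema Registry, the producer side looks roughly like the following sketch, assuming the azure-schemaregistry-avroencoder package and an existing schema group; all names and the Avro schema are placeholders, and keyword-argument names may differ slightly across SDK versions:

```python
from azure.eventhub import EventHubProducerClient, EventData
from azure.identity import DefaultAzureCredential
from azure.schemaregistry import SchemaRegistryClient
from azure.schemaregistry.encoder.avroencoder import AvroEncoder

# Placeholder names -- substitute your own namespace, hub, and schema group.
FQNS = "<namespace>.servicebus.windows.net"
GROUP_NAME = "<schema-group>"

# A hypothetical Avro schema for the telemetry payload.
SCHEMA = """
{
    "type": "record",
    "name": "Telemetry",
    "namespace": "example",
    "fields": [
        {"name": "deviceId", "type": "string"},
        {"name": "temperature", "type": "double"}
    ]
}
"""

credential = DefaultAzureCredential()
registry_client = SchemaRegistryClient(
    fully_qualified_namespace=FQNS, credential=credential
)
# auto_register=True registers the schema in the group if it is not there yet.
encoder = AvroEncoder(client=registry_client, group_name=GROUP_NAME, auto_register=True)

producer = EventHubProducerClient.from_connection_string(
    conn_str="<event-hubs-connection-string>", eventhub_name="<event-hub-name>"
)

with producer:
    # encode() validates the payload against the schema and stamps the
    # registered schema ID into the event's content type.
    event = encoder.encode(
        {"deviceId": "sensor-001", "temperature": 21.5},
        schema=SCHEMA,
        message_type=EventData,
    )
    batch = producer.create_batch()
    batch.add(event)
    producer.send_batch(batch)
```

The trade-off is a registry lookup and binary encoding on the producer side in exchange for enforced structure and smaller payloads than JSON.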
Please refer to:
Create an Azure Event Hubs schema registry - Azure Event Hubs | Microsoft Learn
Azure Schema Registry Concepts - Azure Event Hubs | Microsoft Learn
Best Regards,
Neeko Tang
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.