I am trying to use an Eventstream to load data into a KQL database for real-time monitoring of servers. I've developed a custom app in .NET and I am able to ingest the events, but processing those events for ingestion into the KQL database has been unsuccessful because of the static nature of the imported schema. I have events like:
{
"table": "Storage",
"rows": [
{
"ServerName": "SERVER1",
"Database": "model",
"DataSizeGb": 20,
"LogSizeGb": 1,
"TimestampUTC": "2024-07-19T16:53:00.6600000",
"TimestampEST": "2024-07-19T12:53:00.6600000"
},
{
"ServerName": "Server1",
"Database": "Test",
"DataSizeGb": 40,
"LogSizeGb": 1,
"TimestampUTC": "2024-07-19T16:53:00.6600000",
"TimestampEST": "2024-07-19T12:53:00.6600000"
}
]
}
and like:
{
"table": "Performance",
"rows": [
{
"ServerName": "SERVER1",
"CPUPercentage": "10",
"Blocking": "0",
"UserConnections": "175",
"ActiveConnections": 3,
"TimestampUTC": "2024-07-19T16:55:00.6400000",
"TimestampEST": "2024-07-19T12:55:00.6530000"
}
]
}
I can create multiple paths in my pre-processing logic to feed a Storage table and a Performance table, and then expand the rows array on each. But when I try to add a Manage Fields step to choose which columns will go into each KQL table, I don't see the resulting schema from each Expand task; instead I see an Imported Schema that reflects the layout of the first record that came into the Eventstream and cannot be updated.
Is this an error or is it expected behavior? Is the assumption that each custom app produces a single schema? Even in that scenario, how can we update the pre-processing if the schema produced by the custom app changes?
I've tried multiple times to refresh the data preview at different stages of the pre-processing, removing steps and adding them back, and creating new Eventstreams; in every case the Manage Fields imported schema is static and tied to some original received event.
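To make the expansion concrete: the per-table processing described above can also be expressed on the KQL side once the raw event lands in the database. A minimal sketch, assuming the whole event is ingested into a single dynamic column named payload in a hypothetical landing table RawEvents (only the table and column names from the sample events above are real; everything else is illustrative):

// Expand the nested rows array of a Storage event into flat columns.
// RawEvents and payload are assumed names, not actual objects from this Eventstream.
RawEvents
| where tostring(payload.table) == "Storage"
| mv-expand row = payload.rows
| project
    ServerName   = tostring(row.ServerName),
    Database     = tostring(row.Database),
    DataSizeGb   = toreal(row.DataSizeGb),
    LogSizeGb    = toreal(row.LogSizeGb),
    TimestampUTC = todatetime(row.TimestampUTC),
    TimestampEST = todatetime(row.TimestampEST)

The same pattern with a different where clause and projection would cover the Performance events.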
Thanks for reaching out. This is the multiple-schema issue we are currently working on, and we have also received feedback from other customers with the same requirement. In the no-code editor we rely heavily on data sampling, and we can only use a single schema inferred from the sampled data when authoring an operator, which can cause issues like the one you describe.
We are actively working on this new capability. Once the multiple-schema solution and the event catalog integration are in place, this scenario will be fully supported: each Filter and Manage Fields operator will be able to depend on a different schema.
We hope to release this in the next 3 to 6 months.
We are actively working on multiple-schema support for Eventstream; the current plan is to have it available around March 2025.
Hi @galaeci ,
I would like to ask what kind of schema changes you are referring to. Is it an extension of the schema?
I read the official documentation and found that most of the currently supported data sources only expose row-level data from the source, not changes to its schema:
https://learn.microsoft.com/en-us/fabric/real-time-intelligence/event-streams/overview?tabs=enhanced...
Additionally, there should be a step to set up the schema when you stream real-time events from a custom application to a Microsoft Fabric KQL database:
https://learn.microsoft.com/en-us/fabric/real-time-intelligence/event-streams/stream-real-time-event...
After my own testing, it seems that once the schema is set here, it is fixed and will not change.
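For reference, once the destination has been set up, you can check from the KQL database itself what was fixed at configuration time. A small sketch, assuming the destination table is named Storage (the table name is taken from the sample events; adjust it to your actual table):

// Show the column layout of the destination table
.show table Storage cslschema

// Show any JSON ingestion mappings created for the table
.show table Storage ingestion json mappings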
Best Regards,
Dino Tao
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Is there any update on this in Fabric?
I see the schema getting changed automatically per message, but because of the different schemas, a new message with a different schema sometimes gets forced into the existing schema and values end up missing.