
Event Processing: Derive value from complex JSON

Please allow deriving a value from a complex JSON document that is stored in a single field of your stream.

Status: Needs Votes
Comments
thomas_pagel
New Member

I know... However, it would be great for "no-code" users to be able to do these kinds of actions as well, without coding...

xujx
Microsoft Employee

Thank you, Thomas, for your feedback.


May I know what the data type of the field in the stream is? Is it a JSON object, a JSON array, or a JSON-like string?


If it is a JSON object, you can use the "Manage fields" operator to get the data into a new column.

If it is a JSON array, you can use the "Expand" operator to expand the array into separate rows, then use "Manage fields" to get the data into another column.

If it is a JSON-like string, that is not supported yet, but it is on our roadmap.
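To illustrate the distinction, here is a hypothetical KQL sketch (all field names are made up) showing how the three shapes behave when queried in a KQL database:

```kusto
// Hypothetical payloads illustrating the three shapes discussed above.
print ObjectField = dynamic({"device": {"id": 42, "status": "ok"}}),  // JSON object
      ArrayField  = dynamic([{"id": 1}, {"id": 2}]),                  // JSON array
      StringField = '{"device": {"id": 42}}'                          // JSON-like string
| extend ParsedString = parse_json(StringField)  // a string must be parsed explicitly
| project ObjectId = tostring(ObjectField.device.id),  // direct key access on an object
          FirstId  = tostring(ArrayField[0].id),       // index access into an array
          StringId = tostring(ParsedString.device.id)  // key access after parsing
```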


thomas_pagel
New Member

I'm not that familiar with JSON terminology, but I guess it's a combination of object and array, with a lot of key/value pairs included. You could simply reach out to your colleagues on the D365 F&O team and ask them for a sample of what they deliver as a "data change event" (I can provide a sample as well if there's a good way to transfer it safely)... It's really quite complex, and I would like to be safe against schema changes etc., so not working with indexes but rather with keys to identify the values I would like to get from that JSON.

I think it would be just great if you could use Fabric as a sink for events out of D365. The current sync mechanism you have with Data Lake has a latency of about 1h (at least currently); with events you would be able to bring that down to a minute or so. This would enable you to react to something going on in D365 in near real time, which is not possible with other data integrations right now. This idea is also connected to the one where I ask for CUD support in Event Processing. If that worked, you would be able to sync tables (entities, to be precise) from D365 through events; nothing would be more real-time, and it would open up use cases with Data Activator etc. ...
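The key-based extraction described above can be sketched in KQL (the D365 payload and field names here are assumptions, not the actual event format):

```kusto
// Accessing values by key rather than by position keeps the query valid
// even if the event schema gains or reorders fields.
datatable(Payload: dynamic) [
    dynamic({"entity": "CustTable", "fields": {"AccountNum": "C-001", "Name": "Contoso"}})
]
| extend EntityName = tostring(Payload.entity),             // by key, not by index
         AccountNum = tostring(Payload.fields.AccountNum),
         Name       = tostring(Payload.fields.Name)
| project-away Payload
```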

UriBarash
Microsoft Employee
Complex JSON cells can be processed today using update policies and JSON data mappings in the KQL database. Creating a no-code experience for this will depend on the popularity of the idea.
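As a rough sketch of the update-policy approach mentioned here (table, function, and field names are hypothetical):

```kusto
// Raw table that the eventstream lands into: the whole payload in one dynamic column.
.create table RawEvents (Payload: dynamic)

// Target table holding the extracted values.
.create table ParsedEvents (EntityName: string, AccountNum: string)

// Function that flattens the payload by key.
.create function ParseRawEvents() {
    RawEvents
    | project EntityName = tostring(Payload.entity),
              AccountNum = tostring(Payload.fields.AccountNum)
}

// Update policy: run the function on every batch ingested into RawEvents.
.alter table ParsedEvents policy update
@'[{"IsEnabled": true, "Source": "RawEvents", "Query": "ParseRawEvents()", "IsTransactional": true}]'
```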
fbcideas_migusr
New Member
Status changed to: Needs Votes
 
Martin_Munich
New Member
Voting on the title (and not on the idea of creating a no-code experience for update policies, which came up mid-thread). In my experience, Eventstream definitely needs a way to handle more complex structures; it's a very typical occurrence. I would be fine with the ability to execute SQL/KQL or, preferably, TypeScript to do this --> extend the "transform events" list with a scripting element.

For example, we quite often receive quite big and complex JSONs as time-series data from IoT Edge, containing shopfloor data which we need to visualize and act on in a near real-time fashion. The current processing capabilities fall short here and/or are very cumbersome for processing such big structures. Using update policies is not ideal because it does not allow for any downstream processing of the "cleaned" JSON. For example, we would want the processed JSON to be stored in a KQL database but also have an Activator element analyze it and route it to another destination.