Hello,
I have an Azure Event Grid Namespace topic and I want to save events as-is, in JSON format, to the Lakehouse Files area. But I can't do it. The JSON is automatically flattened and saved as a Delta table. How can I change this?
Hi @foks,
Thank you for reaching out to the Microsoft Fabric community forum.
When event data from Event Grid / Event Hub is ingested into a Lakehouse using streaming paths, Fabric automatically parses the JSON and stores it as a Delta table for analytics. There is not a supported option today to tell Fabric to keep the incoming payload as a raw JSON file in the Lakehouse Files area.
If your requirement is to preserve the events exactly as received, as raw JSON, you will need to land them outside the default streaming-to-Lakehouse experience.
For example, capture the events with a custom ingestion path such as an Azure Function, Logic App, or Data Factory pipeline, and write them directly as JSON files to storage or Lakehouse Files. If the goal is analytics on event or telemetry data, Eventhouse (KQL) is the recommended approach, as it is designed for this type of time-series data.
Kindly refer to the documentation below for more detail:
Add a lakehouse destination to an eventstream - Microsoft Fabric | Microsoft Learn
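The custom-ingestion option can be sketched in Python. This is a minimal sketch, assuming events arrive as Python dicts (for example, in an Azure Function trigger or a Fabric notebook cell); the `/lakehouse/default/Files/...` path in the usage comment applies when running inside a Fabric notebook with a default Lakehouse attached, and the `raw_events` folder name is just an illustration.

```python
# Sketch: land incoming events as raw newline-delimited JSON files instead of
# letting the streaming path flatten them into a Delta table.
# Assumption: events are already deserialized into Python dicts by the caller.
import json
import os
from datetime import datetime, timezone

def events_to_ndjson(events):
    """Serialize a batch of events to newline-delimited JSON, unflattened."""
    return "\n".join(json.dumps(e, separators=(",", ":")) for e in events) + "\n"

def land_raw_events(events, target_dir):
    """Write one NDJSON file per batch, named by UTC arrival timestamp."""
    os.makedirs(target_dir, exist_ok=True)
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%S%f")
    path = os.path.join(target_dir, f"events_{stamp}.json")
    with open(path, "w", encoding="utf-8") as f:
        f.write(events_to_ndjson(events))
    return path

# From a Fabric notebook with a default Lakehouse attached, you could target
# the mounted Files area, e.g.:
#   land_raw_events(batch, "/lakehouse/default/Files/raw_events")
```

Because the payload is written verbatim as NDJSON, nested fields survive untouched, and the files can later be read back or shortcut into other tooling.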
Hope this clarifies why the data is getting flattened and what the available options are today.
Regards,
Community Support Team.
Hi @foks,
I hope the information provided above assists you in resolving the issue. If you have any additional questions or concerns, please do not hesitate to contact us. We are here to support you and will be happy to help with any further assistance you may need.
Regards,
Community Support Team.
Hi @foks,
I hope the above details help you fix the issue. If you still have any questions or need more help, feel free to reach out. We are always here to support you.
Regards,
Community Support Team.
Hello @foks ,
Welcome to this community forum.
Have you considered forwarding incoming MQTT messages to an Eventhouse?
Check out this blogpost for details.
I ask this because a time series is in most cases a better fit for data arriving as events, measurements, or immutable telemetry, and the KQL query language is well suited to working with time-based data.
If you still need a (JSON) file for other tooling, you can use a Lakehouse shortcut so the data appears as a table.
If this answer helps you, a thumbs-up or marking it as the accepted answer is appreciated.
Hi @foks
Once your data is loaded, a Delta folder is created. If you navigate into it, you will find Parquet data files along with a _delta_log folder containing JSON transaction logs.
I am assuming you have streaming data from Azure Event Hub loaded into a KQL database in Microsoft Fabric.
Let me know if my understanding is correct; otherwise, share more details such as screenshots here.