foks
Frequent Visitor

Getting a JSON payload from Azure Event Grid / Event Hubs and saving it as-is to the Lakehouse

Hello,

 

I have an Azure Event Grid Namespace topic and I want to save each event as-is, in JSON format, to Lakehouse/Files. But I can't: the JSON is automatically flattened and saved as a Delta table. How can I change this?

 

 

1 ACCEPTED SOLUTION
v-hjannapu
Community Support
Community Support

Hi @foks,
Thank you for reaching out to the Microsoft Fabric community forum.

When event data from Event Grid / Event Hubs is ingested into a Lakehouse through the streaming paths, Fabric automatically parses the JSON and stores it as a Delta table for analytics. There is no supported option today to keep the incoming payload as a raw JSON file in the Lakehouse Files area.
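For illustration only, the flattening behavior is conceptually similar to expanding nested JSON keys into dot-separated columns. This is a minimal stdlib sketch of the idea, not Fabric's actual implementation:

```python
import json

def flatten(obj: dict, prefix: str = "") -> dict:
    """Expand nested JSON objects into dot-separated top-level keys,
    roughly how a nested payload ends up as flat Delta table columns."""
    flat = {}
    for key, value in obj.items():
        name = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, name))
        else:
            flat[name] = value
    return flat

event = json.loads('{"id": "1", "data": {"temp": 21.5, "unit": "C"}}')
print(flatten(event))  # {'id': '1', 'data.temp': 21.5, 'data.unit': 'C'}
```

Once the payload has been through a step like this, the original nesting is gone, which is why the table cannot simply be converted back to the raw file.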

If your requirement is to preserve the events exactly as received (raw JSON), you will need to land them outside the default streaming-to-Lakehouse experience.
For example, capture the events with a custom ingestion path (an Azure Function, Logic App, or Data Factory pipeline) and write them directly as JSON files to storage / Lakehouse Files. If the goal is analytics on event or telemetry data, Eventhouse (KQL) is the recommended approach, as it is designed for this type of time-series data.
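As a sketch of that landing step: the function below writes each payload byte-for-byte, with no parsing, so nothing gets flattened. The output directory and the Event Hubs callback shown in the comment are assumptions for illustration, not a Fabric API:

```python
import os
import uuid

def land_raw_event(payload: str, out_dir: str) -> str:
    """Persist an event payload exactly as received, as a .json file.
    No json.loads() is called, so the structure is untouched.
    Returns the path written. In an Azure Function you would target
    the Lakehouse Files area (e.g. a OneLake-backed path) instead of
    a local directory."""
    os.makedirs(out_dir, exist_ok=True)
    path = os.path.join(out_dir, f"event-{uuid.uuid4().hex}.json")
    with open(path, "w", encoding="utf-8") as f:
        f.write(payload)  # saved exactly as received
    return path

# Hypothetical use inside an azure-eventhub on_event callback:
#   land_raw_event(event.body_as_str(), "/lakehouse/default/Files/raw")
```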

Please refer to the documentation below for more detail:
Add a lakehouse destination to an eventstream - Microsoft Fabric | Microsoft Learn

Hope this clarifies why the data is getting flattened and what the available options are today.

Regards,
Community Support Team.


5 REPLIES 5

Hi @foks,
I hope the information provided above assists you in resolving the issue. If you have any additional questions or concerns, please do not hesitate to contact us. We are here to support you and will be happy to help with any further assistance you may need.

Regards,
Community Support Team.


svelde
Most Valuable Professional
Most Valuable Professional

Hello @foks ,

Welcome to this community forum.

Have you considered forwarding the incoming MQTT messages into an Eventhouse?

Check out this blogpost for details.

I ask this because a time series is in most cases a better fit for data arriving as events, measurements, and immutable telemetry.

The KQL query language is well suited to working with time-based data.

If you still need a (JSON) file for other tooling, you can use a Lakehouse shortcut so the data appears as a table.

 

If this answer helps you, a thumbs-up or marking it as the accepted answer is appreciated.

Pragati11
Super User
Super User

Hi @foks 

 

Once your data is loaded, a Delta folder is created. If you navigate into it, you should be able to see JSON files as well.

(screenshot: contents of the Delta folder)

I am assuming you have streaming data from Azure Event Hubs loaded into a KQL database in MS Fabric.

Let me know if my understanding is correct; otherwise, please share more details (such as screenshots) here.
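If it helps to check what the Delta folder actually contains, a quick listing like the sketch below (assuming the Lakehouse folder is reachable as a local path, e.g. from a notebook) groups the files by type. One caveat: the .json files under _delta_log are Delta transaction-log entries, not the original event payloads:

```python
import os

def list_delta_contents(delta_dir: str) -> dict:
    """Group the files of a Delta folder by extension: .parquet files
    hold the data, while .json files (under _delta_log) hold the
    Delta transaction log rather than the raw ingested payloads."""
    groups = {"json": [], "parquet": [], "other": []}
    for root, _dirs, files in os.walk(delta_dir):
        for name in files:
            ext = name.rsplit(".", 1)[-1].lower()
            key = ext if ext in groups else "other"
            groups[key].append(os.path.join(root, name))
    return groups
```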

 

Best Regards,

Pragati Jain




LinkedIn | Twitter | Blog | YouTube

Did I answer your question? Mark my post as a solution! This will help others on the forum!

Appreciate your Kudos!!

Proud to be a Super User!!
