Is it possible to load data into an Eventhouse from a Log Analytics Workspace?
Hi @JonasN ,
I think it is possible to load data from a Log Analytics Workspace into an EventHouse.
Here are some of my personal thoughts on your question:
1. Use KQL to aggregate the data per day in your Log Analytics Workspace (a minimal sketch follows below).
2. You can use various methods such as Azure Data Factory, Logic Apps, or custom scripts to automate this process.
3. Use the EventHouse ingestion APIs or connectors to load the exported data into your EventHouse. EventHouse supports multiple data ingestion methods, including SDKs, Kafka, and data flows.
4. You can also look at the documentation below:
Eventhouse overview - Microsoft Fabric | Microsoft Learn
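For step 1, a minimal KQL sketch of a per-day aggregation. AppRequests is just an illustrative table name; most Log Analytics tables have a TimeGenerated column, but verify yours:

// Hypothetical example: count records per day in the workspace.
AppRequests
| summarize RecordCount = count() by Day = bin(TimeGenerated, 1d)
| order by Day asc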
Best Regards
Yilong Zhou
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
I don't know if you're still looking for a solution to this @JonasN, but I found something that works well for us. We are doing this to get all of our Application Insights data into an Eventhouse pretty cleanly.
1. Set up an Event Hub instance in Azure if you don't have one already.
2. Go into your Log Analytics instance and open Data Export. Set up a data export to that event hub; you can specify which table(s) you want exported, and set the event hub as the destination.
Now you have the data going to the event hub, and Eventhouses are made to easily ingest that data.
Go into your Eventhouse in Fabric, create a new KQL database if you need one, and go to "Get Data". Pick that event hub. Here is where things get tricky.
KQL is amazing at data ingestion, but Event Hub data will come in as an array, and you cannot define an ingestion mapping for a KQL table that will explode arrays, so you have to do it in two steps.
1. Create a really basic table with one column named records of type dynamic.
2. Set that table as the destination for the ingestion. Let's call it LogAnalyticsIngestion_Raw.
3. Define the tables you want to store the data into. You can expedite this by using Azure Data Explorer to connect to the Log Analytics database and copying the schemas from there (presuming you want them to be the same).
4. Set up an update policy on each table that queries records, runs mv-expand to explode the array, and pulls out the data you need for that table. Do that for each destination table (a minimal sketch of these commands follows below).
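Putting steps 1 through 4 together, here is a minimal KQL sketch. The AppTraces table, its columns, and the Type field on each exported record are assumptions for illustration; inspect the payload that actually lands in your raw table and adjust the projections to match:

// Step 1: raw landing table with a single dynamic column.
.create table LogAnalyticsIngestion_Raw (records: dynamic)

// Step 3: a destination table (copy the real schema from Log Analytics).
.create table AppTraces (TimeGenerated: datetime, Message: string, SeverityLevel: int)

// Step 4a: a function that explodes the array and shapes the rows.
.create-or-alter function ExpandAppTraces() {
    LogAnalyticsIngestion_Raw
    | mv-expand record = records
    | where tostring(record.Type) == "AppTraces" // assumes each record names its source table
    | project
        TimeGenerated = todatetime(record.TimeGenerated),
        Message = tostring(record.Message),
        SeverityLevel = toint(record.SeverityLevel)
}

// Step 4b: attach the function as an update policy so AppTraces is
// populated automatically whenever the raw table receives data.
.alter table AppTraces policy update
@'[{"IsEnabled": true, "Source": "LogAnalyticsIngestion_Raw", "Query": "ExpandAppTraces()", "IsTransactional": false}]'

One function and one update policy per destination table; the raw table just keeps accumulating the unparsed payload, so you may want to give it a short retention policy once the update policies are in place.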
This worked well for us, and it allowed us to do things like take any synthetic AppPageViews and dump them into a separate table with a shorter retention period, ditch all SQL AppDependencies altogether, and so on.
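As a sketch of that shorter retention, assuming a hypothetical SyntheticAppPageViews table and an illustrative 30-day window:

// Keep synthetic page views for only 30 days.
.alter-merge table SyntheticAppPageViews policy retention softdelete = 30d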
Sorry, this is not a solution for my problem.
Jonas, did you manage to achieve what you were after, and if so, how? I find this post confusing: it has an accepted answer, but it also says "it's not a solution". Thanks