slimbnsalah
Regular Visitor

Exporting Data from AppInsights to Fabric Lakehouse

Hello,

 

I'm working on transferring data from AppInsights to a Lakehouse in a Fabric capacity on a daily basis.

I want to know the best way to do it.

 

The solution I can think of is:

Consume the AppInsights API in a DataPipeline or DataFlow in Fabric that exports to a lakehouse.

 

 

My problem is that the data is so large that it exceeds the limits of what the API can handle, and I want to avoid exporting the data many times a day. I want to do it only once a day.

 

I would appreciate it if you could help me with this.

8 REPLIES
Jerome22
Helper III

I know that the team is working on a direct connection to the AppInsights logs from within Fabric (like we can connect to Snowflake now),

so "soon" we will be able to consume these logs without needing to copy them,

most likely something like a KQL database.

 

but for now a dataflow is required 

 

Anonymous
Not applicable

Hi @slimbnsalah ,

Thanks for using Fabric Community.
At this time, we are reaching out to the internal team to get some help on this.
We will update you once we hear back from them.

Anonymous
Not applicable

Hi @slimbnsalah ,

We have an update from internal team -

"My Problem is that data is so huge that it exceeds the limits of what the API can handle" - they would need to work with their data source provider (AppInsights) if they are exceeding the API limits at their tier, and/or look for ways to control the timeframe / supply parameters within their request to stay within the API's limits.


Hope this is helpful. Please let me know in case of further queries.
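To make the "control the timeframe / supply parameters" advice concrete: one common pattern is to keep the once-a-day schedule but split the day into smaller time windows, so each individual request stays under the API's response limits. Below is a minimal Python sketch. The query endpoint URL shown in the comment is the documented Application Insights REST API, but the app id, API key, and 6-hour window size are placeholder assumptions you would adjust for your own tenant and data volume:

```python
from datetime import datetime, timedelta, timezone

def daily_windows(day: datetime, hours: int = 4) -> list[str]:
    """Split one UTC day into ISO-8601 'start/end' timespans of `hours` each,
    in the format accepted by the App Insights query API's `timespan` parameter."""
    start = day.replace(hour=0, minute=0, second=0, microsecond=0, tzinfo=timezone.utc)
    end_of_day = start + timedelta(days=1)
    spans = []
    while start < end_of_day:
        end = min(start + timedelta(hours=hours), end_of_day)
        spans.append(f"{start.isoformat()}/{end.isoformat()}")
        start = end
    return spans

# Each window then becomes one request (app id / API key are placeholders):
#   GET https://api.applicationinsights.io/v1/apps/{app_id}/query
#       ?query=traces | order by timestamp asc
#       &timespan=2024-04-01T00:00:00+00:00/2024-04-01T06:00:00+00:00
#   header: x-api-key: {api_key}

if __name__ == "__main__":
    for span in daily_windows(datetime(2024, 4, 1), hours=6):
        print(span)
```

In a Fabric pipeline you could achieve the same effect with a ForEach activity over the window list, passing each `timespan` to the copy activity's request parameters.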

Anonymous
Not applicable

Hi @slimbnsalah ,

You can also use the pipeline REST connector with pagination settings to iterate over all the data returned by AppInsights through the APIs, and use a Lakehouse as the destination. You shouldn't run into any API limits in this case. Let me know in case you have any further queries.

use pipelines rest connector with pagination settings

The REST connector is not yet supported in Dataflow Gen2. Source: https://learn.microsoft.com/en-us/fabric/data-factory/connector-rest-overview

Even with that, how will the API handle more data than it can return?

Anonymous
Not applicable

Hi @slimbnsalah ,

You should be able to use the REST connector activity in Data Factory pipelines, with pagination settings to iterate over all the data returned by AppInsights through the APIs, and use a Lakehouse as the destination. You shouldn't run into any API limits in this case.


 

Links to refer - 
Pagination Rules in Rest Connector 

I hope this is helpful. 
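For intuition, the connector's pagination rules automate a loop equivalent to the sketch below, shown here in Python with a hypothetical offset-based page function (the actual rule names and parameters, such as query-parameter or absolute-URL rules, depend on the shape of the API's responses):

```python
def fetch_all(get_page, page_size: int = 500) -> list:
    """Pull every row from a paged endpoint by advancing an offset until a
    short (or empty) page signals the end -- the same loop the pipeline REST
    connector's pagination rules run for you behind the scenes."""
    rows, offset = [], 0
    while True:
        page = get_page(offset=offset, limit=page_size)  # one HTTP GET per page
        rows.extend(page)
        if len(page) < page_size:  # last page reached
            return rows
        offset += page_size

# Usage with a stubbed endpoint (a real pipeline would issue an HTTP GET with
# $skip/$top-style query parameters instead of slicing a list):
data = list(range(1234))
stub = lambda offset, limit: data[offset:offset + limit]
```

Because the copy activity drives this loop itself and streams each page to the sink, no single request has to carry the whole day's data.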

Anonymous
Not applicable

Hi @slimbnsalah ,

We haven't heard from you since the last response and were just checking back to see if you got some insights.
Otherwise, please respond with more details and we will try to help.

Thanks

