vishesh_singhal
Regular Visitor

How to get logs from Dataflow Gen2 and Data Pipelines into Log Analytics

Hello, how do I configure Log Analytics so that all the details stored in the Monitor Hub for Dataflow Gen2 and Data Pipelines are sent to Log Analytics instead?

1 ACCEPTED SOLUTION
v-prasare
Community Support

Hi @vishesh_singhal,

 

To help you monitor Dataflow Gen2 and Data Pipelines more effectively in Microsoft Fabric, you can configure diagnostic settings that send logs from the Monitor Hub directly to an Azure Log Analytics workspace. By default, the Monitor Hub captures rich metadata such as pipeline runs, dataflow executions, and error statuses, but this data isn't exported to Log Analytics unless you configure it explicitly.

 

Prerequisites: First, make sure the prerequisites are met: you need to be a Fabric admin, have an Azure subscription with a Log Analytics workspace, and be using a paid Fabric capacity (diagnostic settings are only supported in non-trial environments).

Configuration: Once the prerequisites are confirmed, open the Azure Portal and locate your Microsoft Fabric capacity resource. If the resource isn't listed (some tenants may not expose the Fabric capacity directly yet), you can alternatively configure diagnostics at the Power BI workspace level using the Admin Portal.

Next, navigate to Azure Monitor > Diagnostic settings and select your Fabric or Power BI resource. Create a new diagnostic setting, select log categories such as DataflowActivity, PipelineRun, and PipelineActivityRun, and choose your Log Analytics workspace as the destination. After you save this configuration, logs related to Dataflow Gen2 and Data Pipelines will begin streaming into your Log Analytics workspace, where they can be queried and analyzed.
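For reference, here is a minimal sketch of what that diagnostic-setting step could look like if you script it with the azure-mgmt-monitor Python SDK rather than the portal. The Fabric capacity resource ID, the setting name, and the log category names are assumptions taken from the steps above (and, as a later reply points out, your capacity may not expose these categories at all), so list the categories your resource actually reports before relying on them.

```python
# Sketch only: creates a diagnostic setting that routes the categories named
# above to a Log Analytics workspace. All IDs and names below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient
from azure.mgmt.monitor.models import DiagnosticSettingsResource, LogSettings

subscription_id = "<subscription-id>"

# Assumed Fabric capacity resource ID; substitute your Power BI / Fabric resource
# if your tenant exposes diagnostics there instead.
resource_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.Fabric/capacities/<capacity-name>"
)

# ARM resource ID of the destination Log Analytics workspace.
log_analytics_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.OperationalInsights/workspaces/<workspace-name>"
)

client = MonitorManagementClient(DefaultAzureCredential(), subscription_id)

# The categories your resource really supports can be listed with:
#   client.diagnostic_settings_category.list(resource_id)
setting = DiagnosticSettingsResource(
    workspace_id=log_analytics_id,
    logs=[
        LogSettings(category="DataflowActivity", enabled=True),
        LogSettings(category="PipelineRun", enabled=True),
        LogSettings(category="PipelineActivityRun", enabled=True),
    ],
)

client.diagnostic_settings.create_or_update(
    resource_uri=resource_id,
    name="send-to-log-analytics",
    parameters=setting,
)
```

The same setting can of course be created once through the portal UI; scripting it is mainly useful if you want to apply it consistently across several capacities.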

 

 

 

 

Thanks,

Prashanth

MS Fabric community support


4 REPLIES
v-prasare
Community Support

@vishesh_singhal As we haven't heard back from you, we wanted to kindly follow up and check whether the solution provided for your issue worked. Please let us know if you need any further assistance here.

 

 

 

Thanks,

Prashanth Are

MS Fabric community support

 

If this post helps, please consider accepting it as the solution to help other members find it more quickly, and give Kudos if it helped you resolve your query.



Hi @v-prasare, this response is misleading. I don't think Fabric capacity provides an option to send diagnostic logs to a Log Analytics workspace directly from the Azure Portal.

Logs can be sent when a workspace is assigned a Log Analytics workspace, but there are limitations: it can't send logs for all artifact types. Right now I think it supports only Power BI and a few others (I don't have the full list), and pipelines are not part of it. Please double-check the reply and point us to documentation if anything like that is available.
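To see what actually lands in Log Analytics once a workspace-level integration like the one described above is in place, a quick query helps. Below is a minimal sketch using the azure-monitor-query Python package; the PowerBIDatasetsWorkspace table is where workspace-level Power BI logs are documented to arrive, but treat the table and column names as assumptions to verify against your own workspace.

```python
# Sketch only: counts recent operations by type so you can see which artifact
# activities are actually being logged. The workspace ID below is a placeholder.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

# The Log Analytics workspace (customer) ID, not its ARM resource ID.
workspace_id = "<log-analytics-workspace-guid>"

client = LogsQueryClient(DefaultAzureCredential())

# PowerBIDatasetsWorkspace is an assumed table name; adjust if your logs land elsewhere.
query = """
PowerBIDatasetsWorkspace
| summarize count() by OperationName
| order by count_ desc
"""

response = client.query_workspace(
    workspace_id=workspace_id,
    query=query,
    timespan=timedelta(days=1),
)

# Assumes a full (non-partial) result for brevity.
for table in response.tables:
    for row in table.rows:
        print(dict(zip(table.columns, row)))
```

If pipeline or Dataflow Gen2 runs don't show up in any table, that would confirm the limitation described above.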
