Hi,
Is there any default report generated for end-user usage of specific lakehouses in Fabric?
E.g., I want to be able to see which tables/views have been queried by end users for specific lakehouses in Fabric.
Hi, @Barre
As of now, Microsoft Fabric does not provide a built-in, default-generated report specifically for monitoring end user usage of specific lakehouses, such as tracking which tables or views have been queried by end users. However, you can achieve this level of monitoring and create custom reports by leveraging Azure Monitor, Log Analytics, and Power BI.
Here's how you can set this up:
Enable Diagnostic Settings on Your Lakehouse: Go to the Azure portal and navigate to your Fabric Lakehouse. Set up diagnostic settings to send logs to Azure Monitor or Log Analytics. This will capture activity logs, including query execution logs.
Configure Log Analytics: If you don't have a Log Analytics workspace, create one in the Azure portal. Link your Fabric Lakehouse to the Log Analytics workspace through the diagnostic settings. Ensure that you are capturing the relevant logs (e.g., Audit logs, Query logs).
Query Logs in Log Analytics: Go to the Log Analytics workspace and use Kusto Query Language (KQL) to query the logs. You can create queries to filter and analyze the logs to find information about which tables or views have been queried.
Example KQL query to get the count of queries per table:
AzureDiagnostics
| where ResourceType == "FABRIC_LAKEHOUSE"
| where OperationName == "Query"
| summarize Count = count() by TableName, bin(TimeGenerated, 1d)
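If you also need to see which end users ran those queries, the same logs can be broken down by user. The sketch below is a hedged variant of the query above; the user identity column (here Identity_s) is an assumption, and the exact column names depend on which diagnostic categories you enable, so verify them against your workspace schema first.
// Per-user, per-table query counts over the same diagnostic logs.
// Identity_s is an assumed column name for the calling user; adjust it to your schema.
AzureDiagnostics
| where ResourceType == "FABRIC_LAKEHOUSE"
| where OperationName == "Query"
| summarize QueryCount = count() by User = tostring(Identity_s), TableName, bin(TimeGenerated, 1d)
| order by QueryCount desc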
Best Regards,
hackcrr
If this post helps, then please consider accepting it as the solution and giving kudos to this post to help the other members find it more quickly.
Hi @Barre ,
I think @hackcrr and @SaiTejaTalasila answered correctly, and I would add that Fabric itself doesn't generate a default report for end-user usage of lakehouses that directly shows which tables/views were queried.
So to achieve visibility into end-user interactions with the lakehouse, you might consider leveraging the integrated analytics and monitoring tools supported by Fabric. You can use Power BI to connect to lakehouse data and create custom reports: Power BI can visualize data access patterns, for example by building reports over the query logs collected against the lakehouse (a hedged shaping query is sketched below). You can read this document: Lakehouse tutorial - Build a report - Microsoft Fabric | Microsoft Learn
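For example, a shaping query like the one below could serve as the source table for such a report. It is only a sketch over the AzureDiagnostics logs described in the earlier reply; the column names (TableName, Identity_s) are assumptions to verify against your own workspace.
// Hypothetical shaping query for a Power BI access-pattern report:
// daily query counts and distinct users per lakehouse table.
// Column names (TableName, Identity_s) are assumptions; adjust them to your schema.
AzureDiagnostics
| where ResourceType == "FABRIC_LAKEHOUSE" and OperationName == "Query"
| summarize Queries = count(), DistinctUsers = dcount(Identity_s) by TableName, Day = bin(TimeGenerated, 1d)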
Best Regards
Yilong Zhou
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.
Hi @Anonymous ,
Thank you for the response.
I will have a closer look into the documentation and consider which solution to go with, regarding setting this up - much appreciated!
Hi @hackcrr ,
Thank you for the response.
We already have this process set up, but our KQL database only monitors semantic model activity; I believe that is just a filter that can be adjusted to include other artifacts.
I will explore this in more depth - thank you for the good, detailed answer.
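For reference, if the events landing in that KQL database carry an item-type column, adjusting that filter might look like the sketch below. The table name ActivityEvents and the columns ItemKind, ItemName, OperationName, UserId, and Timestamp are all hypothetical names for illustration; substitute whatever your ingestion pipeline actually writes, and note that if only semantic model events are forwarded in the first place, widening the filter alone won't surface lakehouse activity.
// Hypothetical sketch: widen an item-type filter that previously kept only semantic models.
// ActivityEvents, ItemKind, ItemName, OperationName, UserId, and Timestamp are assumed names.
ActivityEvents
| where ItemKind in ("SemanticModel", "Lakehouse")   // previously: ItemKind == "SemanticModel"
| where OperationName has "Query"
| summarize QueryCount = count() by ItemKind, ItemName, UserId, bin(Timestamp, 1d)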
Hi,
You can use the Fabric Capacity Metrics app to get those details, or you can refer to this:
https://microsoftlearning.github.io/mslearn-fabric/Instructions/Labs/06c-monitor-data-warehouse.html
Hi @SaiTejaTalasila ,
Thank you for the response - I will have a look at the documentation.