Is there another way to extract user CU consumption WITHOUT using 30-second timepoints?
I have read all the examples, for instance from PBI Guy (using DAX against TimePointInteractiveDetail). But this is not really efficient and takes a lot of time.
Is it possible to extract CU consumption per user on a daily basis in another way?
Thanks in advance.
Hi @amien ,
Thanks for reaching out to the Microsoft Fabric community forum.
The blog post linked below provides a helpful guide on extracting data from the Fabric Capacity Metrics App using tools like DAX Studio and SQL Server Profiler. I hope you find it useful!
How to extract data from the Fabric Metrics App – Part 1 – PBI Guy
Please also go through the solved thread below, which may help you resolve the issue:
Solved: How to Download the Fabric Capacity Metrics Data - Microsoft Fabric Community
If I have misunderstood your needs or you still have problems, please feel free to let us know.
Best Regards,
Community Support Team
Thanks for your answer .. perhaps you didn't read my full question? I know the PBI Guy option .. but the 30-second timeframe needed to extract the data is just not feasible if you want to extract a full day of data. You need to fire something like 1800 DAX queries to get a full day, and then loop through all the capacities. Do you have an alternative that avoids the 30-second timeframe?
Hi @amien ,
Extracting a full day's worth of data using those 30-second intervals is definitely not practical, especially when you're dealing with multiple capacities and users. I completely understand how frustrating that can be.
Unfortunately, at this time, there isn’t a built-in or officially supported way to get daily-level user CU consumption without relying on the TimePointInteractiveDetail granularity. I know it’s not ideal.
It might be worth raising a support ticket with Microsoft. They may be able to advise if there are any internal tools, APIs in preview, or workarounds suited to your scenario.
Please refer to the link below on how to raise a support ticket.
How to create a Fabric and Power BI Support ticket - Power BI | Microsoft Learn
If I have misunderstood your needs or you still have problems, please feel free to let us know.
Best Regards,
Community Support Team
This approach uses Fabric's Real-Time Analytics capabilities to efficiently query and aggregate the raw log data.
The Concept
The Capacity Metrics app is simply a Power BI report connected to a dataset. That dataset is populated by logs that record every activity in your Fabric tenant. By using a KQL Queryset, you can directly query these source logs, create the exact daily summary you need, and then load that small, pre-aggregated result into a new Power BI dataset.
Step-by-Step Guide
Create a KQL Queryset:
In your Fabric workspace, select the Data Engineering or Real-Time Analytics persona.
Create a new KQL Queryset. This is a code editor where you will write your query.
Query the Fabric Logs:
A KQL Queryset can query across databases. In the explorer pane, you'll need to find the database containing the Fabric logs; it is typically available as a read-only source (a quick exploratory query is sketched after these steps).
Write a KQL query to aggregate the data. KQL is a language purpose-built for slicing and dicing log and time-series data with incredible speed.
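If you're not sure which database or table holds the logs, a quick exploratory query can confirm the names and columns before you build the aggregation. This is a minimal sketch, assuming the same hypothetical names ('FabricMetricsLog' and CapacityMetrics) used in the example below; adjust them to whatever actually appears in your explorer pane.
// Exploratory sketch only: the database and table names here are assumptions, not confirmed names.
database('FabricMetricsLog').CapacityMetrics
| getschema // list the available column names and their types

// Preview a handful of raw rows to check values such as OperationName and TotalCUSeconds.
database('FabricMetricsLog').CapacityMetrics
| take 10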
Example KQL Query
Here is a sample query you can adapt. This query finds all interactive CU usage for the past 7 days and summarizes it by user and day.
// This query directly accesses the Fabric log data for CU consumption.
// It is significantly more performant than using DAX on the metrics app dataset.
// Replace 'FabricMetricsLog' with the actual name of your tenant's log database if different.
database('FabricMetricsLog').CapacityMetrics
| where TimeGenerated >= ago(7d) // Filter to a specific time range for efficiency.
| where OperationName has "Interactive" // Focus only on interactive operations (e.g., Power BI report loads).
| where isnotempty(UserPrincipalName) // Ensure we only count operations attributed to a user.
| summarize
    // Sum the consumption, converting it to a double for accuracy.
    TotalCU_Seconds = sum(todouble(TotalCUSeconds)),
    // Count the distinct operations for context.
    OperationCount = count()
    // Group by the user's UPN and the day. The bin() function rounds the timestamp down to the start of the day.
    by UserPrincipalName, bin(TimeGenerated, 1d)
| order by UserPrincipalName asc, TimeGenerated desc
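Since you also need to cover every capacity, a hedged variant of the query above avoids looping over capacities entirely by adding a capacity column to the group-by. This is a sketch under the same assumptions as the example (the table name is not confirmed) plus an assumed CapacityId column; rename it to whatever your log schema actually exposes.
// Variant sketch: same assumed table as above, plus an assumed CapacityId column.
// A single query returns daily, per-user CU usage across all capacities in the logs.
database('FabricMetricsLog').CapacityMetrics
| where TimeGenerated >= ago(7d)
| where OperationName has "Interactive"
| where isnotempty(UserPrincipalName)
| summarize
    TotalCU_Seconds = sum(todouble(TotalCUSeconds)),
    OperationCount = count()
    by CapacityId, UserPrincipalName, bin(TimeGenerated, 1d)
| order by CapacityId asc, UserPrincipalName asc, TimeGenerated desc
The small, pre-aggregated result can then be loaded into a new Power BI dataset, as described in the concept above.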
If this explanation and solution resolve your issue, please like and accept the solution.
Thanks for your answer .. How do I add the Fabric database to the KQL Queryset?
This option doesn't require Workspace monitoring to be active, right?