Is there another way to extract user CU consumption WITHOUT using 30-second timepoints?
I have read all the examples, for example the one from PBI Guy (using DAX against TimePointInteractiveDetail). But this is not really efficient and takes a lot of time.
Is it possible to extract the user consumption on user level on a daily basis in another way?
Thanks in advance.
Hi .. no, not really .. they keep saying that querying TimePointInteractiveDetail is not supported, which I understand. But they also don't give me an alternative to get the data. They redirect me to the Chargeback report .. I know I can open the report and hand-pick a specific case to see the data .. but I want to store the whole dataset in a lakehouse.
Hi @amien ,
To assist you further with this issue, could you please share the support ticket number related to your request.
Thank you.
TrackingID#2507231420003248
Hi @amien ,
Thanks for reaching out to the Microsoft fabric community forum.
We have contacted the support team about your issue; they mentioned that it will be resolved soon.
Please continue using Fabric Community Forum.
Hi @amien ,
Thanks for reaching out to the Microsoft fabric community forum.
Please go through the solved thread below, which may help you resolve the issue:
Solved: How to Download the Fabric Capacity Metrics Data - Microsoft Fabric Community
If I misunderstand your needs or you still have problems on it, please feel free to let us know.
Best Regards,
Community Support Team
Thanks for your answer .. perhaps you didn't read my full question? I know the PBI Guy option .. but the 30-second timeframe needed to extract the data is just not feasible if you want to extract one day of data. You need to fire something like 1800 DAX queries to get a full day, and then you need to loop through all the capacities. Do you have an alternative to avoid the 30-second timeframe?
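To make the scale concrete, here is a minimal sketch (Python, illustrative only) of what that timepoint loop looks like when driven through the Power BI executeQueries REST API: one call per 30-second timepoint, repeated for every capacity. The dataset ID, the access token handling and the DAX body are placeholders, not the exact queries from the PBI Guy article.

# Minimal sketch of the timepoint-loop pattern described above (illustrative only).
# The dataset ID, token and DAX text are placeholders; the point is the volume:
# one executeQueries call per 30-second timepoint, per capacity, per day.
import datetime as dt
import requests

DATASET_ID = "<capacity-metrics-dataset-id>"   # placeholder
ACCESS_TOKEN = "<aad-access-token>"            # placeholder
URL = f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}/executeQueries"

day_start = dt.datetime(2025, 7, 23)
step = dt.timedelta(seconds=30)

timepoint = day_start
while timepoint < day_start + dt.timedelta(days=1):
    # Placeholder DAX: in the real pattern this is a query scoped to one
    # 30-second timepoint of TimePointInteractiveDetail.
    dax = f'EVALUATE ROW("Timepoint", "{timepoint.isoformat()}")'
    response = requests.post(
        URL,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={"queries": [{"query": dax}]},
        timeout=60,
    )
    response.raise_for_status()
    # ...store response.json(), then repeat the whole loop for every capacity...
    timepoint += step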
Hi @amien ,
Extracting a full day's worth of data using those 30-second intervals is definitely not practical, especially when you're dealing with multiple capacities and users. I completely understand how frustrating that can be.
Unfortunately, at this time, there isn’t a built-in or officially supported way to get daily-level user CU consumption without relying on the TimePointInteractiveDetail granularity. I know it’s not ideal.
It might be worth raising a support ticket with Microsoft. They may be able to advise if there are any internal tools, APIs in preview, or workarounds suited to your scenario.
Please refer to the link below on how to raise a support ticket.
How to create a Fabric and Power BI Support ticket - Power BI | Microsoft Learn
If I misunderstand your needs or you still have problems on it, please feel free to let us know.
Best Regards,
Community Support Team
I will connect with support
This is also not a solution, right? Same problem: https://learn.microsoft.com/en-us/rest/api/power-bi/datasets/execute-queries
Hi @amien ,
Thank you for the update. Once you’ve had a chance to connect with the support team, feel free to share any updates here.
Hi @amien ,
We’d like to follow up on your support ticket. Has the issue been successfully addressed?
If a solution was provided, we’d truly appreciate it if you could share your experience or any relevant details, your input could be valuable to other community members.
Thanks again for being an active part of the Microsoft Fabric Community!
This approach uses Fabric's Real-Time Analytics capabilities to efficiently query and aggregate the raw log data.
The Concept
The Capacity Metrics app is simply a Power BI report connected to a dataset. That dataset is populated by logs that record every activity in your Fabric tenant. By using a KQL Queryset, you can directly query these source logs, create the exact daily summary you need, and then load that small, pre-aggregated result into a new Power BI dataset.
Step-by-Step Guide
Create a KQL Queryset:
In your Fabric workspace, select the Data Engineering or Real-Time Analytics persona.
Create a new KQL Queryset. This is a code editor where you will write your query.
Query the Fabric Logs:
A KQL Queryset can query across databases. In the explorer pane, you'll need to find the database containing the Fabric logs. This is typically available as a read-only source.
Write a KQL query to aggregate the data. KQL is a language purpose-built for slicing and dicing log and time-series data with incredible speed.
Example KQL Query
Here is a sample query you can adapt. This query finds all interactive CU usage for the past 7 days and summarizes it by user and day.
// This query directly accesses the Fabric log data for CU consumption.
// It is significantly more performant than using DAX on the metrics app dataset.
// Replace 'FabricMetricsLog' with the actual name of your tenant's log database if different.
database('FabricMetricsLog').CapacityMetrics
| where TimeGenerated >= ago(7d) // Filter to a specific time range for efficiency.
| where OperationName has "Interactive" // Focus only on interactive operations (e.g., Power BI report loads).
| where isnotempty(UserPrincipalName) // Ensure we only count operations attributed to a user.
| summarize
// Sum the consumption, converting it to a double for accuracy.
TotalCU_Seconds = sum(todouble(TotalCUSeconds)),
// Count the distinct operations for context.
OperationCount = count()
// Group by the user's UPN and the day. The bin() function rounds the timestamp down to the start of the day.
by UserPrincipalName, bin(TimeGenerated, 1d)
| order by UserPrincipalName asc, TimeGenerated desc
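If the goal is to land this daily summary in a lakehouse (as mentioned earlier in the thread), one option is to run the same KQL from a Fabric notebook and write the small aggregated result to a Delta table. The sketch below is illustrative only: it assumes the azure-kusto-data Python package is available, and the query URI, authentication method, database name ('FabricMetricsLog') and table name ('CapacityMetrics') all follow the assumptions in the query above and must be adjusted to your environment.

# A minimal sketch, assuming azure-kusto-data is installed in the notebook
# environment and that the database/table names match the query above.
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder
from azure.kusto.data.helpers import dataframe_from_result_table

CLUSTER_URI = "https://<your-kql-database-query-uri>"   # placeholder
DATABASE = "FabricMetricsLog"                           # assumed name, as above

# Device-code authentication is used here for simplicity; swap in whatever
# authentication your environment requires (service principal, token, etc.).
kcsb = KustoConnectionStringBuilder.with_aad_device_authentication(CLUSTER_URI)
client = KustoClient(kcsb)

kql = """
CapacityMetrics
| where TimeGenerated >= ago(1d)
| where OperationName has "Interactive"
| where isnotempty(UserPrincipalName)
| summarize TotalCU_Seconds = sum(todouble(TotalCUSeconds)), OperationCount = count()
    by UserPrincipalName, bin(TimeGenerated, 1d)
"""

result = client.execute(DATABASE, kql)
pdf = dataframe_from_result_table(result.primary_results[0])

# In a Fabric notebook `spark` is predefined; append the small aggregate to a
# lakehouse Delta table so a scheduled run can build up daily history.
spark.createDataFrame(pdf).write.mode("append").saveAsTable("daily_user_cu_summary")

Because the aggregation happens in KQL, only a handful of rows per user per day leave the database, which is what makes a daily export to the lakehouse practical.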
If this explanation and solution resolve your issue, please like and accept the solution.
Thanks for your answer .. How do I add the Fabric database to the KQL queryset?
This option doesn't require Workspace monitoring to be active, right?