Hello.
I am using an FT1 trial capacity in Fabric. In a workspace I built the following pipeline: Spark notebook -> Lakehouse -> Power BI report (Direct Lake connection).
I created two large fact tables (each with about 60M rows and 10-11 columns) and 7-8 dimension tables in the Lakehouse, and the report connects to these tables via Direct Lake.
I also track CU usage via Metrics App.
On 2025-05-30, my teammate and I opened the report and applied filters to some pages (for CU usage and report performance testing).
We applied filters on each page and then closed the report. CU usage was much higher than we expected; I am sharing it below.
Is this usage normal? The Query operation alone used about 200k CU(s) during the test.
Hi @tolgakurt ,
Thank you for reaching out to Microsoft Community.
Even though Direct Lake is designed to minimize compute by reading data directly from OneLake, several factors can still drive significant CU usage.
The first time a report is accessed, Power BI builds caches, loads metadata, and runs queries for every visual, including hidden or non-visible ones (tooltips, drill-through pages, etc.).
If visuals use non-aggregated or complex DAX, more data is scanned; and if multiple pages are filtered and their visuals refresh, a single session can generate the equivalent of 50-100+ queries, depending on page complexity.
Also check that no unexpected refreshes or background operations were running.
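To see how quickly those queries add up, here is a back-of-envelope sketch. The function name, page/visual counts, and the per-query CU rate are all illustrative assumptions, not official Fabric numbers; calibrate them against your own Metrics App data.

```python
# Back-of-envelope estimate of interactive CU usage for a Direct Lake report.
# All rates here are illustrative assumptions, not official Fabric figures.

def estimate_session_cu(pages, visuals_per_page, filter_changes_per_page,
                        cu_per_query):
    # First open runs every visual's query once; each filter change
    # re-runs the visuals on the filtered page.
    initial_queries = pages * visuals_per_page
    refresh_queries = pages * filter_changes_per_page * visuals_per_page
    total_queries = initial_queries + refresh_queries
    return total_queries, total_queries * cu_per_query

# Example: 5 pages, 8 visuals each, 3 filter changes per page, and an
# assumed 600 CU(s) per query against a 60M-row fact table.
queries, cus = estimate_session_cu(5, 8, 3, 600)
print(queries, cus)  # 160 queries -> 96000 CU(s)
```

With two users running a session like this, the total lands in the same order of magnitude as the ~200k CU(s) observed, which is why interactive testing on large models can look surprisingly expensive.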
As long as you stay within your assigned capacity tier, there’s no extra cost beyond your base SKU rate. Monitoring daily average usage is still a good practice for long-term planning and sizing, especially when moving beyond the trial.
If your usage seems excessive compared to the expected workload, you might want to analyze query execution times and optimize your data model. You can find more details on the Fabric trial capacity here: Fabric trial capacity - Microsoft Fabric | Microsoft Learn
Understand the metrics app compute page - Microsoft Fabric | Microsoft Learn
If our response addressed your question, please mark it as the accepted solution and click Yes if you found it helpful.
Regards,
Hi
Glad that your query got resolved. Please continue using Fabric Community for any help regarding your queries.
Direct Lake uses Power BI vCPUs, which consume 8 CUs per vCPU-second.
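Taking that rate at face value, the conversion to CU(s) is simple arithmetic. This is a sketch; the function name and the 4-vCPU/30-second scenario are illustrative assumptions.

```python
# Convert Power BI vCPU usage to Fabric capacity units (CU),
# using the 8 CU per vCPU-second rate stated above.

def vcpu_seconds_to_cu(vcpus, seconds, cu_per_vcpu_second=8):
    # Total CU(s) = vCPUs engaged * duration in seconds * rate.
    return vcpus * seconds * cu_per_vcpu_second

# Example: a query that keeps 4 vCPUs busy for 30 seconds.
print(vcpu_seconds_to_cu(4, 30))  # 960 CU(s)
```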
Hi @tolgakurt ,
As we haven’t heard back from you, we wanted to kindly follow up and check whether the solution provided resolved the issue. Let us know if you need any further assistance.
If our response addressed your question, please mark it as the accepted solution and click Yes if you found it helpful.
Regards,
Hi @tolgakurt ,
May I ask if you have gotten this issue resolved?
If it is solved, please mark the helpful reply (or share your own solution and accept it as the solution); this will help other community members with similar problems find the answer faster.
Regards,
Chaithra.
Hi @tolgakurt ,
As we haven’t heard back from you, we are just following up on our previous message. I'd like to confirm whether you've resolved this issue or still need further help.
If it is resolved, you are welcome to share your workaround and mark it as the solution so that other users can benefit as well. If you found a particular reply helpful, you can also mark that as the solution.
If you still have any questions or need more support, please feel free to let us know. We are more than happy to continue helping you.
Thank you for your patience; we look forward to hearing from you.
Best Regards,
Chaithra E.
Hi @tolgakurt ,
To make TargetDayRevenueSCG dynamic based on the SalesChannelGroup, you can use a SWITCH statement that evaluates the current context. Here's a clean way to do it:
TargetDayRevenueSCG =
VAR DaysInMonth = MAX(DIM_DATE[NumberOfDaysInTheMonth])
RETURN
    SWITCH(
        SELECTEDVALUE(SalesChannelGroup),
        "Leisure", SUM(TTVBudget[LeisureRevenue]) * DaysInMonth,
        "MICE/Intelligence", SUM(TTVBudget[MICEResidential]) * DaysInMonth,
        "Travel/Intelligence", SUM(TTVBudget[TransientRevenue]) * DaysInMonth,
        BLANK()
    )

Make sure the string values in the SWITCH match the actual SalesChannelGroup values exactly (including spelling and casing); otherwise the measure falls through to BLANK().
If you're using SUMX for row-level logic, let me know and I can help adjust the measure accordingly.
If my response resolved your query, kindly mark it as the Accepted Solution to assist others. Additionally, I would be grateful for a 'Kudos' if you found my response helpful.