Hello everyone,
I have a table in the lakehouse with 1.5 billion rows at minute-level granularity. I can't create a report due to the Direct Lake limitation. What do you think would be the best way to approach this from the desktop?
Any reference links I could read on the subject?
Thanks
Hi @rgsalido
Direct Lake does support very large tables, but it still has hard guardrails that depend on the Fabric capacity SKU. Even on high SKUs (F64/F128), a fact table with 1.5B minute-level rows is at the edge of the supported limit and often won’t load in Direct Lake. When the engine detects too many rowgroups/files or an oversized table, it will either fall back to DirectQuery or simply refuse to load if fallback is disabled.
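If it helps, here is a quick way to check how close the raw table is to those guardrails from a Fabric notebook. This is only a sketch; the table name events_minute is a placeholder for your lakehouse table.

```python
# Sketch: inspect the raw Delta table's footprint (file count, size, rows),
# which is what the Direct Lake guardrails are evaluated against.
# "events_minute" is a placeholder; the spark session is pre-defined in a Fabric notebook.
detail = spark.sql("DESCRIBE DETAIL events_minute").collect()[0]

print("Parquet files :", detail["numFiles"])
print("Size (GB)     :", round(detail["sizeInBytes"] / 1024**3, 2))
print("Row count     :", spark.read.table("events_minute").count())
```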
For this type of workload (telemetry / high-frequency events), the recommended pattern is:
Create an aggregated table (hourly/daily) and use that table in your Direct Lake semantic model (see the sketch below).
Keep the raw 1.5B-row minute-level table in the lakehouse and access it via DirectQuery only for detailed pages or drill-through.
Optionally keep dimensions in Import for better performance (composite model).
This pattern aligns with Microsoft’s guidance for Direct Lake: use it for analytical models and aggregated facts, not for raw high-granularity event streams.
So even with Fabric capacity, the best approach is:
Direct Lake for the aggregated table + DirectQuery for the raw detail table.
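As a rough illustration of the aggregation step, here is a minimal PySpark sketch you could run in a Fabric notebook. The table and column names (events_minute, event_time, value) are placeholders, and the grouping keys and measures will depend on your data.

```python
from pyspark.sql import functions as F

# Read the raw minute-level fact table (placeholder name).
raw = spark.read.table("events_minute")

# Roll up to hourly granularity; adjust the grouping keys and measures
# to whatever your report actually needs.
hourly = (
    raw.withColumn("event_hour", F.date_trunc("hour", F.col("event_time")))
       .groupBy("event_hour")
       .agg(
           F.count("*").alias("event_count"),
           F.sum("value").alias("value_sum"),
       )
)

# Persist as a Delta table; point the Direct Lake semantic model at this
# table and keep the raw table for DirectQuery drill-through only.
hourly.write.mode("overwrite").format("delta").saveAsTable("events_hourly")
```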
If this post helps, please consider accepting it as the solution to help other members find it more quickly.
We are using an F64 capacity. I am going to test in DQ mode and create aggregated tables.
Hi @rgsalido,
We would like to confirm whether our community members' answers resolved your query or if you need further help. If you still have any questions or need more support, please feel free to let us know; we are happy to help.
Thank you for your patience. We look forward to hearing from you.
Best Regards,
Prashanth Are
MS Fabric community support
@Ritaf1983 & @amitchandak Thanks for your prompt responses.
@rgsalido, what capacity are you using? From F64 onward, you should not have an issue.
You can use DirectQuery mode. Import mode will also work well from F64 onward in this case.