
Reply
gp10
Advocate III

Lakehouse Table Rows Limit

Hi community,
I wanted to ask opinions on how to handle the table row limit of a Fabric Lakehouse.

For a P1 capacity the table row limit is 1.5 bn rows. That is pretty low for a feature that claims to work with big data.

Are there any recommendations for the scenario where our data exceeds this limit?

  • If we have a multi-billion-row table, how do we deal with it?
  • What if it is our fact table?
  • If we break it into smaller tables, will we still be able to create DAX measures for metrics that need data from all of these tables, like a total sum or count?
  • How will the performance be?

 Would love to hear your thoughts.
Thanks.


4 REPLIES
AndyDDC
Most Valuable Professional

Hi @gp10, the 1.5B-row limit comes from the Direct Lake functionality - is this what you are referring to? The Lakehouse table itself can hold more rows; it's just that Direct Lake will not work with tables over 1.5B rows and will fall back to DirectQuery.

Thanks @AndyDDC, yes, this is what I'm referring to.
In this case, what options do I have if I want to use Direct Lake with tables of more than 1.5B rows?
Is there a hard limit on the Lakehouse table rows?

Cheers.

AndyDDC (ACCEPTED SOLUTION)
Most Valuable Professional

OK, thanks for confirming. At the moment, yes, that is a limit: the Direct Lake feature attempts to page all the rows from the lakehouse table into the VertiPaq engine cache. I don't know whether the limit is exactly 1.5B rows, as there could be a little variance. However, if your fact table is well above that, then yes, it may need splitting if you want Direct Lake.

 

You should be able to use a measure to sum/count across multiple tables, and it should still use Direct Lake.
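As a sketch, a measure spanning year-partitioned fact tables could simply add the per-table aggregates (the table and column names below are hypothetical, assuming the big fact table was split by year):

```dax
Total Amount =
    SUM ( 'Fact_Sales_2022'[Amount] )
    + SUM ( 'Fact_Sales_2023'[Amount] )
    + SUM ( 'Fact_Sales_2024'[Amount] )

Total Row Count =
    COUNTROWS ( 'Fact_Sales_2022' )
    + COUNTROWS ( 'Fact_Sales_2023' )
    + COUNTROWS ( 'Fact_Sales_2024' )
```

Note that for these measures to filter correctly by date, product, etc., each split fact table would need its own relationships to the same shared dimension tables.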

 

(screenshot attachment: AndyDDC_0-1706101777767.png)

 

In your testing, you can use Profiler to see whether a query is actually using Direct Lake or falling back to DirectQuery.

Learn how to analyze query processing for Direct Lake datasets - Power BI | Microsoft Learn

Thanks for your response @AndyDDC.

I will accept it as the solution; so far there are no alternatives (other than upgrading the capacity) for using Direct Lake with such a table.

 

In your opinion, how would such measures perform?
And performance-wise, when splitting the table, does it make sense to create a few big tables that approach the 1.5bn-row limit, or many smaller ones?

Thanks for your time.
