AJAJ
Helper III

Query to find the size of each Fabric Warehouse table

Hi,

 

We are currently on Azure SQL. At the end of the ETL process, the sizes of all SQL tables are recorded into a logging table using a query.

 

After moving to a Fabric Warehouse or Lakehouse, how can we capture the sizes of all warehouse tables (or, for a Lakehouse, the file sizes) using a query?

 

Appreciate your thoughts.

1 ACCEPTED SOLUTION
v-veshwara-msft
Community Support

Hi @AJAJ ,

Thanks for reaching out to Microsoft Fabric Community.

In Microsoft Fabric, the approach depends on whether the data is in a Warehouse or a Lakehouse.

For Fabric Warehouse, you can run the query below.

SELECT
    SCHEMA_NAME(t.schema_id) AS schema_name,
    t.name AS table_name,
    -- count rows only from the heap or clustered index (index_id 0 or 1)
    -- to avoid double-counting rows stored in nonclustered indexes
    SUM(CASE WHEN p.index_id IN (0, 1) THEN p.rows ELSE 0 END) AS row_count,
    CAST(SUM(a.total_pages) * 8 AS DECIMAL(18,2)) AS size_in_KB,
    CAST(SUM(a.total_pages) * 8.0 / 1024 AS DECIMAL(18,2)) AS size_in_MB,
    CAST(SUM(a.total_pages) * 8.0 / 1024 / 1024 AS DECIMAL(18,2)) AS size_in_GB
FROM sys.tables t
JOIN sys.partitions p
    ON t.object_id = p.object_id
JOIN sys.allocation_units a
    ON p.partition_id = a.container_id
-- remove the WHERE clause to report every table in the warehouse
WHERE t.name = 'SalesSample'
  AND SCHEMA_NAME(t.schema_id) = 'dbo'
GROUP BY SCHEMA_NAME(t.schema_id), t.name;

Replace the table and schema names with your own, or drop the WHERE clause to return every table at once.

For Fabric Lakehouse, table sizes are stored as Delta metadata and can be retrieved using Spark SQL. From a notebook attached to the Lakehouse, run the following in a SQL cell:

DESCRIBE DETAIL SalesSample;

This returns sizeInBytes, numFiles, and other table metadata (DESCRIBE DETAIL does not return a row count; use SELECT COUNT(*) for that). The size can be converted to KB, MB, or GB and persisted as part of ETL logging if required.
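If the size should land in an ETL log table, the sizeInBytes value from DESCRIBE DETAIL can be reshaped into a log row with a small helper. A minimal sketch, assuming the byte value has already been read from the DESCRIBE DETAIL result in the notebook (the function name size_log_record and the column names are illustrative, not part of any Fabric API):

```python
from datetime import datetime, timezone

def size_log_record(table_name: str, size_in_bytes: int) -> dict:
    """Build an ETL-log row from a Delta table's sizeInBytes
    (as returned by DESCRIBE DETAIL)."""
    return {
        "table_name": table_name,
        "size_bytes": size_in_bytes,
        "size_kb": round(size_in_bytes / 1024, 2),
        "size_mb": round(size_in_bytes / 1024**2, 2),
        "size_gb": round(size_in_bytes / 1024**3, 2),
        "captured_at": datetime.now(timezone.utc).isoformat(),
    }

# In a notebook you would typically loop over the catalog, e.g. (sketch):
#   for t in spark.catalog.listTables():
#       detail = spark.sql(f"DESCRIBE DETAIL {t.name}").first()
#       row = size_log_record(t.name, detail["sizeInBytes"])
record = size_log_record("SalesSample", 536870912)  # 512 MB
print(record["size_mb"])  # → 512.0
```

The rows collected this way can then be appended to a Delta logging table at the end of the ETL run, mirroring the Azure SQL pattern described above.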

 

For Lakehouse files (for example, files under the Files area), size information can be viewed directly in the Lakehouse UI, which shows file and folder sizes. This is currently the supported way to check file-level sizes without using Spark.
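If a notebook is acceptable, the same information can be collected in plain Python, because a Lakehouse attached to a notebook is exposed on the local file system (typically under /lakehouse/default/Files; that mount path is an assumption and may differ in your setup). A minimal sketch that walks a folder tree and records per-file sizes:

```python
import os

def file_sizes(root: str) -> list[tuple[str, int]]:
    """Walk a folder tree and return (relative_path, size_in_bytes) per file."""
    sizes = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            sizes.append((os.path.relpath(full, root), os.path.getsize(full)))
    return sizes

# In a Fabric notebook you would point this at the mounted Files area,
# e.g. file_sizes("/lakehouse/default/Files")  -- the path is an assumption.
```

The resulting list can be written to a Delta table or DataFrame for logging, which also answers the follow-up question about capturing ALL Lakehouse file sizes in one pass.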

 

Similar threads and useful resources: 

Solved: Re: Size table in workspace - Microsoft Fabric Community

How to check the size of data in lakehouse - Microsoft Fabric Community

Getting the size of OneLake data items or folders | Microsoft Fabric Blog | Microsoft Fabric

 

Hope this helps. Please reach out for further assistance.

Thank you.

 


4 REPLIES
v-veshwara-msft
Community Support

Hi @AJAJ ,

Just checking in to see if your query is resolved and if any of the responses were helpful.
Otherwise, feel free to reach out for further assistance.

Thank you.


Hi @AJAJ ,
Just wanted to check if the response provided was helpful. If further assistance is needed, please reach out.
Thank you.

AJAJ
Helper III

Hi,

 

We are currently on Azure SQL. At the end of the ETL process, the sizes of all SQL tables are recorded into a logging table using a query.

 

After moving to a Fabric Lakehouse, how can we capture the sizes of ALL Lakehouse files, using a SQL query (preferred) or, failing that, a Python script in a notebook?

 

Appreciate your thoughts.
