Manoshi
Frequent Visitor

Unexpectedly High Storage Usage in Fabric Workspace

Hi everyone,

One of my Fabric workspaces is showing a current storage of 838 GB in the Fabric Capacity Metrics app, which seems unusually high. I only have one warehouse and a few semantic models.

 

  • The semantic models together occupy about 43 MB (checked via Manage group storage).
  • The warehouse contains around 160 tables, most with fewer than 100K rows; only a few have 1–5 million rows.

Could someone please help me understand what might be contributing to this high storage, and how I can view a detailed breakdown of the storage usage?

 

Thanks in advance for your support!


rohit1991
Super User

Hi @Manoshi 

Possible Cause: 

 

The high storage is likely due to hidden system or intermediate data in your Fabric warehouse - not just table row counts. Fabric keeps:

  • Delta Lake data files and versions (every update or merge keeps prior snapshots).

  • Auto-created staging and temp files from data movement or ETL runs.

  • Failed or old pipeline runs that didn’t clean up their storage.

  • Index or caching layers used by the warehouse engine.

What you can do :

 

1. Check OneLake data hub > Manage storage to see which lakehouse/warehouse folders are largest.

In the warehouse, run:

SHOW TABLE STORAGE;

to view table-level size usage.

 

2. Clean up unused tables, temp datasets, and old snapshots.

 

3. If sizes look incorrect, open a Microsoft support ticket - some users report inflated Fabric capacity metrics.


Did it work? ✔ Give a Kudo • Mark as Solution – help others too!

Hi @rohit1991 ,

Thank you so much for your detailed explanation, it was really helpful! 

However, I noticed a few differences in the current Fabric interface:

  • The OneLake Data Hub seems to have been replaced by the OneLake Catalog, and I couldn’t find a 'Manage storage' option there.

  • Also, the SHOW TABLE STORAGE; syntax doesn’t seem to work directly in the warehouse (it’s not recognized in the current SQL endpoint).

Could you please confirm whether there's an updated way to check storage usage per Fabric item in the latest Fabric experience?

Thanks again for your guidance!

Hi @Manoshi , Thank you for reaching out to the Microsoft Community Forum.

 

Your Fabric workspace showing 838 GB is almost certainly due to hidden or historical data sitting in OneLake, which underpins all Fabric items. Even if your warehouse tables and semantic models seem small, OneLake stores not just current table data but also Delta snapshots, temp or staging files from pipelines and cached data. Over time, this can easily grow into hundreds of gigabytes if not maintained.

 

Since the old OneLake Data Hub and Manage storage options are gone, the right way now is to start with the Fabric Capacity Metrics app. It's the official, up-to-date tool that shows storage by workspace and by Fabric item. Once you confirm which workspace is consuming the most capacity, move to the OneLake catalog to locate the associated Lakehouse or Warehouse folders. The catalog doesn't expose size metrics directly, so use a Fabric notebook to list file sizes under those OneLake paths; that's where you'll see which folders or tables hold large Delta data.

 

If you find large Delta folders, the fix is table maintenance: compact and optimize them, then vacuum old snapshots you no longer need. Fabric documents this under Lakehouse table maintenance and Delta Lake optimization and V-Order. These operations safely reduce small-file clutter and remove obsolete data versions, often cutting storage drastically without affecting live data.
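A small helper like the following can generate the maintenance statements for a list of tables; the table names are hypothetical, and in a Fabric Spark notebook you would pass each statement to `spark.sql(...)`:

```python
def maintenance_statements(tables, retain_hours=168):
    """Build Delta OPTIMIZE and VACUUM statements for the given table names.

    168 hours (7 days) is Delta Lake's default retention threshold; going
    lower requires explicitly disabling the retention duration check.
    """
    stmts = []
    for table in tables:
        stmts.append(f"OPTIMIZE {table}")                            # compact small files
        stmts.append(f"VACUUM {table} RETAIN {retain_hours} HOURS")  # drop old snapshots
    return stmts

# Hypothetical table names for illustration; run each statement with
# spark.sql(stmt) in a Fabric Spark notebook.
for stmt in maintenance_statements(["dbo.fact_sales", "dbo.dim_customer"]):
    print(stmt)
```

Keeping the default 7-day retention is the safe choice: it preserves recent time travel while still letting VACUUM clear older snapshots.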

 

Finally, review Capacity Settings in the Fabric admin portal to confirm retention and caching behavior and compare your OneLake folder sizes with what the Capacity Metrics App reports.

 

Install the Microsoft Fabric capacity metrics app - Microsoft Fabric | Microsoft Learn

OneLake catalog overview - Microsoft Fabric | Microsoft Learn

Delta table maintenance in Microsoft Fabric - Microsoft Fabric | Microsoft Learn

Delta Lake table optimization and V-Order - Microsoft Fabric | Microsoft Learn

Delta Table optimize/vacuum - Microsoft Q&A

Manage your Fabric capacity - Microsoft Fabric | Microsoft Learn

OneLake in Microsoft Fabric documentation - Microsoft Fabric | Microsoft Learn

Hi @v-hashadapu ,

Thank you so much for your detailed and informative explanation. This was really helpful. The guidance on OneLake storage behavior and Delta optimization is very useful, and I'll definitely apply those best practices for my Lakehouses going forward.

 

However, in my current case, the workspace only contains a Warehouse (no Lakehouse), and that’s what shows around 838 GB in storage. Since the Fabric Capacity Metrics App currently shows storage only at the workspace level, could you please advise on how I can troubleshoot or drill down to see storage usage?

 

Thanks again for the clear and thorough explanation

Hi @Manoshi , Thank you for reaching out to the Microsoft Community Forum.

 

Fabric currently doesn’t expose warehouse-level storage breakdowns directly in the UI or metrics app. Since your workspace only contains a Warehouse, the 838 GB usage must come from the underlying OneLake folder that backs it. Each warehouse in Fabric is backed by a Delta Lake folder inside OneLake, which stores not just data files but also historical snapshots, transaction logs, and temporary/staging data created during refreshes or DML operations.

 

You can open a Fabric notebook (Spark or Python) and navigate to the OneLake path for your warehouse. Use filesystem commands like mssparkutils.fs.ls() or Python's os.walk() to list subfolder sizes under /Tables/ or each table's _delta_log/. This gives you a per-table view of which ones consume the most space. Large _delta_log directories or many retained versions often point to accumulated table history; running VACUUM (to remove old snapshots) and OPTIMIZE (to compact small files) can reclaim much of this space.
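As a rough gauge of how much history a table has accumulated, you can count the commit files in its `_delta_log` folder (Delta names each commit with a zero-padded 20-digit version number). This is a sketch; the path you pass in depends on how your notebook mounts the warehouse's OneLake folder:

```python
import os
import re

# Delta commit files are named NNNNNNNNNNNNNNNNNNNN.json (20-digit version).
_COMMIT = re.compile(r"^\d{20}\.json$")

def delta_version_count(delta_log_dir):
    """Count commit files in a table's _delta_log folder.

    A large count suggests a long retained history (many snapshots)
    that VACUUM has not yet cleaned up.
    """
    return sum(1 for name in os.listdir(delta_log_dir) if _COMMIT.match(name))

# Example (hypothetical path): delta_version_count(".../Tables/fact_sales/_delta_log")
```

Tables with hundreds or thousands of commits are the first candidates for OPTIMIZE followed by VACUUM.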

 

If these steps still don’t resolve the issue, the next best course is to raise a Microsoft support ticket, as it may need to be investigated on Microsoft's side.

 

Here is the link for creating a Microsoft support ticket:
How to create a Fabric and Power BI Support ticket - Power BI | Microsoft Learn

Hi @v-hashadapu ,

Thank you so much for your continuous guidance. This is really helpful and valuable information! I'll try out these steps to check the OneLake folder and optimize the storage. If the issue still remains, I'll go ahead and raise a support ticket as you suggested.

 

Appreciate your help!

Hi @Manoshi , Thanks for the update. Please keep us in the loop and share any insights you learn while solving the issue or after raising the support ticket; they will help others in the community with similar issues.
