stuckerj
Frequent Visitor

API / programmatically gather storage/capacity/CU metrics for warehouse tables in Fabric (Notebook)

I'm looking for the most straightforward way to programmatically gather storage, capacity, and Capacity Unit (CU) metrics for both data warehouse tables and delta tables in Microsoft Fabric—ideally from within a Notebook.

For lakehouse delta tables, storage measurement is more straightforward and can be accomplished by inspecting file sizes and metadata.
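
For example, here is a minimal sketch I'd run in a Fabric notebook; DESCRIBE DETAIL is standard Delta Lake SQL, and my_table is a placeholder for a table in the attached lakehouse:

    # Minimal sketch: snapshot a lakehouse delta table's on-disk size.
    # "my_table" is a placeholder; `spark` is the session Fabric notebooks
    # provide by default.
    detail = spark.sql("DESCRIBE DETAIL my_table").collect()[0]
    print(f"numFiles={detail['numFiles']}, sizeInBytes={detail['sizeInBytes']}")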

However, the biggest challenge is measuring storage/capacity/CU usage for warehouse tables. I would like to take a snapshot of warehouse storage before and after an operation (such as a large insert, refresh, or update), but there doesn't seem to be a clean or supported way to do this programmatically.

  • System views like sys.tables and sys.partitions do not reliably report actual storage usage or row counts in Fabric Warehouse (see the query sketch after this list).
  • The REST API does not provide warehouse-level or table-level storage or CU metrics.
  • The Admin Portal and the Capacity Metrics app only offer daily, aggregate metrics at the capacity or workspace level, not per-table or per-operation detail.
  • The OneLake Consumption UI provides workspace/item breakdowns, but it isn't accessible via API and does not refresh in real time.
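
For reference, this is the kind of catalog query I've been running from a notebook against the warehouse's SQL endpoint. It's a hedged sketch: the server and database names are placeholders, it assumes pyodbc and ODBC Driver 18 are available, and your authentication mode may differ.

    # Hedged sketch: the sort of catalog query that returns unreliable
    # numbers in Fabric Warehouse. Server/database names are placeholders.
    import pyodbc

    conn = pyodbc.connect(
        "Driver={ODBC Driver 18 for SQL Server};"
        "Server=<sql-endpoint>.datawarehouse.fabric.microsoft.com;"
        "Database=<warehouse-name>;"
        "Authentication=ActiveDirectoryInteractive;"
    )
    rows = conn.execute("""
        SELECT t.name, SUM(p.rows) AS row_estimate
        FROM sys.tables t
        JOIN sys.partitions p ON p.object_id = t.object_id
        WHERE p.index_id IN (0, 1)
        GROUP BY t.name
    """).fetchall()
    for name, estimate in rows:
        print(name, estimate)  # these often come back 0/NULL for me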

Questions:

  • Is there any supported or recommended way to programmatically measure storage/capacity/CU usage for warehouse tables in Fabric?
  • Has anyone found a reliable workaround to capture warehouse usage snapshots before and after operations, especially within a Notebook context?
  • Are there plans for more granular, real-time metrics or programmatic access via REST API or other automation tools?

Any guidance, best practices, or workarounds would be greatly appreciated!

Thanks, 
Jeff Stucker

5 REPLIES
v-lgarikapat
Community Support

Hi @stuckerj,

Thanks for reaching out to the Microsoft Fabric Community Forum.

@tayloramy, thanks for your prompt response.

@stuckerj, I wanted to follow up and confirm whether you've had the opportunity to review the information provided by @tayloramy. If you have any questions or need further clarification, please don't hesitate to reach out.

Looking forward to your response.

Best regards,
Lakshmi.

tayloramy
Community Champion

Hi @stuckerj,

 

As far as I know, there's no supported per-table/per-operation CU or storage meter you can call for warehouses (or any other Fabric item, for that matter). You can get close with a few pragmatic pieces you can script from a Notebook, but none of them tell you "table X consumed Y CU during this INSERT".

 

You can get some level of CU metrics from the Capacity Metrics app's semantic model, though last I checked (which, to be fair, was a while ago) I wasn't able to get timepoint-level details for specific items, only the overall usage at a given timepoint. That's because the Capacity Metrics app uses a live connection to a shared Kusto instance to pull the data, and that connection uses a custom connector that isn't exposed to us. I played around with the XMLA endpoint quite a bit, with little success, trying to get the tables that rely on that connector to show me data.
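
If you want to poke at that semantic model from a notebook yourself, semantic-link (sempy) can run DAX against it. This is only a hedged sketch: the dataset and table names are placeholders, and as I said, the connector-backed tables may not return item-level data.

    # Hedged sketch: query the Capacity Metrics app's semantic model with
    # semantic-link. Dataset/table names are placeholders; connector-backed
    # tables may not return item-level detail this way.
    import sempy.fabric as fabric

    df = fabric.evaluate_dax(
        dataset="Fabric Capacity Metrics",             # placeholder name
        dax_string="EVALUATE TOPN(10, 'Capacities')",  # placeholder table
    )
    print(df)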

 

For table size, I'm not aware of a clean way to snapshot this in a warehouse. In a lakehouse, you can use Spark SQL to describe the table metadata and get the file size of the underlying Parquet files. Given that warehouses also store data in Delta Parquet format, you might be able to use the OneLake APIs to do something similar, but I haven't ever tried.
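
Something like this is what I have in mind for the OneLake route, via its ADLS Gen2-compatible endpoint. Untested for warehouses, and the ".Warehouse/Tables/<schema>/<table>" path layout is my assumption:

    # Untested sketch: sum file sizes under a warehouse table's folder via
    # OneLake's ADLS Gen2-compatible endpoint. Workspace/warehouse/table
    # names and the Tables/<schema>/<table> layout are assumptions.
    from azure.identity import DefaultAzureCredential
    from azure.storage.filedatalake import DataLakeServiceClient

    svc = DataLakeServiceClient(
        "https://onelake.dfs.fabric.microsoft.com",
        credential=DefaultAzureCredential(),
    )
    fs = svc.get_file_system_client("<workspace-name>")
    total = sum(
        p.content_length or 0
        for p in fs.get_paths("<warehouse-name>.Warehouse/Tables/dbo/<table>")
        if not p.is_directory
    )
    print(f"{total:,} bytes of table files")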

 

As for more real-time metrics, I recommend keeping an eye out. Microsoft likes to announce new things at Microsoft Ignite, which is coming up soon, so if anything is in the pipeline, I'd expect it to be announced there 😜

 

If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.

Hi @stuckerj,

 

Microsoft has access to capacity metrics data that we do not have access to yet. So they can build things like this that we cannot. 

 


Hi @tayloramy,

 

Thanks for responding. I can't help but think this is important for cost transparency; the storage amounts and CUs are being tracked somewhere for billing.

I'm with you on expecting it soon, and I'm rather surprised it isn't available at this granularity already.

  • The Fabric Capacity Metrics app (see docs: How to: Observe Fabric Data Warehouse utilization trends, https://github.com/MicrosoftDocs/fabric-docs/blob/8c2e9aa1d2cf6c1d6770177c4420fbf18366e8b1/docs/data...) shows CU and storage trends, and the app is helpful for drilling into peaks. However, it:
    • Refreshes daily (not sub-daily), so it's not granular enough for operation-level performance analysis.
    • Is UI-based; the OneLake Consumption CSV export is manual, and there's no documented REST API to pull the same data.
  • Warehouse-side attempts:
    • sys.* views (sys.tables, sys.partitions, sys.allocation_units) return unreliable/zero values for allocation/size in Fabric Warehouse in my tenant even when COUNT(*) shows thousands of rows.
    • Workspace-level and admin REST API calls return metadata but not usable per-table storage or CU metrics; detail endpoints I tested returned 404.
    • Metrics in the Capacity Metrics app (Storage and CUs) are daily and not fine-grained enough to correlate with a specific operation or query.
  • My ideal goal: take a programmatic snapshot of warehouse storage (or CUs consumed) immediately before and after a specific operation, from inside a Notebook, or otherwise receive sufficiently granular programmatic metrics (see the sketch below).
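
Concretely, the pattern I'm after looks like this, where measure() is a hypothetical stand-in for whatever reliable byte or CU counter turns out to exist (e.g., a OneLake file-size sum):

    # Sketch of the before/after pattern. measure() is a hypothetical
    # stand-in for any reliable storage/CU counter.
    import time

    def snapshot_around(operation, measure):
        before = measure()
        started = time.time()
        operation()          # e.g., a large INSERT or refresh
        after = measure()
        return {
            "before": before,
            "after": after,
            "delta": after - before,
            "seconds": time.time() - started,
        }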

For anyone from Microsoft who may be reading, here are my questions and asks:

  1. Is there any supported API, DMVs, or programmatic method (within a Notebook preferred) to get per-warehouse or per-table storage or CU usage at sub-daily granularity?
  2. Has anyone found a reliable workaround to capture “warehouse size / capacity usage” snapshots (before / after an operation)? Examples: DMVs, hidden admin endpoints, Power BI dataset that can be queried, or a reproducible sequence that yields reliable before/after numbers.
  3. Are there known limitations/permissions that explain why sys.* views return zeros even though SELECT COUNT(*) shows rows? Any authoritative guidance on which views in Fabric are reliable for storage metadata?
  4. If the answer is “not yet,” does anyone have a recommended way to automate the OneLake Consumption CSV export (e.g., safe browser automation pattern) or extract the relevant Power BI dataset for more frequent refreshes?
  5. Finally, is there a roadmap/public API plan for more granular metrics (CU/storage) that someone from the Fabric product team can point to?


Thanks in advance — any pointers, sample Notebook code, or roadmap info would be much appreciated.

—Jeff

 
