I'm looking for the most straightforward way to programmatically gather storage, capacity, and Capacity Unit (CU) metrics for both data warehouse tables and delta tables in Microsoft Fabric—ideally from within a Notebook.
For lakehouse delta tables, storage measurement is more straightforward and can be accomplished by inspecting file sizes and metadata.
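For example, here's a minimal sketch of what I do today from a Fabric Notebook (the table name "sales" is just a placeholder):

```python
# Delta Lake's DESCRIBE DETAIL exposes size and file-count metadata for a
# lakehouse table; "sales" is a placeholder table name.
detail = spark.sql("DESCRIBE DETAIL sales").collect()[0]
print(f"{detail['sizeInBytes']} bytes across {detail['numFiles']} files")
```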
However, the biggest challenge is measuring storage/capacity/CU usage for warehouse tables. I would like to take a snapshot of warehouse storage before and after an operation (such as a large insert, refresh, or update), but there doesn't seem to be a clean or supported way to do this programmatically.
Questions:
Any guidance, best practices, or workarounds would be greatly appreciated!
Thanks,
Jeff Stucker
Hi @stuckerj ,
Thanks for reaching out to the Microsoft Fabric community forum.
Thanks for your prompt response.
I wanted to follow up and confirm whether you’ve had the opportunity to review the information provided by @tayloramy. If you have any questions or need further clarification, please don’t hesitate to reach out.
Looking forward to your response.
Best regards,
Lakshmi.
Hi @stuckerj,
As far as I know, there’s no supported per-table/per-operation CU or storage meter you can call for warehouses, or for any other artifact for that matter. You can get close with a few pragmatic pieces you can script from a Notebook, but none of them gives you "table X consumed Y CU during this INSERT".
You can get some level of CU metrics from the Capacity Metrics app's semantic model, though last I checked (which, to be fair, was a while ago) I wasn't able to get timepoint-level details for specific items, only the overall usage at a given timepoint. This is because the Capacity Metrics app uses a live connection to a shared Kusto instance to pull the data, and that connection uses a custom connector that isn't exposed to us. I spent quite a bit of time experimenting with the XMLA endpoint, with little success in getting the tables that rely on that connector to return data.
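If you want to poke at that semantic model yourself from a Notebook, a rough sketch with semantic-link (sempy) is below. "Fabric Capacity Metrics" is the app's default model name and 'Capacities' is an assumed table name, so list the tables first and adjust to what your tenant actually has:

```python
import sempy.fabric as fabric

# Discover what the Capacity Metrics model actually exposes before
# writing DAX against it.
print(fabric.list_tables(dataset="Fabric Capacity Metrics"))

# Hypothetical query -- 'Capacities' is an assumed table name, and tables
# backed by the live Kusto connection may come back empty, as noted above.
df = fabric.evaluate_dax(
    dataset="Fabric Capacity Metrics",
    dax_string="EVALUATE TOPN(100, 'Capacities')",
)
display(df)
```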
For table size, I'm not aware of a clean way to snapshot this in a warehouse. In a lakehouse, you can use Spark SQL to describe the table metadata and get the file sizes of the underlying Parquet files. Given that warehouses also store their data in Delta/Parquet format, you might be able to use the OneLake APIs to do something similar, but I've never tried it.
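If you do try the OneLake route, I'd expect it to look roughly like the sketch below (untested against a warehouse on my end; the workspace name and the Tables/schema/table path layout are assumptions). Running it before and after an operation and diffing the totals would give you the snapshot you're after:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# OneLake exposes an ADLS Gen2-compatible endpoint; the workspace acts as
# the filesystem.
service = DataLakeServiceClient(
    account_url="https://onelake.dfs.fabric.microsoft.com",
    credential=DefaultAzureCredential(),
)
fs = service.get_file_system_client("MyWorkspace")  # hypothetical workspace

# Assumed layout: <warehouse item>.Warehouse/Tables/<schema>/<table>
paths = fs.get_paths(path="MyWarehouse.Warehouse/Tables/dbo/MyTable", recursive=True)
total_bytes = sum(p.content_length or 0 for p in paths if not p.is_directory)
print(f"~{total_bytes / 1024**2:.1f} MiB on disk")
```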
As for more real-time metrics, I recommend you keep an eye out. Microsoft likes to announce new things at Microsoft Ignite, which is coming up soon, so if anything is in the pipeline, I would expect it to be announced there 😜
If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.
Does anyone know how these numbers were gathered?
Hi @stuckerj,
Microsoft has access to capacity metrics data that we do not have access to yet. So they can build things like this that we cannot.
If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.
Hi @tayloramy,
Thanks for responding. I can't help but think this is important for cost transparency; the storage amounts and CUs are clearly being tracked somewhere for billing.
I'm with you on expecting it to be implemented soon, and I'm rather surprised it isn't already available at this granularity.
For anyone from Microsoft who may be reading, here are my Questions / Asks:
Relevant references
Thanks in advance — any pointers, sample Notebook code, or roadmap info would be much appreciated.
—Jeff