KimTutein
Advocate II

VertiPaq Analyzer not able to report memory consumption for models using “Direct Lake on OneLake”

Hi Community

 

We have recently moved our semantic models from “Direct Lake on SQL” to “Direct Lake on OneLake”. The models work fine, but we have noticed that VertiPaq Analyzer is no longer able to show the memory consumption of the model and its columns. We can see the temperature of columns but are no longer able to see the memory consumption. The behavior is the same whether we use DAX Studio, Tabular Editor 3, or a notebook with Semantic Link Labs (labs.vertipaq_analyzer).
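For reference, the notebook call is essentially the following (a minimal sketch; the dataset and workspace names are placeholders):

```python
# Minimal sketch of the Semantic Link Labs call; dataset/workspace names are placeholders.
import sempy_labs as labs

# With "Direct Lake on OneLake" the size figures come back empty;
# only the column temperature statistics are populated.
labs.vertipaq_analyzer(
    dataset="MySemanticModel",   # placeholder: semantic model name
    workspace="MyWorkspace",     # placeholder: workspace name
)
```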

It seems the model size is not something that can be reported when the model runs in “Direct Lake on OneLake”. However, knowing the size of the model and its columns is rather important for judging whether you are close to the SKU limits of your capacity; for instance, an F128 has a 50 GB memory limit.

 

Has anyone had the same issue, and if so, how do you find out how much memory your models and columns consume?

 

Friendly regards

Kim Tutein


4 REPLIES
v-ssriganesh
Community Support

Hi @KimTutein,
Thank you for reaching out to the Microsoft Fabric community forum.

If I have misunderstood your needs or you still have problems, please feel free to let us know.
Best regards,
Ganesh Singamshetty.

Hi @v-ssriganesh 

Thank you for your answer. As far as I know, the Capacity Metrics app does not expose memory consumption, but "only" capacity units (CUs). We are explicitly interested in how much memory a given semantic model uses, with details per column. This is possible for "Import" models and "Direct Lake on SQL", but it seems it is not possible for "Direct Lake on OneLake".

Hello @KimTutein,
Thank you for clarifying your need for per-model and per-column memory consumption details, which are currently available for Import models and Direct Lake on SQL but not for Direct Lake on OneLake. As you noted, the Fabric Capacity Metrics App tracks Capacity Units (CUs) but doesn’t provide granular memory stats.

This limitation in Direct Lake on OneLake is due to its architecture: data is queried on demand from the Delta tables, loading only the necessary segments into a temporary VertiPaq cache, so tools like VertiPaq Analyzer report only the metadata size. This is expected behavior, as outlined in the Direct Lake overview.
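Because only the segments that queries actually touch are resident at any given moment, one indirect way to see what is currently held in memory is to query the engine's storage statistics from a notebook. The sketch below only illustrates that mechanism, it is not an official workaround: it assumes the DAX INFO functions are available on your model, and the dataset/workspace names are placeholders.

```python
# Rough sketch: list how much memory the currently resident column segments use.
# Assumes the DAX INFO functions are supported by the engine; names are placeholders.
import sempy.fabric as fabric

df = fabric.evaluate_dax(
    dataset="MySemanticModel",   # placeholder: semantic model name
    workspace="MyWorkspace",     # placeholder: workspace name
    dax_string="EVALUATE INFO.STORAGETABLECOLUMNSEGMENTS()",
)

# Result columns may come back bracketed (e.g. "[USED_SIZE]"); normalise them first.
df.columns = [c.strip("[]") for c in df.columns]

# USED_SIZE is reported per segment; aggregate per table/column and show the top consumers.
resident = (
    df.groupby(["TABLE_ID", "COLUMN_ID"])["USED_SIZE"]
      .sum()
      .sort_values(ascending=False)
)
print(resident.head(20))
```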

Workaround to Estimate Memory Usage:

  • Download the .pbix file from your Fabric workspace, open it in Power BI Desktop, and switch to Import mode by connecting to your OneLake Delta tables via Get Data > Azure > Azure Synapse Analytics and selecting Import.
  • Use VertiPaq Analyzer (for example via Tabular Editor 3's View > VertiPaq Analyzer) or DAX Studio to analyze memory consumption.

To better understand resource usage during query execution, you can use Power BI Desktop’s Performance Analyzer to check if queries process in Direct Lake mode or fall back to Direct Query, which may increase resource consumption. This can provide indirect clues about memory usage patterns. For setup, see Performance Analyzer documentation.


Best regards,
Ganesh Singamshetty

Hi @v-ssriganesh 

 

Thank you for your detailed reply and swift answer.
