joakimfenno (Helper V)

Determine size of semantic model

How can I see the size of a semantic model in Fabric?

Also, if possible, can I see the history (size per day)?

22 REPLIES
frithjof_v (Super User)

Are you asking about Direct Lake semantic models or Import Mode semantic models?

 

Are you asking about the size on disk or the size that gets loaded into memory when the model is being used?

 

What will you use that information for?

I am using Direct Lake only.

Suddenly my model stopped working.

No table exceeds the row number limit, so I thought the issue is the total model size.

 

[screenshot: joakimfenno_0-1731268722920.png]

 

I had my model as Direct Lake only and suddenly it stopped working due to too many rows. No table exceeds the row limit, but it might be that the whole model exceeds the max memory size (10 GB for F8).

I see. Okay, I think the semantic link approach (and the DAX Studio approach) would be useful for that.

 

I also found this video interesting:

 

https://m.youtube.com/watch?v=vI7kVVHxsiM#

(even though this is for Import Mode, so it's not 100% the same for Direct Lake)

 

 

So what I would do:

 

- I would open the report

- Check the in-memory size of the semantic model by using DAX Studio or semantic link

- Then visit some report pages

- Check the in-memory size again by using DAX Studio or semantic link, to see if any of the report pages caused a big increase in what's loaded into memory.

 

I would use a custom direct lake semantic model (not the default direct lake semantic model).

 

If necessary, I would clear the semantic model's cache in order to gain better control of the testing. Clearing the cache can be done by refreshing the Direct Lake semantic model, or by using DAX Studio.
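For the refresh route, a minimal sketch using semantic-link-labs (the same function that appears further down in this thread; the dataset name is just an example):

%pip install semantic-link-labs
import sempy_labs as labs

# Refreshing a Direct Lake model evicts the columns that were paged
# into memory, so the in-memory size resets to (almost) zero.
labs.refresh_semantic_model("TestSemanticModelSize (custom)")  # example name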

 

What did the error message say, btw?

 

Here's also a blog series which explains some memory-related concepts in Power BI: https://blog.crossjoin.co.uk/2024/04/28/power-bi-semantic-model-memory-errors-part-1-model-size/amp/

 

 

Unfortunately, I don't think the memory_usage(deep=True) function will give you the result you're looking for. From my understanding, memory_usage(deep=True) is a pandas method for getting the in-memory size of a dataframe. However, you are interested in the Power BI semantic model size, not the dataframe size.
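To illustrate the distinction, a small sketch in plain pandas (no Fabric dependency):

import pandas as pd

# memory_usage(deep=True) measures only the DataFrame living in the
# notebook's Python process (the rows you pulled out of the model),
# not the VertiPaq storage of the semantic model itself.
df = pd.DataFrame({"a": range(1_000)})
print(df.memory_usage(deep=True).sum(), "bytes for this DataFrame")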

Thanks, see the error message below.

I will try as soon as I either have DAX Studio or get semantic link to work.

 

[screenshot: joakimfenno_0-1731269143236.png]

 

 

Hm.. the message says too many rows.

 

Anyway, the semantic-link-labs vertipaq analyzer will give you all the information about number of rows, table size, etc.

Semantic-link-labs is a bit different from the "normal" semantic link. https://semantic-link-labs.readthedocs.io/en/latest/

Semantic-link-labs needs installation, but semantic link is already installed.

 

The "normal" Semantic link list_columns function also gives some of the useful information, when you use the extended option.

Thanks, I need more details.
I have checked the warehouse and no table exceeds the limit for the number of rows.

This code gives some answers:

 

import sempy.fabric as fabric

# List the model's columns with extended stats (per-column size,
# cardinality, etc.)
datasetName = "TestSemanticModelSize (custom)"
fabric.list_columns(datasetName, extended=True)
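If you want a single total from that output, you can sum the per-column sizes. A sketch, assuming the extended frame exposes a "Total Size" column (check the column names in your sempy version):

import sempy.fabric as fabric

datasetName = "TestSemanticModelSize (custom)"
cols = fabric.list_columns(datasetName, extended=True)

# Sum the per-column in-memory sizes to approximate the model total.
# The "Total Size" column name is an assumption; adjust if your
# sempy version names it differently.
total_bytes = cols["Total Size"].sum()
print(f"Model size in memory: {total_bytes / 1024**2:.1f} MB")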

 

 

Also see a more elaborate blog here:

Calculating and Reducing Power BI Dataset Size Using Semantic Link

 

https://learn.microsoft.com/en-us/python/api/semantic-link-sempy/sempy.fabric?view=semantic-link-pyt...

I think you can use this, it worked for me:

 

 

%pip install semantic-link-labs
import sempy_labs as labs

datasetName = "TestSemanticModelSize (custom)"

# Current in-memory size of the semantic model
labs.get_semantic_model_size(datasetName)

# Per-table / per-column breakdown (rows, size, cardinality, ...)
labs.vertipaq_analyzer(datasetName)

# Refresh the model; for Direct Lake this evicts columns from memory
labs.refresh_semantic_model(datasetName)

 

 

sempy_labs — semantic-link-labs 0.8.4 documentation

 

Explanation:

The semantic model size is measured as the size that is loaded into memory.

When you interact with a direct lake report, the semantic model columns you interact with get loaded into memory (warm up).

So the semantic model will grow in size as report users query more columns from the semantic model.

After a period of inactivity, the columns get removed from memory automatically (cool down).

When you refresh the dataset, all the columns get removed from memory, so the model size (in memory) will be almost 0 after a refresh.

 

You can run get_semantic_model_size() and vertipaq_analyzer() to verify the in-memory model size.

 

So you could:

 

  1. Refresh the semantic model to start (almost) at 0.
  2. Verify the in-memory model size by using get_semantic_model_size() or vertipaq_analyzer().
  3. Interact with some pages in the report. This means that the semantic model columns which are referenced on those pages will be loaded into memory.
  4. Verify the new in-memory semantic model size by using get_semantic_model_size() or vertipaq_analyzer().

 

If you want, you can refresh the semantic model to reset its in-memory size again to (almost) 0.
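Putting those steps together in a notebook, a minimal sketch (assuming semantic-link-labs is installed; the dataset name is just an example):

import sempy_labs as labs

datasetName = "TestSemanticModelSize (custom)"

# 1. Refresh to evict columns from memory (in-memory size drops to ~0)
labs.refresh_semantic_model(datasetName)

# 2. Baseline measurement
print(labs.get_semantic_model_size(datasetName))

# 3. ...now interact with some report pages in the browser...

# 4. Measure again: the increase is what those pages paged into memory
print(labs.get_semantic_model_size(datasetName))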

 

Some discussion about semantic link labs:

Should I use Semantic Link Labs in prod? : r/MicrosoftFabric

 

Anonymous (Not applicable)

Hi @joakimfenno,

You can also try to use semantic functions to get the data size of the specific semantic model.

Semantic functions - Microsoft Fabric | Microsoft Learn

Here is a sample to read the data size of a table from the model; you can modify it with a for loop to sum up all the table sizes.

import sempy.fabric as fabric

# List the available semantic models
df_datasets = fabric.list_datasets()
print(df_datasets)

# Read one table from a specific semantic model into a pandas DataFrame
model_name = "Your_Semantic_Model_Name"
df_model = fabric.read_table(model_name, "Your_Table_Name")

# Calculate the size of the data (note: this measures the pandas
# DataFrame in the notebook, not the model's in-memory VertiPaq size)
model_size = df_model.memory_usage(deep=True).sum()
print(f"Size of the semantic model '{model_name}': {model_size} bytes")

Regards,

Xiaoxin Sheng

Thanks, it's a good idea.
I get an error when running that script from a notebook within Fabric.
Is there anything else I need to configure to get it to work?

 

The package has been added to the library in Fabric.

 

[screenshot: joakimfenno_0-1731070736170.png]

 

 

This is part of the log:

 

---------------------------------------------------------------------------
ExtendedSocketException                   Traceback (most recent call last)
ExtendedSocketException: (00000005, 0xFFFDFFFF): Name or service not known
   at System.Net.Dns.GetHostEntryOrAddressesCore(String hostName, Boolean justAddresses, AddressFamily addressFamily, ValueStopwatch stopwatch)
   at System.Net.Dns.GetHostAddresses(String hostNameOrAddress, AddressFamily family)
   at System.Net.Dns.GetHostAddresses(String hostNameOrAddress)
   at System.Net.Sockets.Socket.Connect(String host, Int32 port)
   at System.Net.Sockets.Socket.Connect(EndPoint remoteEP)
   at System.Net.HttpWebRequest.<>c__DisplayClass216_0.<<CreateHttpClient>b__1>d.MoveNext()
--- End of stack trace from previous location ---
   at System.Net.Http.HttpConnectionPool.ConnectToTcpHostAsync(String host, Int32 port, HttpRequestMessage initialRequest, Boolean async, CancellationToken cancellationToken)

Anonymous (Not applicable)

Hi @joakimfenno,

In fact, I have already attached it in the environment, so I can simply import it. If you can't add it to the environment, you can try using the %pip command to install it inline before use.

Manage Apache Spark libraries - Microsoft Fabric | Microsoft Learn
Regards,

Xiaoxin Sheng

SudhavaniKolla3 (Helper II)

Go to workspace settings, then click on System storage (available in the left-side item list); there you will see the sizes of all semantic models in Fabric.

 

If this post helps, then please consider Accept it as the solution to help the other members find it more quickly.


SudhavaniKolla3 (Helper II)

Under workspace settings, you are able to see the semantic model size. Please refer to the snippet below.

[screenshot: SudhavaniKolla3_0-1731054326191.png]

 

If this post helps, then please consider Accept it as the solution to help the other members find it more quickly.

It says 1 MB for all models (as in your example).

What does Size = 1 MB mean?

There is also a 2 MB model in my example. It means I added just very limited data; that is why my semantic model has that size. If you do the same thing you may get a different size. Try it and let me know.

No, they are all 1 MB, so it does not make sense.

That can't represent the full model.

I believe my models are > 10 GB.

If I scroll down, the size shown is different for each semantic model. It is rounding off to whole MBs: even though one semantic model is 0.8 MB or less than 1 MB, it rounds to 1 MB; if it is 1.7 MB it rounds to 2 MB, and so on.

[screenshot: SudhavaniKolla3_0-1731064373847.png]

 

All mine are 1-2 MB.

 

[screenshot: joakimfenno_1-1731066880301.png]

 

 

 
