How can I see the size of a semantic model in Fabric?
And also, if possible, can I see a history (size per day)?
Are you asking about Direct Lake semantic models or Import Mode semantic models?
Are you asking about the size on disk or the size that gets loaded into memory when the model is being used?
What will you use that information for?
I am using Direct Lake only.
Suddenly my model stopped working.
No table exceeds the row number limit, so I thought the issue is the total model size.
I had my model as Direct Lake only and it suddenly stopped working due to too many rows. No table exceeds the row limit, but it might be that the whole model exceeds the max memory size (10 GB for F8).
I see, okay. I think the semantic link approach (and the DAX Studio approach) would be useful for that.
I also found this video interesting:
https://m.youtube.com/watch?v=vI7kVVHxsiM#
(even though this is for import mode, so it's not 100% the same for direct lake)
So what I would do:
- I would open the report
- Check the in-memory size of the semantic model by using DAX Studio or semantic link
- Then visit some report pages
- Check the in-memory size again by using DAX Studio or semantic link, to see if any of the report pages have caused a big increase in what's loaded into memory.
I would use a custom Direct Lake semantic model (not the default Direct Lake semantic model).
If necessary, I would clear the semantic model's cache in order to gain better control of the testing. Clearing the cache can be done by refreshing the Direct Lake semantic model, or with DAX Studio.
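For reference, here's a minimal sketch of that size check using semantic link. The dataset name is just a placeholder, and the "Total Size" column name is my assumption about what the extended output contains:
import sempy.fabric as fabric

dataset_name = "YourDirectLakeModel"  # placeholder: your custom Direct Lake semantic model

# extended=True adds VertiPaq statistics, including per-column in-memory sizes
df_columns = fabric.list_columns(dataset_name, extended=True)

# Sum the per-column sizes to approximate the total in-memory model size
# ("Total Size" is assumed to be the relevant column in the extended output)
total_bytes = df_columns["Total Size"].sum()
print(f"Approximate in-memory size: {total_bytes / 1024**2:.1f} MB")
Run it once, visit some report pages, then run it again to see how much the warmed-up columns add.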
What did the error message say, btw?
Here's also a blog series which explains some memory-related concepts in Power BI: https://blog.crossjoin.co.uk/2024/04/28/power-bi-semantic-model-memory-errors-part-1-model-size/amp/
Unfortunately, I don't think the memory_usage(deep=True) function will give you the result you're looking for. From my understanding, memory_usage(deep=True) is a pandas method for getting the in-memory size of a dataframe. However, you are interested in the Power BI semantic model size, not the dataframe size.
thanks
see error message below
I will try as soon as I either have DAX Studio or get semantic link to work.
Hm.. the message says too many rows.
Anyway, the semantic-link-labs vertipaq analyzer will give you all the information about number of rows, table size, etc.
Semantic-link-labs is a bit different than the "normal" semantic link. https://semantic-link-labs.readthedocs.io/en/latest/
Semantic-link-labs needs installation, but semantic link is already installed.
The "normal" semantic link list_columns function also gives some of the useful information when you use the extended option.
thanks, I need more details
I have checked the warehouse and no table exceeds the limit for number of rows
This code gives some answers:
import sempy.fabric as fabric
datasetName = "TestSemanticModelSize (custom)"
# extended=True includes per-column statistics such as size and temperature
fabric.list_columns(datasetName, extended=True)
Also see a more elaborate blog here:
Calculating and Reducing Power BI Dataset Size Using Semantic Link
I think you can use this, it worked for me:
%pip install semantic-link-labs
import sempy_labs as labs

datasetName = "TestSemanticModelSize (custom)"

# Total in-memory size of the semantic model
labs.get_semantic_model_size(datasetName)
# Per-table and per-column VertiPaq statistics (row counts, sizes, etc.)
labs.vertipaq_analyzer(datasetName)
# Refreshing the Direct Lake model evicts its columns from memory
labs.refresh_semantic_model(datasetName)
sempy_labs — semantic-link-labs 0.8.4 documentation
Explanation:
The semantic model size is measured as the size that is loaded into memory.
When you interact with a direct lake report, the semantic model columns you interact with get loaded into memory (warm up).
So the semantic model will grow in size as report users query more columns from the semantic model.
After a period of inactivity, the columns get removed from memory automatically (cool down).
When you refresh the dataset, all the columns get removed from memory. So the model size (in memory) will be almost 0 after a refresh.
You can run get_semantic_model_size() and vertipaq_analyzer() to verify the in-memory model size.
So, if you want to, you can refresh the semantic model to reset the semantic model size to (almost) 0 again.
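As a rough illustration of that reset behaviour, using the labs functions from the example above (the dataset name is the same placeholder used earlier):
import sempy_labs as labs

datasetName = "TestSemanticModelSize (custom)"  # placeholder from the example above

# Size while the model is "warm" (columns loaded into memory by report usage)
print(labs.get_semantic_model_size(datasetName))

# Refreshing the Direct Lake model evicts its columns from memory
labs.refresh_semantic_model(datasetName)

# The size reported right after the refresh should be close to 0
print(labs.get_semantic_model_size(datasetName))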
Some discussion about semantic link labs:
Should I use Semantic Link Labs in prod? : r/MicrosoftFabric
Hi @joakimfenno,
You can also try to use a semantic function to get the data size of the specific semantic model.
Semantic functions - Microsoft Fabric | Microsoft Learn
Here is a sample that reads a table's data size from the model; you can modify it to add a for loop to sum up all the table sizes (see the sketch after the sample).
import sempy.fabric as fabric
# List the available semantic models
df_datasets = fabric.list_datasets()
print(df_datasets)
# Get the size of a specific semantic model
model_name = "Your_Semantic_Model_Name"
df_model = fabric.read_table(model_name, "Your_Table_Name")
# Calculate the size of the data
model_size = df_model.memory_usage(deep=True).sum()
print(f"Size of the semantic model '{model_name}': {model_size} bytes")
Regards,
Xiaoxin Sheng
Thanks, it's a good idea.
I get an error when running that script from a notebook within Fabric.
Is there anything else I need to configure to get it to work?
The package has been added to the library in Fabric.
This is part of the log:
---------------------------------------------------------------------------
ExtendedSocketException                  Traceback (most recent call last)
ExtendedSocketException: (00000005, 0xFFFDFFFF): Name or service not known
  at System.Net.Dns.GetHostEntryOrAddressesCore(String hostName, Boolean justAddresses, AddressFamily addressFamily, ValueStopwatch stopwatch)
  at System.Net.Dns.GetHostAddresses(String hostNameOrAddress, AddressFamily family)
  at System.Net.Dns.GetHostAddresses(String hostNameOrAddress)
  at System.Net.Sockets.Socket.Connect(String host, Int32 port)
  at System.Net.Sockets.Socket.Connect(EndPoint remoteEP)
  at System.Net.HttpWebRequest.<>c__DisplayClass216_0.<<CreateHttpClient>b__1>d.MoveNext()
--- End of stack trace from previous location ---
  at System.Net.Http.HttpConnectionPool.ConnectToTcpHostAsync(String host, Int32 port, HttpRequestMessage initialRequest, Boolean async, CancellationToken cancellationToken)
Hi @joakimfenno,
In fact, I had already attached it to the environment, so I can simply import it. If you can't add it to the environment, you can try using the %pip command to install it inline before use.
Manage Apache Spark libraries - Microsoft Fabric | Microsoft Learn
Regards,
Xiaoxin Sheng
Go to Workspace settings >> click on System storage (available in the left-side item list), and you will see the size of all semantic models in Fabric.
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.
Under Workspace settings, you are able to see the semantic model size. Please refer to the snippet below.
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.
It says 1 MB for all models (as in your example).
What does Size = 1 MB mean?
There is also a 2 MB one in my example. It means I added just very limited data, which is why my semantic model has that size. If you do the same thing, you may get a different size. Try and let me know.
No, they are all 1 MB, so it does not make sense.
It can't represent the full model.
I believe my models are > 10 GB.
If I scroll down, the size is different for each semantic model. It is rounding off to MBs: even if one semantic model is 0.8 MB or less than 1 MB, it rounds to 1 MB, and 1.7 MB rounds to 2 MB, and so on.
All mine are 1-2 MB.