dgwilson
Resolver III

Determine dataset (semantic model) size > 1GB

For reasons I won't go into, our organisation is looking at a home tenant move from one region to another.

One of the call-outs from Microsoft is to identify datasets that are greater than 1 GB in size ... as all the capacities need to be unloaded, and the ones with datasets greater than 1 GB will not work until everything is put back together.

 

Given we have 3000+ datasets, I'm looking for a scripted way of identifying them.

I'd love for that to be via Power BI REST API calls - though I'll take what I can get.

 

Any pointers (scripts even) would be hugely appreciated.

 

- David

7 REPLIES
pvuppala
Helper IV

Our Microsoft SME suggested looking at this to get the model size:

https://semantic-link-labs.readthedocs.io/en/latest/sempy_labs.html#sempy_labs.get_semantic_model_si...
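In case a worked example helps, here is a minimal sketch of how that could be wired up in a Fabric notebook. The loop uses semantic-link's fabric.list_workspaces() and fabric.list_datasets(); the exact signature and return value of get_semantic_model_size (and whether it reports compressed or in-memory size) should be checked against the docs above, so treat the call below as an assumption rather than a confirmed API.

```python
# Minimal sketch for a Fabric notebook. Assumes the semantic-link (sempy) and
# semantic-link-labs (sempy_labs) packages are installed, and that
# get_semantic_model_size(dataset, workspace) returns the model size in bytes --
# verify the exact signature, return value, and column names against the docs.
import sempy.fabric as fabric
import sempy_labs

ONE_GB = 1024 ** 3
oversized = []

for _, ws in fabric.list_workspaces().iterrows():
    ws_name = ws["Name"]                      # column names can vary by sempy version
    for _, ds in fabric.list_datasets(workspace=ws_name).iterrows():
        ds_name = ds["Dataset Name"]
        try:
            size_bytes = sempy_labs.get_semantic_model_size(
                dataset=ds_name, workspace=ws_name
            )
        except Exception as exc:              # no access, Direct Lake models, etc.
            print(f"Skipped {ws_name}/{ds_name}: {exc}")
            continue
        if size_bytes and size_bytes > ONE_GB:
            oversized.append((ws_name, ds_name, size_bytes))

for ws_name, ds_name, size_bytes in sorted(oversized, key=lambda r: -r[2]):
    print(f"{ws_name} / {ds_name}: {size_bytes / ONE_GB:.2f} GB")
```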

 

Have you tried it? Does it give you the compressed or decompressed size?

pvuppala
Helper IV

Any luck getting the dataset size via API? I'm tasked to do the same, and we have 1000+ workspaces.

I can't remember what I ended up doing for this ... however, as mentioned elsewhere in this thread, there is the Capacity Metrics App. It reports by capacity (it won't tell you what is in the global shared environment).
In my case this quickly gave me the bulk of what I needed.
You can also build your own report against this data and, in doing so, add the capacity name as a column .... I might have done this.

[Screenshot: Fabric Capacity Metrics - Power BI]
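On the API question: the admin Groups endpoint can at least enumerate every workspace and its datasets tenant-wide, even though (as noted further down the thread) it doesn't return a size field. A rough sketch, assuming you already have an admin-scoped access token (the token acquisition, paging, and the $top value are placeholders to adapt to your tenant):

```python
# Rough sketch: list workspaces and their datasets tenant-wide via the admin API.
# GET /admin/groups supports $expand=datasets but does NOT include dataset size,
# so use this as the inventory to cross-reference with the Capacity Metrics App.
import requests

access_token = "<admin-scoped access token>"   # placeholder: acquire via MSAL / service principal
headers = {"Authorization": f"Bearer {access_token}"}

url = (
    "https://api.powerbi.com/v1.0/myorg/admin/groups"
    "?$top=5000&$expand=datasets"              # $top is required; add $skip to page beyond 5000
)

resp = requests.get(url, headers=headers)
resp.raise_for_status()
workspaces = resp.json().get("value", [])

rows = []
for ws in workspaces:
    for ds in ws.get("datasets", []):
        rows.append((ws.get("name"), ds.get("name"), ds.get("id")))

print(f"{len(rows)} datasets found across {len(workspaces)} workspaces")
```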

No, and that size is pretty meaningless anyway as you need to consider the effects of compression.

v-zhengdxu-msft
Community Support

Hi @dgwilson 

 

 

In the workspace, open the workspace settings:

[Screenshot: workspace settings]

Then you can see the size of the semantic models under System storage:

[Screenshot: System storage listing semantic model sizes]

It's worth mentioning that I don't see a REST API endpoint that returns semantic model size.

 

Best Regards

Zhengdong Xu
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.

 

lbendlin
Super User

That's a tough one. The standard process is to retrieve that information from the Fabric Capacity Metrics App. However, that only includes artefacts that have been active in the last 14 days, and it won't tell you what the in-memory size is (which will ultimately determine whether the model is rejected or not). Compression is hugely dependent on the type of data and its sorting.

 

Start with the app but be aware of the limitations.
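If you do need a per-model estimate of the in-memory footprint lbendlin mentions, one heavily hedged option is to query the storage-engine column-segment information through the DAX INFO functions from a notebook. This assumes INFO.STORAGETABLECOLUMNSEGMENTS() is available on your capacity and that sempy's evaluate_dax can run it; the dataset and workspace names below are placeholders, and the figure excludes dictionaries and hierarchies, so it understates the true memory use:

```python
# Hedged sketch: approximate a model's in-memory size by summing column-segment
# sizes surfaced by the DAX INFO.STORAGETABLECOLUMNSEGMENTS() function.
# Assumes the INFO functions are available on your capacity; the result column
# name is matched loosely because it may come back bracketed (e.g. "[USED_SIZE]").
import sempy.fabric as fabric

def estimate_in_memory_size(dataset: str, workspace: str) -> int:
    df = fabric.evaluate_dax(
        dataset=dataset,
        dax_string="EVALUATE INFO.STORAGETABLECOLUMNSEGMENTS()",
        workspace=workspace,
    )
    used_size_col = next(c for c in df.columns if "USED_SIZE" in c.upper())
    return int(df[used_size_col].sum())

# Placeholder names for illustration only.
size_bytes = estimate_in_memory_size("Sales Model", "Finance Workspace")
print(f"~{size_bytes / 1024**3:.2f} GB in segments (excludes dictionaries and hierarchies)")
```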

 

 
