For reasons I won't go into, our organisation is looking at a home-tenant move from one region to another.
One of the call-outs from Microsoft is to identify datasets greater than 1 GB in size ... all the capacities need to be unloaded, and those with datasets over 1 GB will not work until everything is put back together.
Given we have 3,000+ datasets, I'm looking for a scripted way of identifying them.
I'd love for that to be via Power BI REST API calls, though I'll take what I can get.
Any pointers (scripts even) would be hugely appreciated.
- David
Our Microsoft SME suggested looking at this to get the model size.
Have you tried it? Does it give you the compressed or decompressed size?
Any luck getting the dataset size via the API? I'm tasked with the same and we have 1,000+ workspaces.
I can't remember what I ended up doing for this ... however, as mentioned elsewhere in this thread, there is the Capacity Metrics app. It works per capacity (it won't tell you what is in the global shared environment).
In my case this quickly gave me the bulk of what I needed.
You can also build your own report against this data and, in doing so, add the capacity name as a column ... I might have done this.
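If you export the app's data (e.g. to CSV), flagging the over-1 GB models is a one-liner to script. A minimal sketch — the column names `Dataset` and `Dataset Size (GB)` are assumptions, so match them to whatever your export actually contains:

```python
import csv
import io

def large_models_from_export(csv_text, name_col="Dataset",
                             size_col="Dataset Size (GB)", threshold_gb=1.0):
    """Return (dataset, size_gb) pairs at or over the threshold,
    largest first. Column names are assumptions about the export."""
    reader = csv.DictReader(io.StringIO(csv_text))
    hits = []
    for row in reader:
        try:
            size = float(row[size_col])
        except (KeyError, ValueError):
            continue  # skip rows without a parseable size
        if size >= threshold_gb:
            hits.append((row.get(name_col, "?"), size))
    return sorted(hits, key=lambda p: p[1], reverse=True)
```

Point it at each capacity's export and concatenate the results to cover the tenant.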
No, and that size is pretty meaningless anyway as you need to consider the effects of compression.
Hi @dgwilson
In the workspace, you can open the workspace settings:
Then you can see the size of the semantic models under System storage:
It's worth mentioning that I don't see an endpoint in the REST API that returns semantic model size.
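That said, the admin REST API can at least enumerate every dataset in the tenant, which gives you the inventory to join against a size source such as the Capacity Metrics app. A minimal sketch (assumes a token with tenant-admin scope; note the response itself carries no size field, so the `sizeInBytes` key in the filter is a placeholder for whatever your joined-in size source provides):

```python
import json
import urllib.parse
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"

def list_datasets_as_admin(token, top=5000):
    """Page through GET /admin/datasets. Requires admin permissions;
    the returned items do NOT include model size."""
    datasets, skip = [], 0
    while True:
        qs = urllib.parse.urlencode({"$top": top, "$skip": skip})
        req = urllib.request.Request(
            f"{API}/admin/datasets?{qs}",
            headers={"Authorization": f"Bearer {token}"},
        )
        with urllib.request.urlopen(req, timeout=30) as resp:
            page = json.load(resp).get("value", [])
        datasets.extend(page)
        if len(page) < top:
            return datasets
        skip += top

def over_one_gb(records, size_key="sizeInBytes", threshold=1 << 30):
    """Filter joined records (inventory + size source) to >= 1 GB.
    The size_key is an assumption about your joined data."""
    return [r for r in records if r.get(size_key, 0) >= threshold]
```

With 3,000+ datasets, the paging loop matters — a single call tops out before covering the tenant.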
Best Regards
Zhengdong Xu
If this post helps, please consider accepting it as the solution to help other members find it more quickly.
That's a tough one. The standard process is to retrieve that information from the Fabric Capacity Metrics app. However, that only includes artefacts that have been active in the last 14 days. And it won't tell you the in-memory size (which ultimately determines whether the model is rejected or not). Compression is hugely dependent on the type of data and its sort order.
Start with the app but be aware of the limitations.
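For the in-memory size specifically, one option is to query the model's XMLA endpoint and sum segment sizes from the `DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS` DMV (the same data VertiPaq-analyser style tools use). A sketch — the DMV is real, but how you execute it (SSMS, DAX Studio, or an ADOMD library) and the exact shape of the returned rows will depend on your tooling:

```python
# DMV text to run against the dataset's XMLA endpoint.
SEGMENTS_DMV = (
    "SELECT DIMENSION_NAME, USED_SIZE "
    "FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS"
)

def total_in_memory_bytes(rows, size_key="USED_SIZE"):
    """Sum segment sizes from the DMV result — an approximation of
    the model's in-memory footprint, not its storage size."""
    return sum(int(r.get(size_key, 0)) for r in rows)
```

Comparing that total against 1 GB gives a closer answer than storage size, since it reflects the compressed in-memory form.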