We have around 25 workspaces, each containing several semantic models.
Some of these semantic models are classified as "Large semantic model" storage format.
I would like to get a list of the models in each workspace which are stored as "Large semantic model".
Is there an easy way to get this list, instead of manually opening the settings of each model and looking it up?
(Why do we need this list? We are in the middle of moving capacity regions, and large semantic models are not eligible to be moved across regions, so we need to know which models we will have to back up and restore.)
You can use the semantic-link-labs function list_datasets found here: https://github.com/microsoft/semantic-link-labs/blob/main/src/sempy_labs/admin/_datasets.py. Use it in a notebook. You'll have to be a Fabric admin to execute the notebook with this command. There is a field returned called Target Storage Mode, and I believe a value of PremiumFiles designates large storage format.
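For example, a minimal sketch of that approach in a Fabric notebook (the column name "Target Storage Mode" and the value PremiumFiles are taken from the description above; verify them against df.columns on your tenant):

```python
# %pip install semantic-link-labs   <- run first if the library isn't preinstalled
# Minimal sketch: run from a Fabric notebook as a Fabric admin.
from sempy_labs import admin

# Returns a pandas DataFrame of all datasets visible to the admin API
df = admin.list_datasets()

# Assumption from the answer above: "PremiumFiles" in the
# "Target Storage Mode" column indicates large semantic model format
large_models = df[df["Target Storage Mode"] == "PremiumFiles"]

display(large_models)
```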
Hope this helps,
The same information is also available in the API: Datasets - Get Dataset - REST API (Power BI Power BI REST APIs) | Microsoft Learn
Hi @balafbm,
Thank you for reaching out to the Microsoft Fabric Community Forum.
After reviewing the details you provided, here are a few workarounds that might help resolve the issue. Please follow the steps below:
Thank you, @blopez11, for your valuable input on this issue.
By combining PowerShell scripts with the Power BI REST API, you can automate the process and retrieve the necessary information efficiently, instead of checking each model's settings by hand. The REST API provides endpoints to list the datasets in each workspace together with their properties, so a script can enumerate all datasets across your workspaces and filter for those classified as Large Semantic Model format; a sketch of this approach is shown below.
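A hedged sketch of that enumerate-and-filter loop in Python using the requests library (rather than PowerShell). It assumes you have already acquired an Azure AD access token with rights to read the workspaces, and that datasets report a targetStorageMode of "PremiumFiles" when large semantic model format is enabled, per the REST API documentation linked above:

```python
# Minimal sketch, not a production script.
# Assumptions:
#  - access_token is a valid Azure AD token for the Power BI API
#    (acquisition via MSAL or azure-identity is not shown here),
#  - the caller has access to the workspaces being scanned,
#  - targetStorageMode == "PremiumFiles" marks large-format models.
import requests

BASE = "https://api.powerbi.com/v1.0/myorg"
access_token = "<paste an Azure AD access token here>"
headers = {"Authorization": f"Bearer {access_token}"}

# List all workspaces visible to the caller
workspaces = requests.get(f"{BASE}/groups", headers=headers).json()["value"]

for ws in workspaces:
    # List the datasets in each workspace and keep the large-format ones
    url = f"{BASE}/groups/{ws['id']}/datasets"
    datasets = requests.get(url, headers=headers).json()["value"]
    for ds in datasets:
        if ds.get("targetStorageMode") == "PremiumFiles":
            print(f"{ws['name']}: {ds['name']}")
```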
Please go through the following links for more information:
Large semantic models in Power BI Premium - Power BI | Microsoft Learn
Introduction to semantic models across workspaces - Power BI | Microsoft Learn
If this post helps, then please give us ‘Kudos’ and consider accepting it as a solution to help the other members find it more quickly.
Best Regards.
Hi @balafbm,
May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems solve them faster.
Thank you.
Hi @balafbm,
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.
Thank you.
Hi @balafbm,
I hope this information is helpful. Please let me know if you have any further questions or if you'd like to discuss this further. If this answers your question, please Accept it as a solution and give it a 'Kudos' so others can find it easily.
Thank you.