Hello,
When should I choose a small data model vs. a large data model in Microsoft Fabric, and what are the key differences in terms of performance, governance, and scalability?
Thanks in advance
Hello @ablarrosa8,
Small data models
By default, semantic models (datasets) are limited to 1 GB per model when hosted in shared capacity with a Pro license, without Premium, PPU, or Fabric capacity.
Large data models
By enabling the Large semantic model storage format, models can exceed 1 GB.
The maximum size is then defined by your Fabric or Premium capacity SKU (F-SKU, P-SKU, Embedded A SKU) or by the capacity admin.
The Large model storage format is enabled in the dataset settings in the Power BI/Fabric service.
For existing models, no republishing from Power BI Desktop is required—just turn the setting On in the service.
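Besides the service UI, the setting can also be flipped programmatically. Below is a minimal sketch using the Power BI REST API's Update Dataset endpoint, which accepts a `targetStorageMode` of `PremiumFiles` (large format) or `Abf` (the small default). The dataset ID and token here are placeholders; in practice you would acquire an Azure AD token for the Power BI API (e.g. via MSAL) before sending the request.

```python
# Sketch: enable the large semantic model storage format via the Power BI REST API.
# The dataset ID and token below are placeholders (assumptions), not real values.
import json
import urllib.request

def build_large_model_request(dataset_id: str, token: str) -> urllib.request.Request:
    """Build the PATCH request that switches a dataset to the large storage format."""
    url = f"https://api.powerbi.com/v1.0/myorg/datasets/{dataset_id}"
    # "PremiumFiles" = large semantic model format; "Abf" = small (default) format.
    payload = {"targetStorageMode": "PremiumFiles"}
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="PATCH",
    )

# Example (not sent here -- sending requires a real dataset ID and AAD token):
req = build_large_model_request("00000000-0000-0000-0000-000000000000", "<token>")
print(req.get_method(), req.full_url)
```

Note the change takes effect without republishing, exactly as described above; only the service-side storage format is updated.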
Small models → faster refreshes and queries, lower resource consumption.
Large models → scalable and enterprise-grade, but more resource-intensive and dependent on capacity.
Additional benefits
XMLA write performance: Even if your model is small, enabling the large format improves XMLA write operations.
Default partitions: Large models use 8M-row default partitions, consistent with Azure Analysis Services best practices.
Source: https://learn.microsoft.com/en-us/fabric/enterprise/powerbi/service-premium-large-models
In general, stick with a small model for simplicity, performance, and lower resource usage.
But if your dataset grows in volume or you need better performance at scale, activate the large semantic model format to leverage enterprise-grade features.
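The rule of thumb above can be sketched as a tiny decision helper. The 1 GB threshold is the documented shared-capacity limit; the 25 GB ceiling used as the default is only an example value for a P1/F64-class capacity (an assumption — check your actual SKU's per-model memory limit).

```python
# Minimal sketch of the small-vs-large decision rule: stay on the small (default)
# format until the model approaches the 1 GB shared-capacity limit, then enable
# the large semantic model storage format, up to the capacity's per-model limit.
def recommended_storage_format(model_size_gb: float, capacity_limit_gb: float = 25.0) -> str:
    """Suggest a storage format for a model of the given size.

    capacity_limit_gb defaults to 25 GB as an illustrative P1/F64-class limit
    (assumption) -- set it from your real SKU.
    """
    if model_size_gb < 1.0:
        return "small (default)"
    if model_size_gb <= capacity_limit_gb:
        return "large semantic model format"
    return "reduce model size or move to a bigger capacity SKU"

print(recommended_storage_format(0.4))   # small (default)
print(recommended_storage_format(5.0))   # large semantic model format
```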
Hope it can help you!
Best regards,
Antoine
Hi @ablarrosa8 ,
Thanks for reaching out to the Microsoft fabric community forum.
I would also like to take a moment to thank @AntoineW for actively participating in the forum and for the solutions you've been sharing. Your contributions make a real difference.
I hope the above details help you fix the issue. If you still have any questions or need more help, feel free to reach out. We’re always here to support you.
Best Regards,
Community Support Team