ablarrosa8
Helper I

Differences between small data model vs. a large data model in Microsoft Fabric

Hello,

 

When should I choose a small data model vs. a large data model in Microsoft Fabric, and what are the key differences in terms of performance, governance, and scalability?

 

Thanks in advance

1 ACCEPTED SOLUTION
AntoineW
Memorable Member

Hello @ablarrosa8,

 

Size limits and scalability

  • Small data models

    • By default, semantic models (datasets) are limited to 1 GB when hosted in shared capacity or with Pro/PPU without Premium/Fabric capacity.

  • Large data models

    • By enabling the Large semantic model storage format, models can exceed 1 GB.

    • The maximum size is then defined by your Fabric or Premium capacity SKU (F-SKU, P-SKU, Embedded A SKU) or by the capacity admin.
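You can check which storage format a model is currently using without opening the service UI: the Power BI REST API exposes a `targetStorageMode` property on each dataset ("Abf" for the default small format, "PremiumFiles" once the large format is on). A minimal sketch, assuming you already have a valid Azure AD access token and the workspace/dataset GUIDs (the IDs below are placeholders):

```python
import json
import urllib.request

API_ROOT = "https://api.powerbi.com/v1.0/myorg"


def dataset_url(workspace_id: str, dataset_id: str) -> str:
    """Build the REST endpoint for a dataset in a given workspace."""
    return f"{API_ROOT}/groups/{workspace_id}/datasets/{dataset_id}"


def get_storage_mode(token: str, workspace_id: str, dataset_id: str) -> str:
    """Return the dataset's targetStorageMode: 'Abf' (small) or 'PremiumFiles' (large)."""
    req = urllib.request.Request(
        dataset_url(workspace_id, dataset_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        # The Dataset object includes targetStorageMode alongside name, id, etc.
        return json.load(resp).get("targetStorageMode", "Abf")


# Example call (token and GUIDs are placeholders, not real values):
# mode = get_storage_mode(my_token, "ws-guid", "ds-guid")
```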

 

Activation and setup

  • The Large model storage format is enabled in the dataset settings in the Power BI/Fabric service.

  • For existing models, no republishing from Power BI Desktop is required—just turn the setting On in the service.
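Besides the toggle in the service, the same setting can be flipped programmatically: the Datasets - Update Dataset In Group REST call accepts `targetStorageMode`, and setting it to "PremiumFiles" enables the large format. A hedged sketch that only builds the PATCH request (GUIDs and token are placeholders; you'd still need a valid Azure AD token to actually send it):

```python
import json
import urllib.request


def enable_large_format_request(
    workspace_id: str, dataset_id: str, token: str
) -> urllib.request.Request:
    """Build the PATCH request that switches a dataset to the large storage format."""
    url = (
        "https://api.powerbi.com/v1.0/myorg"
        f"/groups/{workspace_id}/datasets/{dataset_id}"
    )
    body = json.dumps({"targetStorageMode": "PremiumFiles"}).encode()
    req = urllib.request.Request(url, data=body, method="PATCH")
    req.add_header("Authorization", f"Bearer {token}")
    req.add_header("Content-Type", "application/json")
    return req


# Sending it (placeholders, not real IDs):
# urllib.request.urlopen(enable_large_format_request("ws-guid", "ds-guid", my_token))
```

Switching back to "Abf" is only possible while the model is still at or under 1 GB, so treat the change as effectively one-way for models that have grown.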

 

Performance trade-offs

  • Small models → faster refresh, queries, and lower resource consumption.

  • Large models → scalable, enterprise-grade, but more resource-intensive and dependent on capacity.

 

Additional benefits

  • XMLA write performance: Even if your model is small, enabling the large format improves XMLA write operations.

  • Default partitions: Large models use 8M-row default partitions, consistent with Azure Analysis Services best practices.

 

Source: https://learn.microsoft.com/en-us/fabric/enterprise/powerbi/service-premium-large-models

 

In general, stick with a small model for simplicity, performance, and lower resource usage.
But if your dataset grows in volume or you need better performance at scale, activate the large semantic model format to leverage enterprise-grade features.

 

Hope it can help you!

Best regards,

Antoine


4 REPLIES
v-menakakota
Community Support

Hi @ablarrosa8,
Thanks for reaching out to the Microsoft Fabric community forum.

 

I would also like to take a moment to thank @AntoineW for actively participating in the forum and for the solutions you've been sharing. Your contributions make a real difference.

I hope the above details help you fix the issue. If you still have any questions or need more help, feel free to reach out. We’re always here to support you.

Best Regards, 
Community Support Team 


