powerbiexpert22
Impactful Individual

large semantic models limit

I was going through the document below and was unable to find the maximum size limit for large semantic models.

 

https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-large-models

 

 

1 ACCEPTED SOLUTION

Hi @powerbiexpert22 

 

Generally, the tenant / BI administrator of your organization will be able to tell you which department is using Power BI Premium capacity.

In the link below, the Max memory (GB) column lists the memory limit allowed per F SKU in Fabric.

If you do not have Microsoft Fabric yet, then for large semantic models beyond 1 GB you need a P1 capacity / F64 SKU, which allows a maximum of 25 GB of memory per model.

Without Premium capacity, you can use shared capacity, which allows a model size of only up to 1 GB.

https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-what-is

[Screenshot: sizepowerbi.png — per-SKU memory limits table]

Hope I have answered your question.

Thanks,

Pallavi
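The limits above can be sketched as a small lookup. The F64/P1 (25 GB) and shared-capacity (1 GB) figures come from this thread; the F128/F256 values are assumptions taken from the linked Microsoft doc and may change, so always check the current documentation:

```python
# Per-SKU semantic model memory limits (GB), as discussed above.
# Shared and F64 come from the thread; F128/F256 are assumptions
# from the linked Microsoft documentation and may change.
MAX_MEMORY_GB = {
    "Shared": 1,   # Pro/shared capacity: model size capped at 1 GB
    "F64": 25,     # equivalent to P1
    "F128": 50,    # assumed from the doc
    "F256": 100,   # assumed from the doc
}

def fits_in_sku(model_size_gb: float, sku: str) -> bool:
    """Return True if a model of the given size fits the SKU's memory limit."""
    return model_size_gb <= MAX_MEMORY_GB[sku]

print(fits_in_sku(10, "F64"))     # a 10 GB model fits on F64
print(fits_in_sku(10, "Shared"))  # but not in shared capacity
```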


6 REPLIES
powerbiexpert22
Impactful Individual

There should be, because the data is stored in the cloud, which means someone has to pay the price. Imagine someone keeps storing lots of data in a large semantic model; eventually someone has to pay for it, right?


Thanks a lot @pallavi_r , this is what I was looking for

Hi @powerbiexpert22 ,

 

Memory and storage are different. If you have an F64 capacity and a dataset refresh requires more than 25 GB of memory, you can expect a refresh failure. In my experience, the refresh will succeed if it does not need more memory than that and you run it during non-peak hours.

 

Thanks,

Sai Teja 
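Sai Teja's point (refresh memory, not stored size, is what hits the capacity limit) can be illustrated with a rough sketch. The "a full refresh can need roughly 2x the model size in memory" factor is a common rule of thumb, not an exact figure; actual usage depends on the model and on concurrent activity on the capacity:

```python
# Estimate whether a full refresh could exceed the F64/P1 per-model
# memory limit. The 2x refresh factor is a rule of thumb, not a
# documented guarantee -- adjust it for your own workloads.
F64_MAX_MEMORY_GB = 25  # per-model memory limit on F64/P1, from the thread

def refresh_may_fail(model_size_gb: float, refresh_factor: float = 2.0) -> bool:
    """Return True if the estimated refresh memory exceeds the F64 limit."""
    return model_size_gb * refresh_factor > F64_MAX_MEMORY_GB

print(refresh_may_fail(10))  # ~20 GB during refresh: within the 25 GB limit
print(refresh_may_fail(15))  # ~30 GB during refresh: expect a failure
```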

Hi @powerbiexpert22 ,

 

The team that owns that Premium capacity in your organisation pays for it.

Please go through this:

https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-what-is

 

In our organisation we don't allow everyone to use premium capacity or fabric features.

 

Thanks,

Sai Teja 

SaiTejaTalasila
Super User

Hi @powerbiexpert22 ,

 

I don't think we have any size limitations currently for large semantic models.

 

Thanks,

Sai Teja 
