andhiii079845
Solution Sage

Power BI Premium maximum total dataset size

Hello,

 

If I have Power BI Premium P1, I will have a maximum of 25 GB of memory.

I understand that the maximum dataset size should then be 50% of that: 12.5 GB.

But how many datasets of 12.5 GB can I have in my workspace? 2, 3, 100?

 

Thank you for your help!










9 REPLIES
andhiii079845
Solution Sage

So 20 datasets of 12 GB each in the workspace would be no problem?

But do the datasets have to be in memory all the time? Or only in the case of a refresh, and while a query is running against the dataset?









TomMartens
Super User

Hey @andhiii079845 ,

The nature of a Power BI dataset in import mode (the data is imported into the dataset) is that the data resides in memory while it is being queried, that is, while a user is interacting with it. The reason for this is simple: the analytical engine behind the power of a Power BI dataset is an in-memory columnar storage engine, also known as the VertiPaq engine (SSAS Tabular).

If the dataset is refreshed, the data has to be in memory as well; the amount of required memory can be reduced by using incremental refresh.
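To make the refresh part concrete, here is a minimal sketch (with placeholder IDs and an access token you would have to acquire yourself, e.g. via MSAL) that triggers a refresh and polls its status through the Power BI REST API:

import time
import requests

# Assumptions: placeholder workspace/dataset IDs and a pre-acquired Azure AD token.
TOKEN = "<azure-ad-access-token>"
GROUP_ID = "<workspace-id>"
DATASET_ID = "<dataset-id>"

BASE = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/datasets/{DATASET_ID}"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# Kick off an asynchronous refresh; while it runs, the refreshed data
# has to fit into the capacity's memory alongside the current copy.
requests.post(f"{BASE}/refreshes", headers=HEADERS).raise_for_status()

# Poll the latest refresh entry; "Unknown" means the refresh is still in progress.
while True:
    history = requests.get(f"{BASE}/refreshes?$top=1", headers=HEADERS)
    history.raise_for_status()
    status = history.json()["value"][0]["status"]
    if status != "Unknown":
        print(f"Refresh finished with status: {status}")
        break
    time.sleep(30)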

 

Hopefully, this provides the additional information you are looking for.

 

Regards,

Tom




andhiii079845
Solution Sage

Yes, this is clear to me. But is the 50% rule right? If I refresh the complete dataset, it has to be in memory, but the old dataset is also in memory at the moment of the refresh? So if I have 25 GB of memory, my maximum dataset size is 12.5 GB?









TomMartens
Super User

Hey @andhiii079845 ,

Yes, that is the idea: during a full refresh the old copy of the dataset stays in memory to answer queries while the new copy is being built, so both copies have to fit at the same time. This is the reason why I recommend using incremental refresh.
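As a back-of-envelope illustration (my example numbers, not official limits), a tiny sketch of that rule of thumb:

# Rule-of-thumb memory estimates for refreshing an import-mode dataset.
# Assumption: a full refresh holds the old copy plus the new copy in memory (~2x),
# while an incremental refresh only rebuilds the changed partitions.

def max_full_refresh_size(capacity_gb: float) -> float:
    """Largest dataset that can still be fully refreshed within the capacity."""
    return capacity_gb / 2

def refresh_footprint(dataset_gb: float, refreshed_fraction: float) -> float:
    """Approximate peak memory: the whole model plus the partitions being rebuilt."""
    return dataset_gb * (1 + refreshed_fraction)

capacity = 25.0  # P1-style memory limit, in GB
print(max_full_refresh_size(capacity))  # 12.5 -> the "50 %" rule
print(refresh_footprint(18.0, 1.0))     # 36.0 -> a full refresh of 18 GB does not fit
print(refresh_footprint(18.0, 0.1))     # ~19.8 -> an incremental refresh can fit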

 

Regards,

Tom




andhiii079845
Solution Sage

Thank you for your reply. So it is not possible to have four 15 GB datasets, 60 GB in total (with 25 GB of memory in total), in import mode? Is this right? I would have to use hybrid tables, for example, or DirectQuery?









TomMartens
Super User

Hey @andhiii079845 ,

It is possible when you turn on the large dataset storage format switch (https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-large-models) and start using incremental refresh.

The memory limit that comes with a P[x] capacity applies per dataset; of course, it is not possible for two large datasets (assuming 18 GB each) to be in memory at the same time on a P1.

The Power BI service is very capable of managing its resources, evicting datasets from memory when they are not being used and reloading them when required.
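If you want to flip that switch programmatically instead of in the dataset settings, here is a minimal sketch (placeholder IDs, token acquisition not shown) using the REST API's targetStorageMode property:

import requests

# Assumptions: placeholder IDs and a pre-acquired Azure AD bearer token.
TOKEN = "<azure-ad-access-token>"
GROUP_ID = "<workspace-id>"
DATASET_ID = "<dataset-id>"

url = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/datasets/{DATASET_ID}"
headers = {"Authorization": f"Bearer {TOKEN}"}

# "PremiumFiles" enables the large dataset storage format; "Abf" is the default.
response = requests.patch(url, headers=headers, json={"targetStorageMode": "PremiumFiles"})
response.raise_for_status()
print("Large dataset storage format enabled.")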

 

I recommend reading these articles:

Hopefully, this will finally help to answer all your questions.

 

Regards,

Tom



TomMartens
Super User

Hey @andhiii079845 ,

 

There is a storage limit of 100 TB per capacity. You will find the details in this article: https://learn.microsoft.com/en-us/power-bi/admin/service-admin-manage-your-data-storage-in-power-bi?...

Hopefully, this answers your question.

Regards,

Tom




