andhiii079845
Super User

Power BI Premium: maximum total dataset size

Hello,

 

If I have Power BI Premium P1, I get a maximum of 25 GB of memory.

I understand that the maximum dataset size should then be 50% of that: 12.5 GB.

But how many 12.5 GB datasets can I have in my workspace? 2, 3, 100?

 

Thank you for your help!










9 REPLIES
andhiii079845
Super User

So 20 datasets of 12 GB each in the workspace would be no problem?

But do the datasets have to be in memory all the time? Or only during a refresh, or while a query is running against the dataset?









Hey @andhiii079845,

 

The nature of a Power BI dataset in import mode (the data is imported into the dataset) is that the data resides in memory whenever it is queried, meaning whenever a user is interacting with the data. The reason for this is simple: the analytical engine behind a Power BI dataset is an in-memory columnar storage engine, also known as the VertiPaq engine (SSAS Tabular).

 

When the dataset is refreshed, the data has to be in memory as well; the amount of required memory can be reduced by using incremental refresh.
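To make the memory math concrete, here is a rough Python sketch. It is illustrative only, under the simplified assumption that during a full refresh the old copy of the dataset stays queryable in memory while the new copy is built next to it; 25 GB is the P1 per-dataset memory limit, and the partition fraction is made up:

P1_LIMIT_GB = 25.0  # per-dataset memory limit on a P1 capacity

def refresh_peak_gb(dataset_gb: float, refreshed_fraction: float = 1.0) -> float:
    # Existing copy plus the portion currently being rebuilt.
    return dataset_gb + dataset_gb * refreshed_fraction

full = refresh_peak_gb(12.5)                 # 25.0 GB -> just fits on a P1
too_big = refresh_peak_gb(20.0)              # 40.0 GB -> exceeds the limit
incremental = refresh_peak_gb(20.0, 1 / 36)  # ~20.6 GB -> one month of 36 rebuilt

for label, gb in [("full 12.5 GB", full), ("full 20 GB", too_big),
                  ("incremental 20 GB", incremental)]:
    print(f"{label}: peak ~{gb:.1f} GB, fits on P1: {gb <= P1_LIMIT_GB}")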

 

Hopefully, this provides the additional information you are looking for.

 

Regards,

Tom




Yes, this is clear to me. But is the 50% rule right? If I refresh the complete dataset, it has to be in memory, but the old dataset is also in memory at the moment of the refresh. So if I have 25 GB of memory, my maximum dataset size is 12.5 GB?









Hey @andhiii079845,

 

Yes, during a full refresh the old copy stays in memory while the new one is built; this is exactly why I recommend using incremental refresh.
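As a concrete illustration of what incremental refresh buys you operationally, here is a minimal Python sketch using the Power BI enhanced refresh REST API to process only selected partitions instead of the whole model. It assumes you already have an Azure AD access token with dataset write permissions; the IDs and the table/partition names are placeholders:

import requests

TOKEN = "<aad-access-token>"   # assumed to be acquired elsewhere
GROUP_ID = "<workspace-id>"    # placeholder
DATASET_ID = "<dataset-id>"    # placeholder

url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
       f"/datasets/{DATASET_ID}/refreshes")

body = {
    "type": "full",
    # Only the named partition is processed; untouched partitions keep
    # their data, so the whole model is never rebuilt in memory at once.
    "objects": [{"table": "Sales", "partition": "Sales-2023-11"}],
}

resp = requests.post(url, json=body,
                     headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()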

 

Regards,

Tom




Thank you for your reply. So it is not possible to have four 15 GB datasets, 60 GB in total, in import mode (with 25 GB of memory in total)? Is this right? Would I have to use, for example, hybrid tables or DirectQuery?









Hey @andhiii079845,

 

It is possible when you turn on the large dataset storage format switch (https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-large-models) and start using incremental refresh.

The memory limit that comes with a P[x] capacity applies per dataset; of course, it is not possible for two large datasets (say, 18 GB each) to be in memory at the same time on a P1.

The Power BI service is very capable of managing its resources: it evicts datasets from memory when they go idle and reloads them when required.
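For completeness, the large dataset storage format switch can also be flipped programmatically. A minimal Python sketch against the Power BI REST API's update-dataset endpoint, assuming an Azure AD access token; the IDs are placeholders:

import requests

TOKEN = "<aad-access-token>"   # assumed to be acquired elsewhere
GROUP_ID = "<workspace-id>"    # placeholder
DATASET_ID = "<dataset-id>"    # placeholder

url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
       f"/datasets/{DATASET_ID}")

# "PremiumFiles" is the large dataset storage format; "Abf" is the default.
resp = requests.patch(url, json={"targetStorageMode": "PremiumFiles"},
                      headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()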

 

I recommend reading these articles:

Hopefully, this finally answers all your questions.

 

Regards,

Tom



TomMartens
Super User

Hey @andhiii079845 ,

 

There is a storage limit of 100 TB per capacity. You will find the details in this article: https://learn.microsoft.com/en-us/power-bi/admin/service-admin-manage-your-data-storage-in-power-bi?...
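Purely illustrative arithmetic: since the 100 TB limit is about total storage rather than a dataset count, the number of 12.5 GB models that would fit on one capacity is simply:

capacity_gb = 100 * 1024  # 100 TB capacity storage limit, in GB
dataset_gb = 12.5         # the size from the original question
print(int(capacity_gb / dataset_gb))  # 8192 -> far more than 2, 3, or 100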

Hopefully, this answers your question.

Regards,

Tom




