Hello,
if I have Power BI Premium P1, I will have a maximum of 25 GB of memory.
I understand that the maximum dataset size should be 50% of that: 12.5 GB.
But how many datasets of 12.5 GB can I have in my workspace? 2, 3, 100?
Thank you for your help!
Proud to be a Super User!
Hey @andhiii079845 ,
there is a storage limit of 100 TB per capacity. You can find the details in this article: https://learn.microsoft.com/en-us/power-bi/admin/service-admin-manage-your-data-storage-in-power-bi?...
Hopefully, this answers your question.
Regards,
Tom
So 20 datasets of 12 GB each in one workspace would be no problem?
But I think the dataset always has to be in memory first? Or only during a refresh, and while a query is running against the dataset?
Proud to be a Super User!
Hey @andhiii079845 ,
the nature of a Power BI dataset in import mode (the data is imported into the dataset) is that the data resides in memory while it is being queried, meaning while a user is interacting with the data. The reason for this is simple: the analytical engine behind the power of a Power BI dataset is an in-memory columnar storage engine, also known as the VertiPaq engine (SSAS Tabular).
When the dataset is refreshed, the data has to be in memory as well; the amount of required memory can be reduced by using incremental refresh.
Hopefully, this provides the additional information you are looking for.
Regards,
Tom
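As a side note on how incremental refresh keeps the memory footprint down: it splits a table into partitions and only processes the recent ones, and the enhanced refresh REST API exposes the same partition-level control directly. Below is a minimal sketch in Python; the workspace, dataset, table, and partition names and the token are placeholders, not taken from this thread.

```python
# Minimal sketch: trigger an enhanced refresh that processes only one
# partition of one table, instead of rebuilding the whole dataset in
# memory. All IDs, names, and the token below are placeholders.
import requests

ACCESS_TOKEN = "<Azure AD bearer token for the Power BI REST API>"  # placeholder
WORKSPACE_ID = "<workspace id>"  # placeholder
DATASET_ID = "<dataset id>"      # placeholder

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/datasets/{DATASET_ID}/refreshes"
)

# Refresh only the newest partition of a hypothetical 'Sales' table;
# older partitions keep their data and are not reprocessed.
body = {
    "type": "full",
    "objects": [{"table": "Sales", "partition": "Sales-2023"}],
}

response = requests.post(
    url, json=body, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"}
)
response.raise_for_status()  # expect 202 Accepted; the refresh runs asynchronously
print(response.status_code)
```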
Yes, this is clear to me. But is this 50% rule right? If I refresh the complete dataset, it has to be in memory. But the old dataset is also in memory at the moment of the refresh? So if I have 25 GB of memory, my maximum dataset size is 12.5 GB?
Proud to be a Super User!
Hey @andhiii079845 ,
yes, during a full refresh the old copy of the dataset stays in memory alongside the new one; this is the reason why I recommend using incremental refresh.
Regards,
Tom
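To make the 50% rule-of-thumb concrete: during a full refresh the engine keeps the old copy of the dataset queryable while it builds the new copy, so both copies occupy capacity memory at the same time. Here is a back-of-envelope sketch; treating the overhead as exactly double is a simplification, since the engine also needs extra working memory for refresh and queries.

```python
# Back-of-envelope model of the "50% rule": during a full refresh the old
# copy of an import-mode dataset stays queryable while the new copy is
# built, so the dataset briefly needs roughly twice its size in memory.
P1_MEMORY_GB = 25.0  # memory limit discussed in this thread

def fits_during_full_refresh(dataset_gb: float, capacity_gb: float = P1_MEMORY_GB) -> bool:
    """Old copy + new copy must fit into capacity memory at the same time."""
    return 2 * dataset_gb <= capacity_gb

for size_gb in (10.0, 12.5, 15.0):
    verdict = "fits" if fits_during_full_refresh(size_gb) else "does not fit"
    print(f"{size_gb} GB dataset on P1: full refresh {verdict}")
# prints:
#   10.0 GB dataset on P1: full refresh fits
#   12.5 GB dataset on P1: full refresh fits
#   15.0 GB dataset on P1: full refresh does not fit
```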
Thank you for your reply. So, it is not possible to have 4 datasets of 15 GB each, 60 GB in total, in import mode (with 25 GB of memory in total)? Is this right? I would have to use hybrid tables, for example, or DirectQuery?
Proud to be a Super User!
Hey @andhiii079845 ,
it is possible when you turn the large dataset storage format switch on (https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-large-models) and start using incremental refresh.
The memory limit that comes with a P[x] capacity applies per dataset; of course it's not possible for two large datasets (assuming 18 GB each) to be in memory at the same time on a P1.
The Power BI service is very capable at managing its resources: it evicts datasets from memory when they sit idle and reloads them when they are needed again.
I also recommend reading the articles linked above.
Hopefully, this finally answers all your questions.
Regards,
Tom
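The large-models article linked in the reply above also describes enabling the large dataset storage format programmatically, by setting the dataset's targetStorageMode property to PremiumFiles. The sketch below assumes the Datasets - Update Dataset In Group REST endpoint; the IDs and token are placeholders, so verify the request shape against the linked documentation before relying on it.

```python
# Minimal sketch: switch one dataset to the large dataset storage format by
# setting targetStorageMode to "PremiumFiles" (the small format is "Abf"),
# as described in the linked article. IDs and token are placeholders.
import requests

ACCESS_TOKEN = "<Azure AD bearer token for the Power BI REST API>"  # placeholder
WORKSPACE_ID = "<workspace id>"  # placeholder
DATASET_ID = "<dataset id>"      # placeholder

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/datasets/{DATASET_ID}"
)

response = requests.patch(
    url,
    json={"targetStorageMode": "PremiumFiles"},
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
response.raise_for_status()
print("targetStorageMode set to PremiumFiles")
```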