NewUser777
Resolver I

Max Model size limit in Premium Gen2

I want to know the max model size limit. Below are the specs for Gen2 Premium capacity:

 

| Capacity SKU | V-cores | Max memory (GB) | DirectQuery/Live connections (per second) | Max memory per query (GB) | Model refresh parallelism | Dataflow parallel tasks | Max concurrent pages |
|---|---|---|---|---|---|---|---|
| P1/A4 | 8 | 25 | 30 | 6 | 40 | 32 | 55 |
1 ACCEPTED SOLUTION
djurecicK2
Super User

Hi @NewUser777 ,

Yes, the limit for a single model is half of the listed size (assuming you want to be able to refresh the data).

 

  • Refreshing the dataset - The second action is refreshing the dataset after it's loaded into the memory. The refresh operation will cause the memory used by the dataset to double. The required memory doubles because the original copy of data is still available for active queries, while another copy is being processed by the refresh. Once the refresh transaction commits, the memory footprint will reduce.
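The rule of thumb above can be sketched in a few lines of Python. This is just my back-of-the-envelope illustration, not an official formula; the 25 GB figure comes from the P1/A4 row of the capacity table in the question, and the function name is made up for this example.

```python
# Rough sketch: a full refresh temporarily holds two copies of the model
# in memory (the active copy plus the one being processed), so the largest
# model that can still be refreshed is about half the SKU's max memory.

GEN2_MAX_MEMORY_GB = {  # "Max memory (GB)" column from the capacity table
    "P1/A4": 25,
}

def max_refreshable_model_gb(sku: str) -> float:
    """Largest model size that leaves room for the refresh-time second copy."""
    return GEN2_MAX_MEMORY_GB[sku] / 2

print(max_refreshable_model_gb("P1/A4"))  # 12.5
```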

View solution in original post

6 REPLIES
alohaes
Helper I

Hi, good day to you. I am also looking for the Gen2 specs, specifically max concurrent pages per SKU, but I am unable to find it in the documentation. Could you please share where you found it?

djurecicK2
Super User

You're welcome @NewUser777 . I wish this was documented a little more clearly.


Thanks @djurecicK2 for confirming. Appreciate your help 

blopez11
Super User

For a single dataset (model), it is 25 GB. But that includes all the memory needed to handle various simultaneous operations on the dataset (loading into memory, refresh, user interactions with reports resulting in queries back to the model, etc.). Please see https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-gen2-what-is#dataset-memory-al... as it might shed more light.

 

Thanks,

Thanks. I have also found that Premium Gen2 (P1) doesn't require cumulative memory limits:

https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-gen2-what-is#refreshes

 

Premium Gen2 and Embedded Gen 2 don't require cumulative memory limits, and therefore concurrent dataset refreshes don't contribute to resource constraints. 

@blopez11 

Hence the question: for the data model alone, excluding the other operations, how much data can it hold at most?

 

Also, if I go by the documentation and the statement:

Three separate actions determine the amount of memory attributed to the original dataset, which may be larger than two times the dataset size. The total amount of memory used by one Power BI item can't exceed the SKU's Max memory per dataset allocation.

Then for every 1 GB of data there will be 2 GB in memory during refresh, hence the practical maximum you can plan for is about 12 GB in P1.
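The arithmetic above can be sketched as follows. Note this is only my illustration of the reasoning in this thread: the 25 GB comes from the P1/A4 capacity table, the refresh doubling comes from the quoted docs, and the 1 GB of query headroom is an assumption I made up to show why the practical limit lands near 12 GB rather than exactly 12.5 GB.

```python
# Sketch: practical model-size limit on a Gen2 SKU, allowing for the
# refresh-time second copy plus some headroom for concurrent report queries.
# The default 1 GB headroom is an illustrative assumption, not a documented number.

def practical_model_limit_gb(max_memory_gb: float, query_headroom_gb: float = 1.0) -> float:
    """Model size that fits twice (refresh copy) after reserving query headroom."""
    return (max_memory_gb - query_headroom_gb) / 2

print(practical_model_limit_gb(25))  # 12.0 -> roughly "12 GB in P1"
```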
