Anonymous
Not applicable

In Premium Gen2, getting error: database exceeds the maximum size limit on disk

Hi Community,

 

I'm getting an error when trying to refresh my Power BI dataset from the Service on a Premium P1 capacity with Gen2 enabled.

 

The dataset is around 12 GB (the Power BI file itself is only a few MB, as it contains just a subset of the data and the source is parameterized).

 

I have two workspaces (UAT and PROD, both on the same capacity); for one the refresh works smoothly, and for the other I'm getting this error.

 

Any help would be very much appreciated.

 

Thanks

4 REPLIES
selimovd
Super User

Hey @Anonymous ,

 

What exactly is your data source?

Try to use a relational database such as SQL Server. If that is already the case, try to filter out the big chunks as the first step of your transformation. Does query folding work to filter out the big chunks?
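The reduce-early idea in this reply can be sketched outside of Power Query. The example below is only a Python analogy (the table and the filter are made up): it shows why a filter that "folds" to the source materializes far fewer rows on the Power BI side than one applied after loading everything.

```python
# Simulated source table: 1,000,000 rows, only 1% relevant.
def source_rows():
    for i in range(1_000_000):
        yield {"id": i, "region": "EU" if i % 100 == 0 else "US"}

# Filter AFTER loading: the full table is materialized first
# (this is what happens when a transformation step does not fold).
loaded = list(source_rows())                      # 1,000,000 rows in memory
late = [r for r in loaded if r["region"] == "EU"]

# Filter AT the source (what query folding achieves): only the
# matching rows are ever materialized.
early = [r for r in source_rows() if r["region"] == "EU"]

print(len(loaded), len(late), len(early))  # 1000000 10000 10000
```

The result is the same 10,000 rows either way; the difference is that the non-folding path holds the full million rows in memory first.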

 

If you need any help please let me know.
If I answered your question I would be happy if you could mark my post as a solution ✔️ and give it a thumbs up 👍
 
Best regards
Denis
 
Anonymous
Not applicable

Hi, thanks for your reply.

 

I'm using Parquet files stored in a data lake as the source, so there is no query folding. This works fine in one workspace, but in the other I get this issue: 'Database xxx exceeds the maximum size limit on disk'.

 

I saw the 12 GB limitation in import mode, but doesn't Gen2 remove this limitation?

Hi @Anonymous ,

 

Has your problem been solved?

 

If the problem is still not resolved, please provide the detailed error information or describe the result you expect. Looking forward to your reply.

 

Best Regards,
Winniz

Hi @Anonymous ,

 

Have you enabled "Large dataset storage format" for your dataset? Is your workspace assigned to a Premium capacity that supports the large dataset storage format?
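Besides the dataset settings page, the storage format can also be switched programmatically with the Power BI REST API by PATCHing the dataset's `targetStorageMode` to `PremiumFiles`. A minimal sketch, assuming the workspace GUID, dataset GUID, and access token below are placeholders you replace with your own:

```python
import json
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def build_storage_mode_request(group_id: str, dataset_id: str, token: str):
    """Build the PATCH request that switches a dataset to the
    large dataset storage format (targetStorageMode = PremiumFiles)."""
    url = f"{API_BASE}/groups/{group_id}/datasets/{dataset_id}"
    body = json.dumps({"targetStorageMode": "PremiumFiles"}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        method="PATCH",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

# Placeholder IDs -- replace with real workspace/dataset GUIDs and an
# Azure AD access token before actually sending the request.
req = build_storage_mode_request("<workspace-guid>", "<dataset-guid>", "<token>")
print(req.get_method(), req.full_url)
# urllib.request.urlopen(req)  # uncomment to send the request
```

Note that once a dataset is switched to `PremiumFiles` it can only be hosted on a Premium capacity.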

 

Then please check whether your dataset size exceeds the capacity's maximum offline dataset size. This is the compressed size on disk; the default value is set by the SKU, and the allowable range is 0.1 – 10 GB.

 

For a full refresh, at least double the current dataset size is required in memory. Please monitor the memory metrics to determine whether there is sufficient memory.
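The two checks above can be sketched with the numbers from this thread. The 12 GB dataset size and the 10 GB offline limit come from the posts above; the 25 GB of capacity memory is an assumption based on a P1 SKU, and the helper function is made up for illustration:

```python
def refresh_headroom(dataset_gb: float, max_offline_gb: float, capacity_mem_gb: float):
    """Apply the two rules of thumb from this thread:
    1) the compressed dataset must fit under the max offline dataset size;
    2) a full refresh needs roughly twice the dataset size in memory."""
    fits_on_disk = dataset_gb <= max_offline_gb
    refresh_mem_needed_gb = 2 * dataset_gb
    fits_in_memory = refresh_mem_needed_gb <= capacity_mem_gb
    return fits_on_disk, refresh_mem_needed_gb, fits_in_memory

# The 12 GB dataset from the question against a 10 GB offline limit
# and an assumed 25 GB of P1 capacity memory:
on_disk, needed_gb, in_mem = refresh_headroom(12.0, 10.0, 25.0)
print(on_disk, needed_gb, in_mem)  # False 24.0 True
```

On these numbers the refresh would have enough memory headroom, but the dataset already exceeds the 10 GB offline limit, which matches the error in the question.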

 

For managing and optimizing Premium capacities, please refer to:

Premium capacity scenarios

Optimizing Premium capacities

 

 

Best Regards,
Winniz

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

 

 
