max_bradley
Frequent Visitor

Report size issues

When I try to refresh the Semantic Model behind a report, I get the following error:

"Unable to save the changes since the new dataset of size 1484 MB exceeds the limit of 1024 MB. If the dataset is hosted on a PowerBI Premium capacity, you may also reach out to your capacity administrator to see if the limit can be adjusted."

 

However, when I view the report size (Cog icon > Manage group storage) it shows the report as being 386 MB. The report seems to work fine, with data that is largely current. How can the refresh, which will only be pulling through a small proportion of additional data, increase the size from 386 MB to 1484 MB?

 

Assuming that the dataset really is that big, I've seen this page with recommendations on how to slim down the size of the data. How can I see which of my tables are the largest, so that I have a chance of making impactful changes?

1 ACCEPTED SOLUTION
Anonymous
Not applicable

Hi, @max_bradley 

Sometimes, the data model expands during a refresh, perhaps due to the addition of rows, columns, or changes to the data type. This can result in the dataset size exceeding the initial size you see in the storage settings. During the refresh process, Power BI might temporarily use more storage space to process the incoming data before finalizing the dataset. This can sometimes lead to the error you see. You can check the following posts:

Solved: Power BI dataset size limit - Microsoft Fabric Community

 

Solved: Power BI Size Limitation / Maximum - Microsoft Fabric Community

 

To identify which tables contribute the most to the dataset size, you can connect DAX Studio to your Power BI model and use the View Metrics feature to get more information about table size and column cardinality.
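If you prefer a query to the UI, the same statistics come from the engine's dynamic management views (DMVs). As a minimal sketch, you could run something like the following in DAX Studio against your model; treat the exact column set as an assumption, since it can vary by engine version:

-- Per-column dictionary size, largest first.
-- DIMENSION_NAME is the table name; the BASIC_DATA filter hides
-- internal row-number and hierarchy columns.
SELECT DIMENSION_NAME, COLUMN_ID, DICTIONARY_SIZE
FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMNS
WHERE COLUMN_TYPE = 'BASIC_DATA'
ORDER BY DICTIONARY_SIZE DESC

High-cardinality text columns usually dominate dictionary size, so they are a good first target.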

Alternatively, use DAX Studio to export your Power BI model as a .vpax file, then open the .vpax file in VertiPaq Analyzer to get detailed information about table and column sizes.
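If you'd rather stay in a query window than export a .vpax, a rough equivalent is the segment-level DMV below (a sketch only; the column names are assumed from the documented DMV schema). Summing USED_SIZE per table approximates the table sizes VertiPaq Analyzer reports; DMV queries don't support GROUP BY, so do the summing in the results grid or in Excel.

-- In-memory size of each column segment.
-- DIMENSION_NAME is the table name; USED_SIZE is in bytes.
SELECT DIMENSION_NAME, COLUMN_ID, RECORDS_COUNT, USED_SIZE
FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS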

For more information about VertiPaq Analyzer, see the following link:

Data Model Size with VertiPaq Analyzer - SQLBI

 

Regarding reducing the dataset size, you can remove unnecessary columns and keep only the ones needed for your analysis, apply a filter to reduce the number of rows you import into Power BI, or, if you don't need detailed data, aggregate it at a higher level of granularity (see the sketch below).
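For the aggregation idea, here is a purely illustrative DAX sketch you can run in DAX Studio to preview what a higher-granularity table would look like before you rebuild the import at that grain in Power Query or at the source. The Sales and Date table and column names are hypothetical, not taken from your model:

-- Hypothetical model: collapses row-level Sales to one row per year and month.
EVALUATE
SUMMARIZECOLUMNS (
    'Date'[Year],
    'Date'[Month],
    "Total Amount", SUM ( Sales[Amount] )
)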

 

How to Get Your Question Answered Quickly

Best Regards

Yongkang Hua

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.


2 REPLIES

lbendlin
Super User

You need to keep in mind that these sizes are impacted by the compression level of the data. As of now, the 1484 MB is the uncompressed size in memory, not the compressed size on disk. This may change in the future.

 

Use DAX Studio's View Metrics (formerly known as VertiPaq Analyzer) to see your table statistics.
