When I try to refresh the Semantic Model behind a report, I get the following error:
"Unable to save the changes since the new dataset of size 1484 MB exceeds the limit of 1024 MB. If the dataset is hosted on a PowerBI Premium capacity, you may also reach out to your capacity administrator to see if the limit can be adjusted."
However, when I view the report size (cog icon > Manage group storage) it shows the report as being 386 MB. The report seems to work fine, with data that is largely current. How can the refresh, which will only pull through a small proportion of additional data, increase the size from 386 MB to 1,484 MB?
Assuming that the dataset really is that big, I've seen this page with recommendations on how to slim down the size of the data. How can I see which of my tables are the largest, so I have a chance of making impactful changes?
Hi, @max_bradley
Sometimes the data model expands during a refresh, perhaps due to added rows or columns or changes to data types, so the dataset can end up larger than the size you see in the storage settings. Also, the size limit applies to the model in memory, and during a refresh Power BI keeps the existing model loaded while it processes the incoming data, so memory use can temporarily approach twice the model size before the refresh is finalized. This is often what triggers the error you see. You can check the following posts:
Solved: Power BI dataset size limit - Microsoft Fabric Community
Solved: Power BI Size Limitation / Maximum - Microsoft Fabric Community
To identify which tables contribute the most to the dataset size, you can connect DAX Studio to your Power BI model and use the View Metrics feature to see table sizes and column cardinality.
Alternatively, use DAX Studio to export your Power BI model as a .vpax file and open it with VertiPaq Analyzer for detailed information about table and column sizes.
For more information about VertiPaq Analyzer, you can see the following links:
Data Model Size with VertiPaq Analyzer - SQLBI
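If you would rather query the model than use the UI, something like the sketch below can be run from DAX Studio or the DAX query view. It is only a sketch and assumes an engine recent enough to expose the DAX INFO functions; the column names ([TABLE_ID], [COLUMN_ID], [USED_SIZE]) mirror the underlying DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS DMV.

```
// Sketch, not the official View Metrics output: lists VertiPaq segment
// storage per column, largest first, so the heaviest tables stand out.
// USED_SIZE covers segment data only; dictionary and hierarchy sizes
// (which View Metrics does include) are reported separately.
EVALUATE
SELECTCOLUMNS (
    INFO.STORAGETABLECOLUMNSEGMENTS (),
    "Table", [TABLE_ID],
    "Column", [COLUMN_ID],
    "Used size (bytes)", [USED_SIZE]
)
ORDER BY [Used size (bytes)] DESC
```

View Metrics is still the quicker route, because it rolls data, dictionary, and hierarchy sizes up into a per-table total for you.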
To reduce the dataset size, remove unnecessary columns and keep only the ones needed for your analysis, or apply a filter to reduce the number of rows you import into Power BI. If you don't need detailed data, aggregate it at a higher level of granularity.
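As a concrete illustration of deciding which columns to drop, the query below checks the cardinality of a couple of columns; 'Sales', [TransactionID], and [TransactionTime] are hypothetical names, not columns from the model in this thread. Columns whose distinct count approaches the row count (GUIDs, datetime values down to the second) compress poorly and are usually the best candidates to remove, round, or split.

```
// Hypothetical example: substitute your own table and column names.
// A distinct count close to the row count flags a poorly compressing column.
EVALUATE
ROW (
    "Rows in Sales", COUNTROWS ( 'Sales' ),
    "Distinct TransactionID values", DISTINCTCOUNT ( 'Sales'[TransactionID] ),
    "Distinct TransactionTime values", DISTINCTCOUNT ( 'Sales'[TransactionTime] )
)
```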
Best Regards
Yongkang Hua
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Keep in mind that these sizes are affected by how well the data compresses. As of now, the 1,484 MB is the uncompressed size in memory, not the compressed size on disk; this may change in the future.
Use DAX Studio's View Metrics (formerly known as VertiPaq Analyzer) to see your table statistics.