Hi,
Does system storage affect PBI query loading/refreshing time?
If so, how can we clean it up?
One of them is over 6 GB, but its .pbix is only around 300 MB.
FYI, I've already cleared the cache/data.
I would suggest optimizing your data models, especially that very large one. :)
There are several reasons why semantic models get inflated in the Power BI service; one of the major ones is:
Columnar Storage Expansion. Power BI Service converts your model into a highly optimized VertiPaq columnar database. While this format is compressed for query performance, metadata (dictionaries, hierarchies, relationships) can inflate storage by 2–3× compared to the desktop file.
There are other contributing factors as well, such as data partitioning.
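To make the cardinality point concrete, here is a rough sketch of dictionary (columnar) encoding, the principle behind VertiPaq compression. This is not the actual engine, just an illustration of why distinct-value counts drive model size:

```python
# Rough illustration of dictionary (columnar) encoding, the principle
# behind VertiPaq compression. This is NOT the actual engine -- just a
# sketch of why cardinality drives model size.

def dictionary_encode(column):
    """Replace each value with an integer index into a value dictionary."""
    dictionary = {}
    encoded = []
    for value in column:
        if value not in dictionary:
            dictionary[value] = len(dictionary)
        encoded.append(dictionary[value])
    return dictionary, encoded

# Low-cardinality column: the dictionary stays tiny, so it compresses well.
status = ["Open", "Closed", "Open", "Open", "Closed"] * 1000
status_dict, _ = dictionary_encode(status)

# High-cardinality column (free text, GUIDs): the dictionary is as large
# as the data itself, so dictionary encoding gains almost nothing.
comments = [f"comment-{i}" for i in range(5000)]
comment_dict, _ = dictionary_encode(comments)

print(len(status_dict), len(comment_dict))  # 2 5000
```

A 5,000-row status column needs only a 2-entry dictionary, while a 5,000-row free-text column needs 5,000 entries, which is why free-text and GUID columns bloat a model.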
To reduce the size of your model, you would need to optimize it in Power BI Desktop with measures like the ones below:
- Remove unused columns, hide unused tables.
- Use integer/date keys instead of strings.
- Avoid high-cardinality columns (e.g., free-text).
- Switch to DirectQuery for large tables if feasible.
These are fairly general approaches; for detailed guidance and best practices I'd recommend SQLBI:
https://www.sqlbi.com/articles/optimizing-high-cardinality-columns-in-vertipaq/
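One common cardinality-reduction trick, discussed in that SQLBI article, is splitting a datetime column into separate date and time columns. A minimal sketch of the idea (the data here is synthetic):

```python
from datetime import datetime

def split_datetime(values):
    """Split datetime values into separate date and time columns."""
    dates = [v.date() for v in values]
    times = [v.time() for v in values]
    return dates, times

# 2,400 synthetic timestamps spread over 28 days x 24 hours.
stamps = [datetime(2024, 1, d % 28 + 1, h, 30) for d in range(100) for h in range(24)]
dates, times = split_datetime(stamps)

# One 672-distinct-value column becomes a 28-value column plus a
# 24-value column, which dictionary-encodes far more efficiently.
print(len(set(stamps)), len(set(dates)), len(set(times)))  # 672 28 24
```

The split replaces one high-cardinality dictionary with two small ones, which is why this technique shrinks imported models noticeably.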
Hi @jeongkim ,
Thank you for reaching out to the Microsoft Community Forum.
System storage refers to the space used by datasets, semantic models, reports, and other artifacts within a workspace. It is split into Owned storage (items you created) and Shared storage (items shared with you).
Question: Does System Storage Affect Query Loading/Refreshing Time?
Solution: Yes. Large semantic models consume more memory and processing power during refreshes and query execution, and complex models with many relationships or calculated columns/measures can slow performance further. Storage limits do not directly slow refreshes, but exceeding them can affect the workspace and lead to refresh failures.
Please try the steps below to fix the issue.
1. Remove unused tables, and prefer measures over calculated columns.
2. Enable incremental refresh to avoid full data loads each time.
3. Break large models into smaller ones.
4. Delete unused datasets or reports, and archive older versions externally.
5. If a large model is shared across workspaces, consider centralizing it and using live connections.
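To make the incremental-refresh idea in step 2 concrete, here is a minimal sketch: keep historical rows as-is and reload only the rows inside a recent refresh window. The function and data shape are illustrative assumptions, not the Power BI implementation:

```python
from datetime import date, timedelta

# Minimal sketch of incremental refresh: history outside the refresh
# window is kept untouched; only rows inside the window are reloaded
# from the source. Illustrative only -- not the Power BI internals.

def incremental_refresh(existing_rows, source_rows, window_days, today):
    cutoff = today - timedelta(days=window_days)
    kept = [r for r in existing_rows if r["date"] < cutoff]   # untouched history
    fresh = [r for r in source_rows if r["date"] >= cutoff]   # reloaded window
    return kept + fresh

existing = [{"date": date(2024, 1, d), "amount": 10} for d in (1, 5, 8)]
source = [{"date": date(2024, 1, d), "amount": 99} for d in (8, 9)]

result = incremental_refresh(existing, source, window_days=3, today=date(2024, 1, 10))
print(len(result))  # 4: two historical rows kept, two recent rows reloaded
```

Only the last few days of data cross the wire on each refresh, which is why incremental refresh cuts both refresh time and failure risk for large tables.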
I hope this information helps. Please do let us know if you have any further queries.
Regards,
Dinesh
Hi @jeongkim ,
We haven't heard back from you since the last response and wanted to check whether you have a resolution yet. If you have any further questions, do let us know.
Thank you.
System storage in Power BI refers to temporary files, cache, and other data that Power BI stores locally to improve performance. These files help speed up report loading and interactions, but they can accumulate over time and potentially slow Power BI down.
Does System Storage Affect Query Loading/Refreshing Time?
Yes, excessive system storage can impact:
Performance: A bloated cache can slow down Power BI Desktop when opening, refreshing, or interacting with reports.
Memory Usage: Cached data consumes RAM, leading to slower operations.
Refresh Issues: Corrupted cache files may cause refresh failures or errors.
How to clean up system storage in Power BI: clear the cache in Power BI Desktop.
Go to File -> Options and Settings -> Options -> Global -> Data Load
Click "Clear Cache" to remove temporary query data.