Hello Community,
I have a master model that includes all of my company's data for internal use. This model is ~500 MB in size, with tables imported from Google BigQuery.
I have created a new model containing just the two fact tables from that master model. It holds maybe 5-10% more rows because the master model refreshes infrequently; otherwise it is exactly the same data.
The problem is that the new model comes out at ~1 GB; in fact, I am less than 100 KB away from breaking the 1 GB limit for Pro users.
I have no idea how this can happen. It's the same data, and the model taking up 1 GB unquestionably contains less of it.
The 1 GB model doesn't even have measures, visualizations, etc., while the smaller one has 100+ pages, several hundred measures, and many dimension tables.
I need to find a solution; otherwise I will lose the ability to upload the model once it breaks 1 GB.
Has anyone experienced something similar?
EDIT: Any tips or thoughts on how to debug this would also be welcome.
I was a bit too hasty making this post; the issue was a single column contributing over 600 MB. I identified it using VertiPaq Analyzer in Tabular Editor 3 and was able to single out the offending column.
Sorry for any inconvenience to anyone who might have tried looking into this.
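For anyone debugging a similar size blowup without Tabular Editor, here is a minimal sketch of the same check using a raw DMV query, assuming you can point DAX Studio (or any Analysis Services client) at the local Power BI Desktop model. The DISCOVER_STORAGE_TABLE_COLUMNS schema rowset reports per-column dictionary sizes, which is typically where a single high-cardinality column eats hundreds of MB:

```sql
-- Per-column dictionary sizes (bytes) for the loaded model.
-- DMV queries only support a restricted SQL subset: no ORDER BY,
-- so sort the DICTIONARY_SIZE column in the client instead.
SELECT
    DIMENSION_NAME,   -- table name
    COLUMN_ID,        -- column name
    DICTIONARY_SIZE   -- dictionary size in bytes
FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMNS
WHERE COLUMN_TYPE = 'BASIC_DATA'
```

Sorting the output by DICTIONARY_SIZE usually makes the culprit obvious: a high-cardinality column (e.g. a GUID, timestamp at full precision, or free-text field) compresses poorly, so removing it, or reducing its cardinality before import, is generally the fix.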