Hello Community,
I have a master model that includes all of my company's data for internal use. The model is ~500 MB in size, with tables imported from Google BigQuery.
I have created a new model containing just the two fact tables from that same model. It holds essentially the same rows, maybe 5-10% more data, because the master model is refreshed less frequently.
The problem is that the new model comes out at ~1 GB in size; in fact, I am less than 100 KB away from breaking the 1 GB limit for Pro users.
I have no idea how this can happen. It's the same data, and the model taking up 1 GB unquestionably contains less of it.
The 1 GB model doesn't even have measures or visualizations, while the small one has 100+ pages, several hundred measures, and many dimension tables.
I need to find a solution, otherwise I will lose the ability to upload the model once it breaks 1 GB.
Has anyone experienced something similar?
EDIT: Any tips or thoughts on how to debug this would also be welcome.
I was a bit too hasty making this post; the issue was a single column contributing over 600 MB. I identified it using VertiPaq Analyzer in Tabular Editor 3 and was able to single out the column.
Sorry for any inconvenience to anyone who might have tried looking into this.
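For anyone who lands here with the same problem: VertiPaq Analyzer (built into Tabular Editor 3 and DAX Studio) reads the engine's storage DMVs, and you can query those DMVs yourself. Below is a minimal sketch of such a query, run from DAX Studio while connected to the model; it assumes the standard Analysis Services rowset $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMNS and its DICTIONARY_SIZE column, so verify the column names against your build.

```sql
-- Rank columns by dictionary size (in bytes); a single bloated column
-- (typically high-cardinality text, GUIDs, or full datetimes)
-- usually shows up at the top of this list.
SELECT
    DIMENSION_NAME,     -- table name
    ATTRIBUTE_NAME,     -- column name
    DICTIONARY_SIZE     -- dictionary size in bytes
FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMNS
WHERE COLUMN_TYPE = 'BASIC_DATA'
ORDER BY DICTIONARY_SIZE DESC
```

Once the culprit is found, the usual remedies are to drop the column or cut its cardinality (split datetimes into separate date and time columns, round high-precision decimals, remove free-text or GUID columns), since VertiPaq's dictionary grows with the number of distinct values.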