Hi Experts!
I have a new model I have been building, and it keeps ballooning in size. It is nearly 100 MB, and I have no idea why: it holds only 10M rows of data. I have another model with about 4M rows of data that is only 10 MB. So why the order-of-magnitude difference in size even though the fact tables aren't an order of magnitude apart?
How does one determine which table(s) are causing the size to balloon, and what can one do to reduce the footprint short of pruning the data being loaded? Note: I am using Import mode, not DirectQuery.
Thanks!
Hi @WishAskedSooner, a few things you can do to reduce the size:
1) Remove unwanted columns, and avoid creating calculated columns unless necessary.
2) Check the transformation steps in Power Query and make sure the steps are query folded.
3) Disable the MDX attribute hierarchies (the IsAvailableInMDX column property) from Tabular Editor if they're not required; this also reduces storage.
Thanks for the feedback. All good thoughts. It's been a while since I used DAX Studio mainly because I never really needed it until now.
DAX Studio and VertiPaq Analyzer. These will give you the size and cardinality of each table, column, etc.
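To see why cardinality is the number VertiPaq Analyzer surfaces: VertiPaq dictionary-encodes each column, so a column's footprint tracks its distinct-value count more than its row count. Here is a deliberately naive back-of-envelope sketch of that effect (the formula is illustrative only; the real engine adds run-length encoding and other compression on top):

```python
import math

def estimate_column_bytes(num_rows, cardinality, avg_value_bytes=8):
    """Rough, illustrative estimate of a dictionary-encoded column's size.

    Not VertiPaq's actual algorithm; the point is only that size grows
    with cardinality, not with row count alone.
    """
    dictionary = cardinality * avg_value_bytes           # one entry per distinct value
    bits_per_row = max(1, math.ceil(math.log2(max(cardinality, 2))))
    indexes = num_rows * bits_per_row / 8                # bit-packed row-to-dictionary indexes
    return int(dictionary + indexes)

# A 10M-row column where every value is distinct (e.g. a transaction GUID)
high = estimate_column_bytes(10_000_000, 10_000_000)
# The same 10M rows holding only 365 distinct dates
low = estimate_column_bytes(10_000_000, 365)
print(f"unique-per-row: {high:,} bytes vs 365 distinct values: {low:,} bytes")
# Even in this crude model, the high-cardinality column is roughly 10x larger.
```

This is why two models with similar row counts can differ wildly in size: one distinct-per-row column can cost more than the rest of the fact table combined.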
Things that normally increase a semantic model's footprint:
- High-cardinality columns (GUIDs, free text, timestamps stored down to the second), since VertiPaq dictionary-encodes every column.
- Calculated columns, which compress less effectively than columns loaded from the source.
- Auto date/time tables, which Power BI Desktop creates for every date column by default.
- Columns imported but never used in any visual, measure, or relationship.
I would consider using the Measure Killer tool to identify elements in your model that can be removed.
https://www.brunner.bi/measurekiller-analytics
https://www.brunner.bi/measurekiller