Hello Community,
We currently use a vendor-created semantic model for our financial reporting. This model contains over 50 tables, including multiple fact tables and numerous dimensions like departments, accounts, vendors, and time.
While we appreciate the model's comprehensiveness, we're wondering whether this size is normal or whether it could be optimized for performance and usability. Specifically, we'd like guidance on the following:
1. Is it typical for semantic models to include 50+ tables?
2. Should we simplify the model, and if so, how?
3. What are best practices for managing and optimizing vendor-created models?
We’d love to hear your experiences or recommendations regarding vendor-designed models and how to approach optimizing them.
Thank you for your insights!
Hi @goldenarm253,
Based on my experience,
1. 50+ tables is not typically ideal for a semantic model unless many KPIs depend on all of those tables and the data modelling is done well (no complex relationships). Are you combining different business use cases in the same semantic model (SM)? If so, there is also a risk of exposing data that specific users of the model don't need, and applying row-level security (RLS) becomes harder. If the model will be used for self-service, the number of objects in the current model can overwhelm business users.
2. I would consolidate the dimensions based on your requirements. Even with 50 tables, it's unlikely that all columns are actually used. Consider a junk dimension where only one or two columns are used from a dimension table, and use a tool such as Measure Killer to see how many of the model's objects are actually being used (a DAX-based alternative is sketched after this list).
3. Ideally, the semantic model is designed around its audience and intended purpose. That then helps you decide the level of granularity you need in the SM. Take the KPIs that will be created into account, but don't build a model specific to those KPIs alone.
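If you don't have access to Measure Killer, a rough first pass is to inventory the model objects yourself. Here is a minimal sketch using the DAX INFO functions in DAX query view (available in recent Power BI Desktop builds); it only lists the tables and columns that exist in the model, it does not tell you which of them are actually used in reports.

// Minimal sketch: list every table and column in the model
// so you can review which dimensions are candidates for consolidation.
EVALUATE INFO.TABLES()

EVALUATE INFO.COLUMNS()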
Hi @goldenarm253,
Thank you for reaching out to Microsoft Fabric Community Forum.
I suggest focusing on performance and user-friendliness rather than size.
1. Is it typical for semantic models to include 50+ tables?
Yes. Large models are common in financial reporting, since they need to cover many dimensions and fact tables to support a wide range of reporting needs.
2. Should you simplify the model?
Combine less frequently used, detailed dimensions or create summary tables to streamline the model and improve efficiency.
3. Best practices for managing vendor-created models:
Identify and remove unused tables, columns, and measures that may have been included, and avoid relying on direct report-level filtering alone, as it can impact performance. Instead, implement business-logic filtering within measures to maintain efficiency (a sketch follows this list).
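As a rough illustration of filtering within a measure, here is a minimal DAX sketch; the table and column names ('Fact GL', 'Dim Account') are hypothetical placeholders, not names from your vendor model.

// Minimal sketch (hypothetical names): the business-logic filter lives
// inside the measure via CALCULATE, rather than as a separate
// report-level filter that every report author must remember to apply.
Operating Expenses =
CALCULATE (
    SUM ( 'Fact GL'[Amount] ),
    KEEPFILTERS ( 'Dim Account'[Account Type] = "Operating Expense" )
)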
If this post helps, please consider accepting it as the solution to help other members find it more quickly, and don't forget to give a "Kudos" – I'd truly appreciate it!
Regards,
Vinay Pabbu
I'd say:
- remove all the data you don't need
- denormalize dimension hierarchies into a single dimension table (avoid snowflakes, use a star schema instead; a sketch follows this list)
- fact tables: it depends on what fact granularity you need for reporting
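To make the denormalization point concrete, here is a minimal sketch of flattening a snowflaked Product -> Subcategory -> Category chain into a single dimension as a DAX calculated table. The table and column names are hypothetical and assume existing many-to-one relationships along the chain; in practice you would usually flatten this upstream in Power Query or in the source instead.

// Minimal sketch (hypothetical names): pull the snowflaked attributes
// down onto the product grain, then hide or remove the original
// Subcategory and Category tables.
Dim Product Flat =
ADDCOLUMNS (
    'Dim Product',
    "Subcategory", RELATED ( 'Dim Subcategory'[Subcategory Name] ),
    "Category", RELATED ( 'Dim Category'[Category Name] )
)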
Remove all data you don't need: tables, columns, and rows.
The number of tables is not the only thing that matters. The types of relationships and the model layout (preferably a star schema or multi-star schema) are also important for performance.
Is the semantic model's storage mode Import Mode, DirectQuery or Direct Lake?