Hi Community Members,
We are currently handling a single semantic model close to 5GB, containing only D365 CRM data. Once the ERP data is integrated, this dataset might grow to 10GB or more, considering it will contain 10 years of historical data.
We are going to build reports on this dataset. Is there any way we can handle it more efficiently in Power BI?
Thanks in advance!
Hi @Deepak_ ,
With large semantic models enabled, this size is not a problem per se: https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-large-models
Nevertheless, you can apply best practices to keep your dataset small and fast. What is possible and reasonable for your dataset depends on what you have already optimized and what your requirements are.
Besides the official recommendations (https://learn.microsoft.com/en-us/power-bi/guidance/import-modeling-data-reduction), I'd like to add a few techniques, for example removing columns no report uses and replacing text or GUID relationship columns with integer keys; see the sketch below.
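For illustration, here is a minimal pandas sketch of those data-reduction ideas. The file and column names are made up, and in practice you would apply the same steps in Power Query so the data is reduced before it ever reaches the model:

```python
import pandas as pd

# Hypothetical D365 CRM extract -- names are for illustration only.
df = pd.read_csv("crm_opportunities.csv")

# 1. Keep only the columns your reports actually need.
df = df[["OpportunityId", "AccountKey", "Amount", "CloseDate", "Status"]]

# 2. Filter history to the range users genuinely analyse.
df = df[df["CloseDate"] >= "2016-01-01"]

# 3. Reduce cardinality: truncate timestamps to dates, round amounts.
df["CloseDate"] = pd.to_datetime(df["CloseDate"]).dt.date
df["Amount"] = df["Amount"].round(2)

# 4. Replace a wide text/GUID key with an integer surrogate key,
#    which compresses much better in the column store.
df["AccountKey"] = df["AccountKey"].astype("category").cat.codes
```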
BR
Martin
Thanks @Martin_D! This helps.
My client wants to keep all of their data in one dataset, so the data size keeps growing. The reports use a live connection, so if I perform some data cleaning, for example removing columns or rows that are not used in a given report, other reports may be affected. So I want to look for any possible workaround.
You can use this tool to analyse which columns all reports in the tenant actually use: https://en.brunner.bi/measurekiller
Local pbix files that users have not published are not captured. But even with existing reports in place, you can ask your users about their requirements and which columns they actually use. Of course, capturing all columns used once you have made everything available is harder than growing the dataset more restrictively from the beginning.
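If you want a quick inventory of the reports in your tenant as a starting point, a minimal sketch against the Power BI admin REST API could look like the following. It assumes you have already acquired an Azure AD access token with admin API permissions, and note that it only lists reports and their datasets, not column-level usage, which is exactly the gap Measure Killer fills:

```python
import requests

# Assumption: ACCESS_TOKEN was obtained separately (e.g. via MSAL) and
# carries Power BI admin API permissions.
ACCESS_TOKEN = "<your Azure AD access token>"

resp = requests.get(
    "https://api.powerbi.com/v1.0/myorg/admin/reports",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    params={"$top": 5000},  # $top/$skip paging omitted for brevity
    timeout=30,
)
resp.raise_for_status()

for report in resp.json()["value"]:
    print(report["name"], "-> dataset", report.get("datasetId"))
```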
Some of the concepts above can also be applied to existing datasets without affecting existing reports, like using integer-type relationship columns; see the sketch below.
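As a rough illustration of why integer relationship columns help, this sketch compares the in-memory size of a GUID-style text key with an integer surrogate key. Pandas memory is not VertiPaq memory, but the direction of the effect is the same:

```python
import numpy as np
import pandas as pd

n = 1_000_000

# A GUID-like text key versus an integer surrogate key.
text_keys = pd.Series([f"{i:032x}" for i in range(n)])
int_keys = pd.Series(np.arange(n, dtype="int64"))

print(f"text key: {text_keys.memory_usage(deep=True) / 1e6:.0f} MB")
print(f"int  key: {int_keys.memory_usage(deep=True) / 1e6:.0f} MB")
```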