Hi Champs,
I have a heavy Power BI file, around 2.5 GB in size. The file is this large because I have imported around 118M records.
Problem Statement: The refresh has started failing on the service, and I am trying to reduce the number of records to improve the data refresh time.
Questions for optimization:
1. Does having 118M records have an impact? Technically, what affects report load time more, rows or columns? If I reduce the number of rows but increase the number of columns, how will that affect report load performance?
2. Scenario: I have imported more than 100 columns and am deleting some of them in the query editor. I have also planned to apply some aggregations to reduce the number of records. Will this help improve the data refresh time? Currently the refresh times out, as it runs for more than 5 hours (the maximum on a Premium capacity workspace).
3. What else can be done to improve the refresh time?
Any help would be appreciated, since this has become a showstopper for me.
Regards,
Ashok Yadav
- What's the data source?
- Are the fact tables at the appropriate level of aggregation, i.e. do you need individual transaction details, or can they be summarized?
- Push as much processing as you can to the source system.
- Is query folding working?
- Are you using incremental refresh?
- Switch off the Auto date/time setting.
- Power BI's compression works well on columns with low cardinality.
- Don't import columns you don't need.
- Try to remove text fields from the fact tables. Use appropriate data types, e.g. don't use Decimal when Whole Number will do.
- Use VertiPaq Analyzer.

Good luck
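To make several of these points concrete, here is a minimal Power Query (M) sketch that keeps the query foldable against a SQL source: it filters the fact table by the built-in incremental refresh parameters RangeStart/RangeEnd, imports only the needed columns, and fixes data types. The server, database, table, and column names (FactSales, OrderDate, Amount, etc.) are assumptions for illustration, not from the original post.

```m
let
    // Assumed SQL source; with folding, the filtering and column
    // selection below run on the server, not in the mashup engine
    Source = Sql.Database("myserver", "SalesDB"),
    Fact = Source{[Schema = "dbo", Item = "FactSales"]}[Data],

    // Incremental refresh: RangeStart and RangeEnd are the datetime
    // parameters Power BI supplies at refresh time
    Filtered = Table.SelectRows(
        Fact,
        each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd
    ),

    // Import only the columns the model actually needs
    Pruned = Table.SelectColumns(
        Filtered,
        {"OrderDate", "CustomerKey", "ProductKey", "Quantity", "Amount"}
    ),

    // Prefer whole numbers over decimals where precision allows
    Typed = Table.TransformColumnTypes(
        Pruned,
        {{"Quantity", Int64.Type}, {"CustomerKey", Int64.Type}}
    )
in
    Typed
```

After each step, right-click it in the query editor and check that "View Native Query" is still available; if a step breaks folding, move it as late as possible or push that logic into a view on the source.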
Hi,
I have a scenario where my data model is around 700 MB. I am trying to optimize it, but since I can't install DAX Studio or VertiPaq Analyzer, I don't know which columns or tables take up the most space. Can you help me with some best practices for optimizing the data model?
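If you can't install DAX Studio, you can still get per-column statistics from inside Power BI Desktop. A minimal sketch using the built-in COLUMNSTATISTICS() DAX function, which you can run from the DAX query view; high-cardinality columns compress poorly and are usually the biggest consumers of model memory, so sort the result by the Cardinality column to find candidates to remove, round, or split:

```dax
// Returns one row per column in the model, including its
// cardinality (distinct value count); high-cardinality text
// and datetime columns are typical space hogs
EVALUATE
COLUMNSTATISTICS()
```

This gives cardinality rather than exact bytes, but in a VertiPaq model the two are closely correlated, so it is usually enough to decide where to start trimming.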