
ashok_yadav
Helper I

Data Model Optimization

Hi Champs,

 

I have a heavy Power BI file, around 2.5 GB in size. The reason it is this big is that I have imported around 118M records.

 

Problem Statement: The refresh has started failing on the service, and I am trying to reduce the number of records to improve refresh time.

 

Questions for Optimization:

1. Does having 118M records have an impact? Also, technically, what affects report load performance more: rows or columns? What if I reduce the number of rows and increase the number of columns? How will this impact report load performance?

2. Scenario: I have imported more than 100 columns and am deleting some of them in the query editor. I have also planned to apply some aggregations to reduce the number of records. Will this help improve the data refresh time? Currently the refresh times out after running for more than 5 hours (the maximum on a Premium capacity workspace).

3. What else can be done to improve the report refresh time?

 

Any help would be appreciated, since this has become a showstopper for me.

 

Regards,

Ashok Yadav

 

 

1 ACCEPTED SOLUTION
HotChilli
Super User

What's the data source?

Are the fact tables at the appropriate level of aggregation, i.e. do you need individual transaction details, or can they be summarized?
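If transaction-level detail isn't needed in the visuals, the summarization can be done in Power Query before the data is loaded. A minimal sketch, assuming a hypothetical SQL Server source and a `Sales` table (server, database, table, and column names are placeholders, not from the original post):

```
// Hedged sketch: pre-aggregate a hypothetical Sales table to one row per
// date/product instead of one row per transaction.
let
    Source = Sql.Database("myserver", "mydb"),              // assumed SQL source
    Sales  = Source{[Schema = "dbo", Item = "Sales"]}[Data],
    // Keep only the columns the report actually uses
    Trimmed = Table.SelectColumns(Sales, {"OrderDate", "ProductKey", "Amount"}),
    // Summarize: this collapses the row count dramatically
    Aggregated = Table.Group(
        Trimmed,
        {"OrderDate", "ProductKey"},
        {{"TotalAmount", each List.Sum([Amount]), type number}}
    )
in
    Aggregated
```

When the steps fold, the GROUP BY runs on the source database, so both the transferred row count and refresh time drop.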

Push as much processing as you can to the source system.

Is Query Folding working?

Are you using incremental refresh?
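If not, incremental refresh only reprocesses recent partitions instead of all 118M rows on every refresh. It requires a filter on the reserved `RangeStart`/`RangeEnd` datetime parameters; a minimal Power Query sketch (source and column names are placeholders):

```
// Hedged sketch: the RangeStart/RangeEnd filter incremental refresh requires.
// RangeStart and RangeEnd must be defined as datetime parameters in the model.
let
    Source   = Sql.Database("myserver", "mydb"),
    Sales    = Source{[Schema = "dbo", Item = "Sales"]}[Data],
    // Filter on the parameters; this step should fold to the source
    Filtered = Table.SelectRows(
        Sales,
        each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd
    )
in
    Filtered
```

The refresh policy itself (e.g. "archive 5 years, refresh last 10 days") is then configured on the table in Power BI Desktop.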

Switch off the auto date/time setting.

Power BI's compression works well on columns with low cardinality.

Don't import columns you don't need.

Try to remove text fields from the fact tables. Use appropriate data types, e.g. don't use decimal when whole number will do.
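For the data-type point, types can be set explicitly in Power Query rather than left to detection. A minimal sketch (column names are placeholders):

```
// Hedged sketch: assign compact types explicitly.
Table.TransformColumnTypes(
    Sales,
    {
        {"Quantity", Int64.Type},       // whole number compresses better
        {"Amount",   Currency.Type}     // fixed decimal instead of floating point
    }
)
```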

Use VertiPaq Analyzer.

 

good luck


2 REPLIES
HotChilli
Super User

(Reply marked as the accepted solution above.)

Hi,

I have a data model around 700 MB in size. I am trying to optimize it, but since I can't install DAX Studio or VertiPaq Analyzer, I don't know which columns or tables take up the most space. Can you share some best practices for optimizing the data model?
