ArchStanton
Impactful Individual

PBIX file size

Hi,

 

I inherited a .pbix file three years ago that was 13 GB in size; it is now 16 GB.

It has grown due to many additional calculated columns and an ever-growing data table, which has gone from 39,000 rows in 2022 to 45,000 in November 2024.

Is this something I should be concerned about?

 

The pbix file is directly connected to a live Dynamics database.

 

Thanks

1 ACCEPTED SOLUTION
Anonymous
Not applicable

Hi @ArchStanton,

 

Is 13 GB the size of the local .pbix file, or the model's memory footprint in Power BI? A 16 GB file would exceed the size limit for publishing directly. From the rest of your description, though, the amount of data doesn't seem that large: in my experience Power BI handles a 45,000-row table easily, and a model of that volume should be nowhere near 10 GB. 🤔


So as a first step, I think you should evaluate the model's overhead with an external tool:
Analyze Model - SQLBI Docs
Videos | DAX Studio

Then optimize the model based on what you find.
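
If it helps, here is a minimal sketch of that first step. It assumes a model recent enough to support the DAX INFO functions and can be run from DAX query view or DAX Studio; the column names come from the DISCOVER_STORAGE_TABLE_COLUMNS DMV that INFO.STORAGETABLECOLUMNS() exposes, so verify them against your own output.

// List the most expensive column dictionaries first. Dictionary size is
// only one part of a column's total cost, so treat this as a first pass.
EVALUATE
SELECTCOLUMNS (
    INFO.STORAGETABLECOLUMNS (),
    "Table", [DIMENSION_NAME],
    "Column", [ATTRIBUTE_NAME],
    "Dictionary bytes", [DICTIONARY_SIZE]
)
ORDER BY [Dictionary bytes] DESC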


Data reduction techniques for Import modeling - Power BI | Microsoft Learn
Optimization guide for Power BI - Power BI | Microsoft Learn

For some calculated columns and calculated tables, try to build them in the data source or upstream where you can; this will reduce the memory cost in Power BI.
Others that may be helpful:
Optimize DAX queries in Power BI (Top 21 Ways) | Edu4Sure

 

Best Regards,
Gao

Community Support Team

 

If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!

How to get your questions answered quickly --  How to provide sample data in the Power BI Forum


10 REPLIES

Thank you for your help

Thank you so much for this information. The 16 GB is the actual .pbix file size.

 

[Screenshot: file properties showing a size of 16,521 KB]

I'm not actually having any trouble publishing this to the shared workspace, which is good, despite there being 40 queries in the data model. Some of my measures are quite slow (3-4 seconds), but it doesn't matter much, as the model only refreshes once a day at midnight.

 


As I mentioned originally, I inherited this data model; it's extremely complicated, and I dare not try to pull it apart and rebuild it, as that is way above my pay grade! I will, however, look at your recommendations and try stripping out unused columns, rows, etc.

Anonymous
Not applicable

Hi @ArchStanton,

 

16,521 KB ÷ 1,024 ≈ 16.13 MB (not GB), so the file is nowhere near the publishing limit.

From your description, it sounds like some of the measures are not performing well; you could start a new thread to discuss how to optimize the DAX behind those measures in your case.
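
As a taste of what such a thread would cover, here is one common rewrite, sketched with a hypothetical Sales table (the table, column, and measure names are illustrative, not from your model):

// Slow pattern: FILTER over the whole table forces row-by-row work
// in the formula engine.
EMEA Amount (slow) :=
CALCULATE (
    SUM ( Sales[Amount] ),
    FILTER ( Sales, Sales[Region] = "EMEA" )
)

// Usually faster: filter a single column, keeping existing filters,
// so the storage engine can do the heavy lifting.
EMEA Amount (fast) :=
CALCULATE (
    SUM ( Sales[Amount] ),
    KEEPFILTERS ( Sales[Region] = "EMEA" )
)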

 

As a next step, you may find these useful:

Use tools to optimize Power BI performance - Training | Microsoft Learn

Optimization - SQLBI

 

Best Regards,
Gao

Community Support Team

 

If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!

How to get your questions answered quickly --  How to provide sample data in the Power BI Forum

Thank you for your help!

Jai-Rathinavel
Super User

Hi @ArchStanton, model size can impact the performance of your reports in the long run.

1. Are you actually displaying the last 3 years of data in the current report? If not, you can apply a filter at the query or database level to pull only the relevant years' data, which will help reduce the size of the model.

2. Try reducing the number of calculated columns by moving the relevant logic into measures; a minimal sketch follows this list.

3. Instead of having a lot of calculated tables and data tables, try to create tables at the Power Query level. You can also migrate your current data table to CSV storage for maximum file compression.

4. Analyze your model's column temperatures using INFO.STORAGETABLECOLUMNS() to check which columns are actually in use, so you can drop unnecessary columns and reduce the model size.

5. Disable the auto date/time (time intelligence) option to see a noticeable size reduction in the current model.
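
On point 2, a minimal sketch of the swap, using a hypothetical Orders table (all names are illustrative only):

// Calculated column: evaluated and stored for every row of the table.
// Line Total = Orders[Qty] * Orders[UnitPrice]

// Equivalent measure: evaluated at query time, adds nothing to storage.
Line Total :=
SUMX ( Orders, Orders[Qty] * Orders[UnitPrice] )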




Did I answer your question? Mark my post as a solution!

Proud to be a Super User!





Thank you for your help!

Hi,
Thanks for your response. The option most suitable for me is option 4 (using INFO.STORAGETABLECOLUMNS()).

I've never heard of this and don't know where to start. Can you give me some basic advice on what it is and how to apply it, please?

@ArchStanton, enable the setting below for your semantic model and follow the linked blog. The blog focuses on Direct Lake models, but it applies to large-format semantic models too.

 

[Screenshot: the large semantic model storage format setting]

https://data-marc.com/2023/09/28/understanding-data-temperature-with-direct-lake-in-fabric/
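
Once that setting is enabled and the model has been queried for a while, a query along these lines lists the coldest columns. This is a sketch: it assumes the DAX INFO functions are available, and the column names come from the underlying DISCOVER_STORAGE_TABLE_COLUMNS DMV, so check them against your own output.

// Coldest (least-used) columns first: candidates for removal.
EVALUATE
SELECTCOLUMNS (
    FILTER (
        INFO.STORAGETABLECOLUMNS (),
        [COLUMN_TYPE] = "BASIC_DATA"  // skip internal row-number columns
    ),
    "Table", [DIMENSION_NAME],
    "Column", [ATTRIBUTE_NAME],
    "Temperature", [DICTIONARY_TEMPERATURE],
    "Last accessed", [DICTIONARY_LAST_ACCESSED]
)
ORDER BY [Temperature] ASC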

 

Did I answer your question? If yes, please mark this post as a solution.

 

Thanks,

Jai




Did I answer your question? Mark my post as a solution!

Proud to be a Super User!





Thanks, I'll try this at some point this week and let you know if it worked.
