martinmaloc
New Member

Semantic model too large

Hi everyone, I would be very grateful if someone could help me with some doubts I have about uploading my BI reports to the Power BI Service. I have a Pro license, and I've recently been working on a project that has made my model grow bigger and bigger. The last time I tried to publish to the Service, I got a warning that my model is too large and that I have to upgrade my license to Premium.

 

My issue is this: I have compared my model with other models that are two or three times bigger than mine, and they don't have this problem. All my tables are imported directly from Excel sheets; one has 30k rows, another 15k rows, and then I have about three more tables with 5k rows each. So, is the problem importing data from Excel sheets?

 

This is my first time developing BI reports as a job. I don't have a programming or database engineering background, so I might be making many mistakes.

 

If anyone has suggestions or an idea of what might be happening, I would very much appreciate the help.

1 ACCEPTED SOLUTION
edhans
Community Champion

Model size isn't based on the number of records but on the kind of data. A table with 2M rows and 4 columns can be smaller than a table with 100K rows and 200 columns.

 

It also depends on the data in the columns. Integers compress very well. Long text strings do not.
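You can see the same effect outside Power BI with a quick Python sketch (the row count and string contents here are invented for illustration, and plain Python objects are only a rough stand-in for how the VertiPaq engine stores columns): a column of small integers costs a fixed few bytes per value, while a column of descriptive text strings costs far more per value, even with the same number of rows.

```python
import sys

# Illustration only: compare the in-memory cost of 100k integers
# vs 100k longer text strings held as plain Python objects.
n = 100_000
int_bytes = sum(sys.getsizeof(i) for i in range(n))
text_bytes = sum(sys.getsizeof(f"long descriptive comment number {i}") for i in range(n))

print(f"integers: {int_bytes:,} bytes")
print(f"strings:  {text_bytes:,} bytes")
# The string "column" comes out several times larger than the integer
# one despite identical row counts -- column content, not row count,
# drives size.
```

The takeaway matches the advice above: trimming wide text columns (or reducing their cardinality) usually saves far more space than cutting rows.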

 

You need to get DAX Studio and use the VertiPaq Analyzer tool within it. https://daxstudio.org

 

The second video here on Performance Tuning will show you how to use that tool. https://daxstudio.org/docs/Videos/

 

From there you can figure out what you need to do to reduce the size, or determine that you cannot remove anything and need to upgrade to a Premium Per User license, which allows semantic models to grow to 100 GB vs 1 GB for Pro.



Did I answer your question? Mark my post as a solution!
Did my answers help arrive at a solution? Give it a kudos by clicking the Thumbs Up!

DAX is for Analysis. Power Query is for Data Modeling


Proud to be a Super User!

MCSA: BI Reporting

