
Reply
martinmaloc
New Member

Semantic model too large

Hi everyone, I would be very grateful if someone could help me with some doubts I have about uploading my BI reports to the Power BI Service. I have a Pro licence, and recently I've been working on a project that has made my model grow bigger and bigger. The last time I tried to publish to the service, I got a warning that my model is too large and that I have to upgrade my licence to Premium.

 

My issue is this: I have compared my model to other models that are two or three times bigger than mine, and they don't have this problem. All my tables are imported directly from Excel sheets; one has 30k rows, another 15k rows, and then I have about three more tables with 5k rows each. So, is the problem importing data from Excel sheets?

 

This is my first time developing BI reports professionally; I don't have a programming or database engineering background, so I might be making many mistakes.

 

If anyone has suggestions or an idea of what might be happening, I would very much appreciate the help.

1 ACCEPTED SOLUTION
edhans
Super User

Model size isn't based on the number of records but on the kind of data. A table with 2M rows and 4 columns can be smaller than a table with 100K rows and 200 columns.

 

It also depends on the data in the columns. Integers compress very well; long text strings do not.

 

You need to get DAX Studio and use the VertiPaq Analyzer tool within it. https://daxstudio.org

 

The second video here on Performance Tuning will show you how to use that tool. https://daxstudio.org/docs/Videos/

 

From there you can figure out what to do to reduce the model's size, or determine that you cannot remove anything and upgrade to a Premium Per User license, which allows semantic models to grow to 100GB vs 1GB for Pro.



Did I answer your question? Mark my post as a solution!
Did my answers help arrive at a solution? Give it a kudos by clicking the Thumbs Up!

DAX is for Analysis. Power Query is for Data Modeling


Proud to be a Super User!

MCSA: BI Reporting

