martinmaloc
New Member

Semantic model too large

Hi everyone, I would be very grateful if someone could help me with some doubts I have about uploading my BI reports to the Power BI Service. I have a Pro license, and recently I've been working on a project whose model has been getting bigger and bigger. The last time I tried to publish to the Service, I got a warning that my model is too big and that I have to upgrade my license to Premium.

 

My issue is this: I have compared my model to other models that are two or three times bigger than mine, and they don't have this problem. All my tables are imported directly from Excel sheets; one of them has 30k rows, another 15k rows, and then I have about three more tables with 5k rows each. So, is the problem importing data from Excel sheets?

 

This is my first time developing BI reports professionally. I don't have a programming or database engineering background, so I may be making many mistakes.

 

If someone has suggestions or any idea of what might be happening, I would very much appreciate the help.

1 ACCEPTED SOLUTION
edhans
Super User

Model size isn't based on the number of records but on the kind of data. A table with 2M rows and 4 columns can be smaller than a table with 100K rows and 200 columns.

 

It also depends on the data in the columns. Integers compress very well. Long text strings do not.
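As a rough illustration of how to act on that (a hedged sketch, not your actual query — the workbook path, sheet name, and column names below are all hypothetical), you can trim the model in Power Query before it ever loads: drop wide free-text columns you don't report on, and give numeric columns a whole-number type so the engine can compress them:

```powerquery
let
    // Hypothetical workbook path and sheet name
    Source = Excel.Workbook(File.Contents("C:\Data\Sales.xlsx"), null, true),
    SalesSheet = Source{[Item = "Sales", Kind = "Sheet"]}[Data],
    Promoted = Table.PromoteHeaders(SalesSheet, [PromoteAllScalars = true]),
    // Drop long free-text columns that bloat the compressed model
    Trimmed = Table.RemoveColumns(Promoted, {"Comments", "RawDescription"}),
    // Store quantities as whole numbers, which compress far better than text
    Typed = Table.TransformColumnTypes(Trimmed, {{"Quantity", Int64.Type}})
in
    Typed
```

VertiPaq Analyzer (below) will tell you which columns are actually worth removing or retyping.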

 

You need to get DAX Studio and use the VertiPaq Analyzer tool within it. https://daxstudio.org

 

The second video here on Performance Tuning will show you how to use that tool. https://daxstudio.org/docs/Videos/

 

From there you can figure out what you need to do to reduce it, or find that you cannot remove anything and upgrade to a Premium Per User license, which allows semantic models to grow to 100GB vs 1GB for Pro.



Did I answer your question? Mark my post as a solution!
Did my answers help arrive at a solution? Give it a kudos by clicking the Thumbs Up!

DAX is for Analysis. Power Query is for Data Modeling


Proud to be a Super User!

MCSA: BI Reporting


