Reply
martinmaloc
New Member

Semantic model too large

Hi everyone, I would be very grateful if someone could help me with some doubts I have when uploading my BI reports to the Power BI Service. I have a Pro license, and recently I've been working on a project that has been making my model bigger and bigger. The last time I tried to publish to the Service, I got a warning that my model is too big and that I have to upgrade my license to Premium.

 

My issue is this: I have compared my model to other models that are 2 or 3 times bigger than mine, and they don't have this problem. In fact, all my tables are imported directly from Excel sheets; one of them has 30k rows, another 15k rows, and then I have about 3 more tables with 5k rows each. So, is the problem importing data from Excel sheets?

 

This is my first time developing BI reports as a job. I don't have a programming or database engineering background, so I might be making many mistakes.

 

If someone has suggestions or an idea of what might be happening, I would very much appreciate the help.

1 ACCEPTED SOLUTION
edhans
Super User

Model size isn't based on the number of records but the kind of data. A table with 2M rows and 4 columns can be smaller than a table with 100K records and 200 columns.

 

It also depends on the data in the columns. Integers store very well. Long LONG text strings do not.
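For example, since your tables come straight from Excel, a minimal Power Query sketch like the one below keeps a long free-text column out of the model and stores the numbers as whole numbers. The file path, the "Sales" sheet, and the "Comments" and "Quantity" columns are made-up placeholders, not names from your model.

let
    // Load one sheet from the Excel workbook (path and sheet name are placeholders)
    Source = Excel.Workbook(File.Contents("C:\Data\Sales.xlsx"), null, true),
    SalesSheet = Source{[Item = "Sales", Kind = "Sheet"]}[Data],
    Promoted = Table.PromoteHeaders(SalesSheet, [PromoteAllScalars = true]),
    // Drop the long free-text column - long strings compress poorly and bloat the model
    NoComments = Table.RemoveColumns(Promoted, {"Comments"}),
    // Store the quantity as a whole number - integers compress very well
    Typed = Table.TransformColumnTypes(NoComments, {{"Quantity", Int64.Type}})
in
    Typed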

 

You need to get DAX Studio and use the VertiPaq Analyzer tool within it. https://daxstudio.org

 

The second video here on Performance Tuning will show you how to use that tool. https://daxstudio.org/docs/Videos/

 

From there you can figure out what you need to do to reduce it, or find that you cannot remove anything and need to upgrade to a Premium Per User license, which allows semantic models to grow to 100GB vs 1GB for Pro.
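If the Analyzer shows a few columns taking most of the space, reducing their cardinality is one common way to shrink the model. A hedged Power Query sketch, again with placeholder query and column names: rounding a high-precision amount and keeping only the date part of a timestamp means far fewer distinct values for VertiPaq to store.

let
    // "Sales" is a placeholder for an existing query in your model
    Source = Sales,
    // Round amounts to 2 decimals so fewer distinct values need to be stored
    Rounded = Table.TransformColumns(Source, {{"Amount", each Number.Round(_, 2), type number}}),
    // Keep only the date part - otherwise every distinct timestamp is a distinct value
    DateOnly = Table.TransformColumns(Rounded, {{"OrderDate", DateTime.Date, type date}})
in
    DateOnly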



Did I answer your question? Mark my post as a solution!
Did my answers help arrive at a solution? Give it a kudos by clicking the Thumbs Up!

DAX is for Analysis. Power Query is for Data Modeling


Proud to be a Super User!

MCSA: BI Reporting

