Vind1989
Helper III

Size limitation best practices for dim tables

Hi all,

 

When building out a star schema in Power BI, how small should the dimension and fact tables be (size and rough number of rows) for the best performance?

 

How small should I keep the entire file for the best performance?

 

Thank you

1 ACCEPTED SOLUTION
v-stephen-msft
Community Support

Hi @Vind1989 ,

 

As @Greg_Deckler said, the smaller the model, the better the performance, but performance does not depend on size alone.

To put it another way: as long as the model does not exceed the maximum size limit, it will run normally.

 

There is a 1 GB limit per dataset that is imported into Power BI, unless the workspace is in a Power BI Premium capacity. See the Large datasets section of What is Power BI Premium? for details. 

With import mode, there is a limit on the size of the model: your Power BI model (or let's say the file) cannot be more than 1 GB. You usually have up to 10 GB of storage in your account, but each file must stay under 1 GB. The exception is Power BI Premium, which allows you to load models of up to 50 GB in the Power BI service; without Premium, the maximum file size is 1 GB. However, remember that 1 GB in the Power BI file is not equal to 1 GB of data in the source (as mentioned in the compression engine section of the article below).

 

Manage data storage in your workspaces - Power BI | Microsoft Docs

DirectQuery, Live Connection or Import Data? Tough Decision! - RADACAD

 

 

Best Regards,

Stephen Tao

 

If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.


4 REPLIES

Greg_Deckler
Super User

@Vind1989 Well, the answer is kind of as small as possible. There is a lot to unpack in that question. One thing to do is to use the Vertipaq Analyzer and find and get rid of high-cardinality columns. There are other things you can do to speed things up such as Aggregation Tables depending on your model. Is this an import, DirectQuery or composite model? This is a broad topic area.
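For illustration, here is a minimal Power Query (M) sketch of the kind of cardinality reduction the Vertipaq Analyzer typically points to: splitting a high-cardinality datetime column into separate date and time columns, which compress much better. The source, table, and column names (FactSales, OrderDateTime) are hypothetical placeholders.

let
    // Placeholder source; substitute your own connection
    Source = Sql.Database("myserver", "mydb"),
    FactSales = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    // A pure date column has at most a few thousand distinct values, so it compresses well
    AddDate = Table.AddColumn(FactSales, "OrderDate", each DateTime.Date([OrderDateTime]), type date),
    // Keep the time part only if reporting actually needs it
    AddTime = Table.AddColumn(AddDate, "OrderTime", each DateTime.Time([OrderDateTime]), type time),
    // Drop the original high-cardinality datetime column
    RemoveDateTime = Table.RemoveColumns(AddTime, {"OrderDateTime"})
in
    RemoveDateTime

Whether a change like this helps depends on the model; the column size report in Vertipaq Analyzer is what tells you where the space is actually going.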


@ me in replies or I'll lose your thread!!!
Instead of a Kudo, please vote for this idea
Become an expert!: Enterprise DNA
External Tools: MSHGQM
YouTube Channel!: Microsoft Hates Greg
Latest book!:
Mastering Power BI 2nd Edition

DAX is easy, CALCULATE makes DAX hard...

@Greg_Deckler this will be an import model; it's kind of an interesting situation. My team has to pump out a report, but the data is messy, so the DB admin wants to push a bunch of dim tables and then let me manage the relationships. The admin is asking me how large the dim tables can be while keeping good performance, and I am not sure how to answer.

What sort of data are you dealing with?

I'd agree with @Greg_Deckler about Vertipaq Analyzer.

 

I'd suggest getting hold of the data and seeing how it looks. Provided they're proper dim tables and you can form one-to-many relationships to a fact table which just has the foreign keys and numeric facts on it, Power BI is very capable of handling very large datasets. The compression it achieves, along with the speed, never fails to amaze me!

Once you have the data you can always shape it further in Power Query to reduce size: remove unused columns, especially in your fact table, and unless you have a specific need for floating point, switch everything to fixed decimals.
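As a minimal sketch of that kind of shaping (my own illustration, assuming a SQL source and a hypothetical FactSales table with these column names):

let
    Source = Sql.Database("myserver", "mydb"),
    FactSales = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    // Keep only the foreign keys and the numeric facts the report actually needs
    KeepKeysAndFacts = Table.SelectColumns(FactSales, {"DateKey", "ProductKey", "CustomerKey", "Quantity", "SalesAmount"}),
    // Fixed decimal (Currency.Type) usually compresses better than floating point
    SetTypes = Table.TransformColumnTypes(KeepKeysAndFacts, {{"SalesAmount", Currency.Type}, {"Quantity", Int64.Type}})
in
    SetTypes

Removing columns in Power Query rather than hiding them in the model means they are never loaded at all, so they add nothing to the file size.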



Ben Dobbs

LinkedIn | Twitter | Blog

Did I answer your question? Mark my post as a solution! This will help others on the forum!
Appreciate your Kudos!!
