Anonymous
Not applicable

Size limitation best practices for dim tables

Hi all,

 

When building out a star schema in PBI, how small should the Dim & Fact tables be (size and rough number of rows) for the best performance?

 

How small should I keep the entire file for the best performance?

 

Thank you

1 ACCEPTED SOLUTION
v-stephen-msft
Community Support

Hi @Anonymous ,

 

As @Greg_Deckler said, the smaller the model, the better the performance tends to be, but performance does not depend on size alone.

To put it another way: as long as the model does not exceed the maximum size limit, it will run normally.

 

There is a 1 GB limit per dataset imported into Power BI unless the workspace is on a Power BI Premium capacity. See the Large datasets section of What is Power BI Premium? for details.

With import mode, there is a limit on the size of the model: your Power BI model (that is, the file) cannot exceed 1 GB. You usually have up to 10 GB of storage in your account, but each individual file must stay within 1 GB. The exception is Power BI Premium, which lets you load a model of up to 50 GB into the Power BI service; without Premium, the maximum file size is 1 GB. Remember, though, that 1 GB in the Power BI file is not equal to 1 GB of data in the source, thanks to the VertiPaq compression engine (see the compression engine section of the RADACAD article below).

 

Manage data storage in your workspaces - Power BI | Microsoft Docs

DirectQuery, Live Connection or Import Data? Tough Decision! - RADACAD

 

 

Best Regards,

Stephen Tao

 

If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.


4 REPLIES

Greg_Deckler
Super User

@Anonymous Well, the answer is kind of "as small as possible." There is a lot to unpack in that question. One thing to do is to use VertiPaq Analyzer to find and get rid of high-cardinality columns. There are other things you can do to speed things up, such as aggregation tables, depending on your model. Is this an import, DirectQuery, or composite model? This is a broad topic area.
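For a quick first pass, Power Query's Table.Profile can surface the same cardinality information. A minimal sketch, assuming a query named Sales already exists in the model (the query name and the 100,000 threshold are illustrative, not from this thread):

let
    // Profile every column of the Sales query; the output includes
    // Count, NullCount and DistinctCount per column
    Profile = Table.Profile(Sales),
    // Keep just the columns needed to judge cardinality
    Slim = Table.SelectColumns(Profile, {"Column", "Count", "DistinctCount"}),
    // Columns whose distinct count approaches the row count compress
    // poorly in VertiPaq and are candidates for removal or rounding
    HighCardinality = Table.SelectRows(Slim, each [DistinctCount] > 100000)
in
    HighCardinality

Timestamps, GUIDs, and free-text columns typically top this list; splitting a datetime into separate date and time columns is a common way to cut cardinality.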


@ me in replies or I'll lose your thread!!!
Instead of a Kudo, please vote for this idea
Become an expert!: Enterprise DNA
External Tools: MSHGQM
YouTube Channel!: Microsoft Hates Greg
Latest book!:
The Definitive Guide to Power Query (M)

DAX is easy, CALCULATE makes DAX hard...
Anonymous
Not applicable

@Greg_Deckler this will be an import model; it's kind of an interesting situation. My team has to pump out a report, but the data is messy, so the DB admin wants to push a bunch of dim tables and then let me manage the relationships. The admin is asking me how large the dim tables can be for good performance, and I'm not sure how to answer.

What sort of data are you dealing with?

I'd agree with @Greg_Deckler about VertiPaq Analyzer.

 

I'd suggest getting hold of the data and seeing how it looks. Provided they're proper dim tables and you can form one-to-many relationships to a fact table that has just the foreign keys and numeric facts on it, Power BI is very capable of handling very large datasets. The compression it achieves, along with the speed, never fails to amaze me!

Once you have the data you can always shape it further in Power Query to reduce size. (Remove unused columns, especially in your fact table, and unless you have a specific need for floating point, switch everything to Fixed Decimal Number.)
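A minimal Power Query sketch of both steps, assuming a fact query named FactSales with illustrative column names (not from this thread):

let
    // Start from the raw fact query
    Source = FactSales,
    // Keep only the foreign keys and the numeric measures the model needs
    Trimmed = Table.SelectColumns(
        Source,
        {"DateKey", "ProductKey", "CustomerKey", "SalesAmount", "Quantity"}
    ),
    // Currency.Type is Fixed Decimal Number; it is stored as a scaled
    // integer, which compresses better in VertiPaq than floating point
    Typed = Table.TransformColumnTypes(
        Trimmed,
        {{"SalesAmount", Currency.Type}}
    )
in
    Typed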



Ben Dobbs

LinkedIn | Twitter | Blog

Did I answer your question? Mark my post as a solution! This will help others on the forum!
Appreciate your Kudos!!
