Anonymous
Not applicable

Practicalities of Working with Large Datasets

Hi guys, just reaching out to see what people's experience is with larger PBIX models.

We have a file of c. 750 MB that we use as a master dataset. So currently we have two layers: a report PBIX connected to a dataset PBIX.
The problem we're finding is how slow the dataset PBIX is to work with.
One consideration is to add a third layer:
  • Report PBIX
  • Datamodel PBIX (containing measures)
  • Dataset PBIX (without measures)
So essentially we would develop our queries and relationships in the Dataset, the measures in the Datamodel, and the visuals (plus any local measures) in the Report.

It does sound complex, but I wondered if anyone else has any tips for improving performance.

1 REPLY
Anonymous
Not applicable

  • DAX works much better with narrow, long tables than with short, wide ones.
  • Work on reducing the cardinality of your columns:
    • Store date and time in separate columns.
  • Use integers and keys whenever possible.
  • Try to create a star schema if possible.
  • When writing measures, filter your dimension tables, not your fact tables.
  • Remove any columns that you do not need, or that would be better calculated using measures.
    • The exception is complex measures, which may be better pushed into a calculated column.
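As a sketch of the date/time split above (the 'Events' table and its columns are hypothetical), you could do it with DAX calculated columns, though pushing the split into Power Query or the source is even better, because the original datetime column can then be removed entirely:

```dax
-- 'Events'[EventTimestamp] is a hypothetical datetime column.
-- INT() truncates the datetime serial number to its date part.
EventDate = INT ( Events[EventTimestamp] )

-- Subtracting the date part leaves only the time-of-day fraction.
EventTime = Events[EventTimestamp] - INT ( Events[EventTimestamp] )
```

The cardinality win is large: a year of per-second timestamps is up to roughly 31.5 million distinct values, while the split columns have at most 366 distinct dates and 86,400 distinct times, which compresses far better in VertiPaq.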
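To illustrate the dimension-filter point, here is a minimal sketch assuming a star schema with a large 'Sales' fact table related to a small 'Product' dimension (all table, column, and measure names are hypothetical):

```dax
-- Preferred: the filter is applied to the small dimension table and
-- flows to the fact table through the relationship.
Red Sales =
CALCULATE (
    SUM ( Sales[Amount] ),
    Product[Color] = "Red"
)

-- Avoid: filtering a column on the large fact table directly, which
-- makes the engine work against the fact table's own (wider, higher
-- cardinality) column instead of the dimension.
Red Sales Fact Filter =
CALCULATE (
    SUM ( Sales[Amount] ),
    Sales[ProductColor] = "Red"
)
```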
