
Anonymous
Not applicable

Practicalities of Working with Large Datasets

Hi guys, just reaching out to see what people's experience is with larger PBIX models.

We have a file of c. 750 MB which we use as a master dataset. Currently we have two layers: a report PBIX connected to a dataset PBIX.
The problem we're finding is how slow the dataset PBIX is to work with.
One consideration is to introduce a third layer:
  • Report PBIX (visuals, plus any local measures)
  • Datamodel PBIX (containing the measures)
  • Dataset PBIX (without measures)
So essentially we would develop the queries and relationships in the Dataset, the measures in the Datamodel, and the visuals (plus any local measures) in the Report.

It does sound complex, but wondered if anyone else has any tips for improving performance.

1 REPLY
Anonymous
Not applicable

  • DAX works much better with narrow, long tables than with short, wide ones.
  • Work on reducing the cardinality of your columns.
    • Store date and time in separate columns.
  • Use integers and keys wherever possible.
  • Try to model a star schema if you can.
  • When writing measures, filter your dimension tables, not your fact tables.
  • Remove any columns that you do not need, or that would be better calculated as measures.
    • The exception is complex measures, which may be better pushed into a calculated column.
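To illustrate the date/time split, here is a minimal DAX sketch. The column names (Sales[OrderDateTime] etc.) are made up for illustration; in practice this split is usually better done in Power Query or at the source, before load, so the combined high-cardinality column never reaches the model:

```dax
-- Hypothetical calculated columns on a Sales table.
-- A combined datetime column has (days × distinct times) unique values;
-- split apart, the engine only stores days + distinct times.
Order Date = INT ( Sales[OrderDateTime] )   -- whole-day serial; format as Date

Order Time =
    Sales[OrderDateTime] - INT ( Sales[OrderDateTime] )   -- time fraction; format as Time
```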
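And a sketch of the "filter dimensions, not facts" tip, assuming a hypothetical Sales fact related to a Product dimension on a product key:

```dax
-- Slower pattern: FILTER iterates every row of the (large) fact table.
Bike Sales (slow) =
CALCULATE (
    SUM ( Sales[Amount] ),
    FILTER ( Sales, RELATED ( Product[Category] ) = "Bikes" )
)

-- Faster pattern: filter the (small) dimension column; the filter
-- propagates to the fact table through the relationship.
Bike Sales =
CALCULATE (
    SUM ( Sales[Amount] ),
    Product[Category] = "Bikes"
)
```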
