Handling Large Source Tables with Many Columns – Model Size Limitation in Power BI

Hi everyone,

I’m working with a source system that has a very large number of tables, many of them very wide. Due to Power BI model size and performance limitations, I cannot bring all columns into the data model.

Current challenges:

  • High column count (wide tables)

  • Large row volume

  • Import mode hitting memory limits

  • Performance degradation when loading full schema

I need most of the data for analytical flexibility, but including everything causes model size and refresh issues.

I’d appreciate guidance on:

  1. Best practice for handling very wide source tables

  2. Whether to:

    • Split fact tables?

    • Normalize in Power BI?

    • Use DirectQuery?

    • Move transformations upstream (SQL/Dataflow)?

  3. Strategies for reducing model size without losing analytical capability

  4. Experiences using Composite Models for this scenario
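
For context, the kind of column pruning I’ve been experimenting with in Power Query looks like the sketch below — server, database, table, and column names are all placeholders, not my real schema:

```m
// Minimal sketch: prune columns as early as possible so the step can fold
// back to the source as a narrower SELECT (query folding).
// All names below are hypothetical placeholders.
let
    Source = Sql.Database("myserver", "mydb"),
    Fact   = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    // Keep only the columns actually needed for reporting
    Pruned = Table.SelectColumns(
        Fact,
        {"OrderDate", "CustomerKey", "ProductKey", "SalesAmount"}
    )
in
    Pruned
```

My understanding is that selecting columns immediately after the navigation step lets the query fold, so only those columns are retrieved at refresh — but I’m not sure this alone is enough at this scale, hence the questions above.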

I’d love to hear architectural recommendations from people who have handled similar enterprise-scale datasets.

@PowerBI @MicrosoftPBI

Status: New