Hi everyone,
I’m working with a source system that has a very large number of columns and tables. Due to Power BI model size and performance limitations, I cannot bring all columns into the data model.
Current challenges:
High column count (wide tables)
Large row volume
Import mode hitting memory limits
Performance degradation when loading full schema
I need most of the data for analytical flexibility, but including everything causes model size and refresh issues.
I’d appreciate guidance on:
Best practice for handling very wide source tables
Whether to:
Split fact tables?
Normalize in Power BI?
Use DirectQuery?
Move transformations upstream (SQL/Dataflow)?
Strategies for reducing model size without losing analytical capability
Experiences using Composite Models for this scenario
Would love to hear architectural recommendations from anyone who has handled similar enterprise-scale datasets.
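For concreteness, the kind of upstream pruning I have in mind for the "move transformations upstream" option is a thin SQL view over the wide source table that exposes only the columns the model actually needs. This is just a sketch — `dbo.FactSales_Wide`, the view name, and the column names are placeholders for my real schema:

```sql
-- Hypothetical example: dbo.FactSales_Wide and its columns stand in
-- for the actual wide fact table. The view keeps only the surrogate
-- keys and additive measures the model needs, so Power BI imports a
-- narrow table instead of the full schema.
CREATE VIEW dbo.vw_FactSales_Model AS
SELECT
    DateKey,      -- key to the Date dimension
    ProductKey,   -- key to the Product dimension
    CustomerKey,  -- key to the Customer dimension
    SalesAmount,  -- additive measure
    Quantity      -- additive measure
FROM dbo.FactSales_Wide;
```

Pointing Import (or DirectQuery) at views like this would keep the pruning logic in the database, so the model never loads columns it doesn't use — but I'd like to know whether this is the recommended pattern versus doing the column removal in Power Query or a Dataflow.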
@PowerBI, @MicrosoftPBI