Hi everyone,
I’m working with a source system that has a very large number of columns and tables. Due to Power BI model size and performance limitations, I cannot bring all columns into the data model.
Current challenges:
High column count (wide tables)
Large row volume
Import mode hitting memory limits
Performance degradation when loading full schema
I need most of the data for analytical flexibility, but including everything causes model size and refresh issues.
I’d appreciate guidance on:
Best practice for handling very wide source tables
Whether to:
Split fact tables?
Normalize in Power BI?
Use DirectQuery?
Move transformations upstream (SQL/Dataflow)?
Strategies for reducing model size without losing analytical capability (see the column-pruning sketch after this list for the kind of thing I mean)
Experiences using Composite Models for this scenario
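For context, here is a minimal sketch of the column pruning I have in mind, done in Power Query M before the data reaches the model. The server, database, table, and column names are placeholders, not my real schema:

let
    // Connect to the source; placeholder server/database names
    Source = Sql.Database("myserver", "SalesDW"),
    FactSales = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    // Keep only the columns the reports actually need, so the step
    // folds back to the source and unused columns never reach the model
    Pruned = Table.SelectColumns(
        FactSales,
        {"OrderDateKey", "CustomerKey", "ProductKey", "SalesAmount", "Quantity"}
    )
in
    Pruned

Whether this kind of pruning is better done in Power Query or pushed upstream into a SQL view or Dataflow is part of what I'm asking about.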
Would love to hear architectural recommendations from people who have handled similar enterprise-scale datasets.
@PowerBI, @MicrosoftPBI