Hi everyone,
I’m working with a source system that has a very large number of columns and tables. Due to Power BI model size and performance limitations, I cannot bring all columns into the data model.
Current challenges:
High column count (wide tables)
Large row volume
Import mode hitting memory limits
Performance degradation when loading full schema
I need most of the data for analytical flexibility, but including everything causes model size and refresh issues.
I’d appreciate guidance on:
Best practices for handling very wide source tables
Whether to:
Split fact tables?
Normalize in Power BI?
Use DirectQuery?
Move transformations upstream (SQL/Dataflow)?
Strategies for reducing model size without losing analytical capability
Experiences using Composite Models for this scenario
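To make the upstream option concrete, this is the kind of pruning I have in mind — a hypothetical SQL view (all table and column names below are placeholders, not my actual schema) that exposes only the analytically needed columns so Power BI never imports the full wide table:

```sql
-- Hypothetical upstream view: Power BI would import this instead of the wide table.
-- Only columns actually used in reports are exposed; high-cardinality text
-- columns (free-text notes, GUIDs) are left out to keep the VertiPaq model small.
CREATE VIEW dbo.vw_SalesFactSlim AS
SELECT
    s.SaleDateKey,     -- integer surrogate key, compresses well
    s.ProductKey,
    s.CustomerKey,
    s.Quantity,
    s.NetAmount
FROM dbo.SalesFactWide AS s;   -- wide source table stays in SQL
```

Is this the recommended pattern, or is it better to do the equivalent column selection in a Dataflow or in Power Query before load?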
Would love to hear architectural recommendations from people who have handled similar enterprise-scale datasets.
@PowerBI, @MicrosoftPBI