I recently migrated our Power BI transformations into a SQL view, expecting this to reduce the PBIX file size and improve performance. Previously, our Power BI query (with transformations in the Advanced Editor) produced a PBIX file of approximately 24,486 KB. After switching to the SQL view, the PBIX file size increased to 38,227 KB.
This outcome is counterintuitive, as offloading transformations to SQL typically reduces file size by minimizing Power BI’s internal transformation logic.
Requests:
Why did the PBIX file size increase despite moving the transformations to SQL?
Is there any way to compare the size of individual columns?
Any recommended diagnostics?
Appreciate your guidance on this. Thank you.
Solved! Go to Solution.
Hello @mjoseph111,
It’s not only about where transformations happen, but also what data Power BI stores in the in-memory model (VertiPaq).
When you switched to a SQL view, you probably changed how much data is sent to Power BI, or how it is typed.
More columns returned
Your SQL view may now expose more fields than the Power Query transformations originally kept.
Even if those extra columns aren’t used in visuals, Power BI still imports them (and compresses them).
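For illustration, a minimal sketch of a trimmed view (the object and column names here are invented, not from your report):

-- Return only the columns the report actually uses, and filter rows upstream
CREATE OR ALTER VIEW dbo.vw_SalesForPBI AS
SELECT
    s.SaleID,
    s.SaleDate,
    s.CustomerID,
    s.Amount
FROM dbo.Sales AS s
WHERE s.SaleDate >= '2020-01-01';
-- Avoid SELECT *: every extra column is imported and stored by VertiPaq, used or not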
Less compression due to data types
Power BI’s internal VertiPaq engine is highly optimized for compression — especially on text columns converted to categorical or integer keys.
SQL views often return raw text, decimals, or datetime types that reduce compression efficiency.
Example: converting a country code from integer → varchar(50) can easily double storage.
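As a rough sketch (assuming hypothetical CountryCode, OrderAmount and OrderDate columns), keeping types narrow in the view helps compression:

-- Prefer narrow, low-cardinality types over wide text or full datetime values
SELECT
    CAST(c.CountryCode AS INT)           AS CountryCode,   -- instead of VARCHAR(50)
    CAST(o.OrderAmount AS DECIMAL(10,2)) AS OrderAmount,   -- fixed precision instead of FLOAT
    CAST(o.OrderDate   AS DATE)          AS OrderDate      -- drop the time portion if it is not needed
FROM dbo.Orders  AS o
JOIN dbo.Country AS c ON c.CountryID = o.CountryID;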
No query folding or reduced folding efficiency
When transformations were done in Power Query, they might have folded efficiently to SQL.
But if the SQL view now returns a pre-aggregated or denormalized dataset, Power BI has to load more data (fewer filters applied upstream).
Materialized or pre-calculated columns
If the SQL view computes derived columns (like CASE WHENs or string concatenations), those columns are often high-cardinality and compress poorly.
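A small illustrative sketch (customer and segment names are made up): returning a compact code or the raw parts usually compresses better than a concatenated label:

-- A concatenation like this produces near-unique strings that compress poorly:
--   CustomerName + ' - ' + CAST(CustomerID AS VARCHAR(10)) AS CustomerLabel
-- Prefer returning the parts (or a low-cardinality code) and combining them in the report:
SELECT
    c.CustomerID,
    c.CustomerName,
    CASE WHEN c.Segment = 'R' THEN 'Retail' ELSE 'Wholesale' END AS SegmentName   -- few distinct values, compresses well
FROM dbo.Customer AS c;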
Data refresh settings
If you previously disabled "Enable load" for certain tables but re-imported them via the new SQL view, you may now be storing more tables in the PBIX file.
Encoding / locale changes
SQL collation and Power BI text encoding differences can slightly inflate model size, especially with Unicode strings (nvarchar).
Use DAX Studio → View Metrics
Sort by Column size descending to see which fields grew most.
Note the Data type and Distinct count; high-cardinality columns are the main offenders.
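If you prefer the numbers as a query rather than the View Metrics grid, DAX Studio can also run the storage DMVs that VertiPaq Analyzer reads. A sketch of one such query (DMV and column names as I recall them, worth verifying in your DAX Studio version):

-- Run from DAX Studio against the open model (DMV query, not DAX)
SELECT
    DIMENSION_NAME,    -- table
    ATTRIBUTE_NAME,    -- column
    DICTIONARY_SIZE    -- dictionary size in bytes; segment sizes come from DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS
FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMNS
WHERE COLUMN_TYPE = 'BASIC_DATA'
ORDER BY DICTIONARY_SIZE DESC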
Profile SQL vs Power Query outputs
Run both your old Power Query logic and the SQL view as queries in SSMS or Power BI.
Compare:
Number of columns
Row count
Column data types
Distinct values per column
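For the distinct-value comparison, a quick SSMS sketch (column names are placeholders for your own fields):

-- Run against both the old query's source and the new view, then compare the counts
SELECT
    COUNT(*)                     AS RowCnt,
    COUNT(DISTINCT CustomerID)   AS DistinctCustomerID,
    COUNT(DISTINCT OrderDate)    AS DistinctOrderDate,
    COUNT(DISTINCT ProductLabel) AS DistinctProductLabel
FROM dbo.vw_SalesForPBI;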
Check Power Query “Enable load”
Ensure you’re only loading necessary tables into the data model.
Disable “Enable load” for staging or intermediate queries.
Evaluate import vs DirectQuery
If data volume is high, consider using DirectQuery or hybrid tables to avoid fully importing large datasets.
Run the “Manage Columns” audit
Keep only the columns used in visuals, relationships, and measures.
Every unused column adds to VertiPaq storage.
Docs:
- https://daxstudio.org/docs/features/model-metrics/
Hope it can help you!
Best regards,
Antoine
My recommendations are
Optimize SQL View Output – Ensure the SQL view only returns the required columns and filtered rows. Avoid unnecessary joins or calculated fields that increase distinct values and reduce compression.
Check Data Types – Align SQL column data types with Power BI-compatible types (e.g., use INT, NUMERIC, or short text fields). Mismatched or wide data types can inflate PBIX size.
Run VertiPaq Analyzer – Use DAX Studio → View Metrics to identify large tables or high-cardinality columns, then remove or optimize them to reduce model size and improve performance.
This will help you determine the actual cause of the size increase.
Hi @mjoseph111,
We would like to confirm whether our community members' answers resolve your query or if you need further help. If you still have any questions or need more support, please feel free to let us know. We are happy to help you.
@Murtaza_Ghafoor & @AntoineW, thanks for your prompt responses.
Thank you for your patience and look forward to hearing from you.
Best Regards,
Prashanth Are
MS Fabric community support
Hi @AntoineW, thanks for your response. I’m confident it’s not due to the reasons you mentioned, but I’ll double-check using DAX Studio. Appreciate your help!