Are there recommended Power BI settings for optimizing refresh performance when importing large datasets from Snowflake?
Hi @manoj_0911, if your question is about improving refresh performance in general, you can check the suggestions below.
Keep Power Query as simple as possible and let Snowflake do the heavy work:
- Use views or optimized SQL queries in Snowflake
- Avoid complex transformations in Power Query

For large tables, this is usually the biggest performance improvement.
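As an illustration, a Power Query (M) query that hands the heavy lifting to Snowflake could look like the sketch below. The account, warehouse, database, and view names are placeholders, not real objects.

```powerquery-m
let
    // Connect to Snowflake (account and warehouse are placeholders)
    Source = Snowflake.Databases("myaccount.snowflakecomputing.com", "MY_WH"),
    Database = Source{[Name = "SALES_DB", Kind = "Database"]}[Data],
    // Run a native query against a pre-built view so Snowflake does
    // the joins and aggregations instead of Power Query
    Sales = Value.NativeQuery(
        Database,
        "SELECT order_id, order_date, amount FROM analytics.v_sales_summary",
        null,
        [EnableFolding = true]
    )
in
    Sales
```

`[EnableFolding = true]` lets later Power Query steps keep folding on top of the native query instead of forcing local processing.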
Configure incremental refresh using the RangeStart and RangeEnd datetime parameters, so each refresh processes only recent partitions instead of the full history.
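A minimal sketch of the filter step incremental refresh needs, with placeholder database, schema, and table names:

```powerquery-m
let
    Source = Snowflake.Databases("myaccount.snowflakecomputing.com", "MY_WH"),
    // Navigation simplified; names below are placeholders
    Orders = Source{[Name = "SALES_DB", Kind = "Database"]}[Data]
        {[Name = "ANALYTICS", Kind = "Schema"]}[Data]
        {[Name = "ORDERS", Kind = "Table"]}[Data],
    // RangeStart/RangeEnd are the datetime parameters Power BI binds
    // at refresh time to load one partition at a time; this filter
    // folds to a WHERE clause in Snowflake
    Filtered = Table.SelectRows(
        Orders,
        each [ORDER_DATE] >= RangeStart and [ORDER_DATE] < RangeEnd
    )
in
    Filtered
```

Use `>=` on one boundary and `<` on the other so no row is duplicated across partitions.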
Power BI should push filters and transformations back to Snowflake (query folding). In Power Query, right-click a step and choose View Native Query to confirm the step still folds. If folding breaks early, Power BI may pull the full dataset locally before processing.
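A common way folding breaks is inserting a locally computed step too early. A hypothetical example (placeholder names again):

```powerquery-m
let
    Source = Snowflake.Databases("myaccount.snowflakecomputing.com", "MY_WH"),
    Orders = Source{[Name = "SALES_DB", Kind = "Database"]}[Data]
        {[Name = "ANALYTICS", Kind = "Schema"]}[Data]
        {[Name = "ORDERS", Kind = "Table"]}[Data],
    // Folds: this filter becomes a WHERE clause in Snowflake
    Kept = Table.SelectRows(Orders, each [AMOUNT] > 0),
    // Breaks folding: index columns are computed locally, so every
    // step after this one runs in Power Query, not in Snowflake
    Indexed = Table.AddIndexColumn(Kept, "RowIndex", 1, 1),
    // This filter now runs locally on the full result of the previous step
    Recent = Table.SelectRows(Indexed, each [ORDER_DATE] >= #date(2024, 1, 1))
in
    Recent
```

If you need a folding-breaking step, move it as late in the query as possible so the filters above it still run in Snowflake.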
Only import what you actually need:
- Remove unused columns
- Filter out historical data you do not report on
- Avoid importing high-cardinality text columns when unnecessary

Reducing data size improves both refresh time and model compression.
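The column and row reduction above can be sketched in M as follows; the table and column names are placeholders:

```powerquery-m
let
    Source = Snowflake.Databases("myaccount.snowflakecomputing.com", "MY_WH"),
    Orders = Source{[Name = "SALES_DB", Kind = "Database"]}[Data]
        {[Name = "ANALYTICS", Kind = "Schema"]}[Data]
        {[Name = "ORDERS", Kind = "Table"]}[Data],
    // Keep only the columns the report actually uses
    Slim = Table.SelectColumns(Orders, {"ORDER_ID", "ORDER_DATE", "AMOUNT"}),
    // Drop history the report does not need; both steps fold to Snowflake
    Recent = Table.SelectRows(Slim, each [ORDER_DATE] >= #date(2023, 1, 1))
in
    Recent
```

Because both steps fold, Snowflake returns only the reduced result set, which also compresses better in the VertiPaq model.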
Large models with many relationships can slow processing:
- Prefer a star schema
- Avoid complex many-to-many relationships
- Use surrogate keys when possible
Hi @manoj_0911 , Hope you are doing well. Kindly let us know if the issue has been resolved or if further assistance is needed. Your input could be helpful to others in the community.
Hi @manoj_0911,
Thank you for posting your query in the Microsoft Fabric Community Forum, and thanks to @cengizhanarslan & @Natarajan_M for sharing valuable insights.
Could you please confirm if your query has been resolved by the provided solutions? This would be helpful for other members who may encounter similar issues.
Thank you for being part of the Microsoft Fabric Community.