Hi All,
In my current project, we are getting data into Power BI from Snowflake. Currently I use the DSN method with the ODBC connector, because I need custom queries, and the connection is in Import mode. The data size is very large: in the stage environment alone it has already crossed 23M records, so the file size keeps growing. At the same time, the long data refresh is making us worry about data availability. I looked into incremental refresh and found that it is apparently not supported for the Power BI and Snowflake (ODBC) combination. Please suggest a better approach for data refresh if anyone has come across this kind of scenario.
Thanks,
Rajesh S Hegde
@Anonymous , refer to this in case it helps: https://stackoverflow.com/questions/60960462/power-bi-incremental-refresh-with-snowflake
or the DAX append approach:
https://blog.crossjoin.co.uk/2020/04/13/keep-the-existing-data-in-your-power-bi-dataset-and-add-new-data-to-it-using-incremental-refresh/
https://www.thebiccountant.com/2017/01/11/incremental-load-in-powerbi-using-dax-union/
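If you can switch from the ODBC DSN to the native Snowflake connector, standard incremental refresh becomes an option: define the `RangeStart`/`RangeEnd` datetime parameters Power BI requires and filter on a date column so the filter can fold back to Snowflake. A minimal sketch in Power Query M, assuming a hypothetical `ORDERS` table with a `LOAD_DATE` timestamp column (server, warehouse, database, and schema names below are placeholders):

```
// Sketch of an incremental-refresh-ready query using the native
// Snowflake connector. RangeStart and RangeEnd are the datetime
// parameters Power BI requires for incremental refresh.
let
    Source = Snowflake.Databases("myaccount.snowflakecomputing.com", "MY_WAREHOUSE"),
    Db = Source{[Name = "SALES_DB"]}[Data],
    Schema = Db{[Name = "PUBLIC"]}[Data],
    Orders = Schema{[Name = "ORDERS"]}[Data],
    // Filter on the datetime column so Power BI can partition the data;
    // with the native connector this filter should fold to Snowflake,
    // so each refresh only pulls the new partition's rows.
    Filtered = Table.SelectRows(
        Orders,
        each [LOAD_DATE] >= RangeStart and [LOAD_DATE] < RangeEnd
    )
in
    Filtered
```

Note that this relies on query folding; with a hand-written custom query behind ODBC the filter may not fold, which is why the links above fall back to the DAX/append workarounds.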