Hi All,
In my current project, we pull data into Power BI from Snowflake. I currently connect through a DSN with the ODBC connector because I rely on custom queries, and the dataset uses Import mode. The data volume is very large: in the staging environment alone it has already crossed 23M records, so the file size keeps growing, and refresh times are making us worry about data availability. I looked into incremental refresh and found that it is currently not supported for the Power BI and Snowflake combination. If anyone has come across this scenario, please suggest a better approach for data refresh.
Thanks,
Rajesh S Hegde
@hegdecisco86 , see if this helps: https://stackoverflow.com/questions/60960462/power-bi-incremental-refresh-with-snowflake
or the DAX append approach:
https://blog.crossjoin.co.uk/2020/04/13/keep-the-existing-data-in-your-power-bi-dataset-and-add-new-...
https://www.thebiccountant.com/2017/01/11/incremental-load-in-powerbi-using-dax-union/
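For reference, the usual incremental-refresh pattern in Power Query M filters a datetime column with the reserved `RangeStart`/`RangeEnd` parameters, and it generally requires a query that folds to the source (which custom ODBC SQL does not, but the native Snowflake connector can). A minimal sketch, where the account, warehouse, database, table, and `LOAD_DATE` column names are all assumptions to be replaced with your own:

```m
// RangeStart/RangeEnd must be created as DateTime parameters in
// Power BI Desktop before incremental refresh can be configured.
let
    // Assumed connection details -- substitute your own values
    Source = Snowflake.Databases("myaccount.snowflakecomputing.com", "MY_WAREHOUSE"),
    SalesDb = Source{[Name = "SALES_DB"]}[Data],
    SalesSchema = SalesDb{[Name = "PUBLIC"]}[Data],
    SalesTable = SalesSchema{[Name = "FACT_SALES"]}[Data],
    // Filter on a datetime column; because the native connector folds,
    // only rows in [RangeStart, RangeEnd) are fetched per partition.
    Filtered = Table.SelectRows(
        SalesTable,
        each [LOAD_DATE] >= RangeStart and [LOAD_DATE] < RangeEnd
    )
in
    Filtered
```

After publishing, you would still need to define the incremental refresh policy on the table in Power BI Desktop (archive window and refresh window). If switching away from the ODBC/custom-query setup is an option, this avoids reloading all 23M records on every refresh.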