Hi all!
I have a problem with refreshing my dataset.
I use Pro licenses with incremental refresh, because I hit the gateway limit.
When I want to change my dataset (add another column, etc.), I publish a new version from Desktop and it refreshes all the data, and while that refresh is running, none of my reports show any data until it finishes.
The last time, I got a timeout error because the refresh took more than 2 hours.
Is there a way to publish the dataset and keep the existing data (i.e., not do a full refresh from scratch)?
Thanks
Hey @sapirmarko ,
unfortunately, the answer to your question is simple: no, you can't!
Adding a column to a table with partitions (partitions are created by incremental refresh) invalidates the existing partitions, so after adding a column an "initial" full load has to be performed.
The only workaround I can think of is to create a view inside your data source (this of course requires a data source that supports views). Instead of accessing the table, access the view; then you can adapt the view step by step until all the data has been loaded after a schema change.
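To make the view indirection concrete, here is a minimal sketch using SQLite as a stand-in for the data source; all table, view, and column names are hypothetical. The dataset would point at the view, so you control when the new column becomes visible by recreating the view, without touching the underlying table:

```python
import sqlite3

# Hypothetical source table; order_date would drive incremental refresh.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (order_date TEXT, amount REAL, region TEXT)")
con.execute("INSERT INTO sales VALUES ('2024-01-01', 100.0, 'EU')")

# The dataset queries the view, not the table, so the view defines the schema
# the dataset sees. Initially it exposes only the original columns.
con.execute("CREATE VIEW sales_v AS SELECT order_date, amount FROM sales")

# Later, when you are ready to take the schema change, recreate the view to
# expose the new column. The base table is never altered in this step.
con.execute("DROP VIEW sales_v")
con.execute(
    "CREATE VIEW sales_v AS SELECT order_date, amount, region FROM sales"
)

rows = con.execute("SELECT * FROM sales_v").fetchall()
print(rows)  # → [('2024-01-01', 100.0, 'EU')]
```

Note that exposing the new column through the view still changes the dataset's schema, so the affected partitions will need to reload; the view just lets you stage when that happens.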
Hopefully this gives you an idea of how to tackle your challenge.
Regards,
Tom