I have a large Dataflow that saves multiple tables to the Lakehouse. While in the dev stage, some queries get stuck or take too long to finish. Whenever this happens, I duplicate the problematic queries, try different and more efficient approaches, and run the Dataflow again to compare results.
When I have to cancel a run, the run history no longer shows which tables succeeded, which were still in progress, or even which ones were supposed to be processed. It would be great if the history recorded this for cancelled runs, just as it does when a Dataflow completes successfully.