I have a huge Dataflow that saves multiple tables to the Lakehouse. In the dev stage, some tables get stuck or take too long to finish. Whenever this happens, I duplicate the problematic queries, try different, more efficient approaches, and run again to compare results.
When I have to cancel a run, the history no longer shows which tables succeeded or were still in progress at the time of cancellation, nor even the full list of tables that were supposed to be processed. It would be great to have this information in the run history for cancelled runs, similar to what is shown when the dataflow completes successfully.