I have a large Dataflow that saves multiple tables to the Lakehouse. During the dev stage, some queries get stuck or take too long to finish. Whenever this happens, I duplicate the problematic queries, try different, more efficient approaches, and rerun the Dataflow to compare results.
When I have to cancel a run, the run history no longer shows which tables succeeded, which were still in progress at the time of cancellation, or even the full set of tables that were supposed to be processed. It would be great if the run history captured this information for cancelled runs, just as it does when the Dataflow completes successfully.