Hello,
I am writing two sets of data to two different tables in a Lakehouse: one with a Copy data activity in a pipeline, and one with a notebook. Afterwards I want to run a dataflow to tidy things up and save the result to a warehouse. If I just chain everything together in the data pipeline (copy data -> run notebook -> run dataflow), the dataflow still uses the old data from before the copy data and notebook activities were applied.
Do I have to put in a delay between making changes to a table and processing the data with a dataflow? Or is there a way to force a refresh?
thanks in advance!
Hey @Anonymous & @NandanHegde ,
thank you for your support. Honestly, I tried recreating the issue, but I was unable to do so. So I think I might have just made a mistake during setup and mistook it for a bug.
Sure thing! The notebook uses data from the copy data activity, which is why I put it after that step. The dataflow uses the output of both the copy data activity and the notebook.
I even put in a delay of 5 minutes, but my dataflow was still processing the old data.
Can you check and share the logs here?
Because in a pipeline, the next activity starts only once the copy activity or the notebook has completed.
There is no need for you to even add a wait activity.
As seen in my sample, the wait activity started only after the notebook activity completed.
So based on the start and end times in your logs, we can identify the cause of your issue.
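To illustrate the sequencing described above, a minimal Fabric/ADF-style pipeline definition could chain the activities with `dependsOn` conditions, so each step only starts after the previous one succeeds. This is a sketch: the activity names and type strings below are illustrative assumptions, not taken from the thread.

```json
{
  "name": "LoadAndTransform",
  "properties": {
    "activities": [
      {
        "name": "CopyRawData",
        "type": "Copy"
      },
      {
        "name": "RunNotebook",
        "type": "TridentNotebook",
        "dependsOn": [
          { "activity": "CopyRawData", "dependencyConditions": [ "Succeeded" ] }
        ]
      },
      {
        "name": "RunDataflow",
        "type": "RefreshDataflow",
        "dependsOn": [
          { "activity": "RunNotebook", "dependencyConditions": [ "Succeeded" ] }
        ]
      }
    ]
  }
}
```

With `"Succeeded"` as the dependency condition, the dataflow activity will not start until the notebook activity has finished successfully, so no wait activity is needed.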
Hi @PhilippM
We haven't heard from you since the last response and wanted to check whether your query has been resolved. If not, please reply with more details and we will try to help.
Thanks
Can you please share an image of your pipeline flow?
Based on the above configuration/flow, the dataflow would refresh only after both of the previous activities have succeeded/completed.