I have a pipeline in my dev-processing workspace which uses a Dataflow to load data from my bronze to silver lakehouse.
The pipeline fails at the dataflow with the following error:
Refresh Dataflow failed with status: NotFound, Failure reason:
The dataflow is parametrised, and I have checked the pipeline run history to confirm that the correct parameters are being fed into the dataflow. The error makes it sound like the dataflow doesn't exist, but it definitely exists in the correct workspace.
I came across a similar error when deploying pipelines, where you had to open each dataflow and publish it before it would work. I tried this (opened it, added a full stop to the description, and saved) but it didn't fix the issue.
Any ideas on why this error occurs? I can't find any documentation on it.
Hi @samkikibaker ,
We’ve hit this exact “Refresh Dataflow failed with status: NotFound” error before, and even though the message suggests the dataflow doesn’t exist, the problem was actually somewhere else. In our case, the pipeline was calling the dataflow before the dataflow reference was fully published or synced inside the workspace. Even though the dataflow existed visually, Fabric sometimes loses the internal reference after edits, renames, or when parameters are added.
What finally fixed it for us was simply opening the dataflow, letting it load completely, and then hitting save/publish again — but the key difference was that we did this after switching into the exact environment the pipeline runs in. Once the dataflow was re-published inside that workspace context, the pipeline was able to find it again.
Another thing we noticed is that if the dataflow is parametrized, even a small mismatch like a space, casing difference, or an older cached version can cause Fabric to say “NotFound”, even if the name is correct. When we re-published the dataflow and then re-opened the pipeline once, the pipeline started picking up the correct reference.
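If you want to rule out that kind of parameter-name mismatch systematically, a quick comparison script can help. This is just a sketch; the parameter names below are made-up examples, not values from any real workspace:

```python
# Quick check for parameter-name mismatches between what a pipeline
# sends and what a dataflow declares. The names in the example call
# are hypothetical.

def find_mismatches(pipeline_params, dataflow_params):
    """Report pipeline parameter names that don't exactly match a
    declared dataflow parameter, flagging near-misses caused by
    casing differences or stray whitespace."""
    declared = {name.strip().lower(): name for name in dataflow_params}
    mismatches = []
    for name in pipeline_params:
        if name in dataflow_params:
            continue  # exact match, nothing to report
        near = declared.get(name.strip().lower())
        if near:
            mismatches.append(
                f"'{name}' nearly matches declared '{near}' "
                "(casing/whitespace differs)"
            )
        else:
            mismatches.append(f"'{name}' has no declared counterpart")
    return mismatches

# Example: the pipeline sends 'SourceTable ' (trailing space) and 'layer'
# while the dataflow declares 'SourceTable' and 'Layer'.
print(find_mismatches(["SourceTable ", "layer"], ["SourceTable", "Layer"]))
```

Paste the parameter names from the pipeline run history and from the dataflow definition into the two lists; anything the script flags is a candidate for the silent NotFound-style failure described above.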
It’s definitely not a permissions issue or a location issue — it’s more of a “Fabric hasn’t refreshed the internal link to this dataflow” type of scenario. After re-saving the dataflow and letting the workspace update, our pipeline refresh calls worked fine again.
If you’ve already opened and saved it once, try doing it one more time, but make sure you’re in the exact workspace/environment the pipeline runs in. That usually forces Fabric to rebuild the reference and clears the NotFound error.
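One extra check that helped us distinguish "stale reference" from "genuinely missing": ask Fabric directly whether the dataflow resolves in that workspace. Below is a minimal sketch using the Fabric REST API's list-items endpoint; the endpoint path, item `type` value, and IDs are assumptions on my side, so verify them against the official Fabric REST API reference before relying on this:

```python
# Sketch: confirm the dataflow the pipeline references actually resolves
# in the target workspace via the Fabric REST API. Endpoint path and
# field names are assumptions -- check the official API reference.
import json
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def items_url(workspace_id: str) -> str:
    """Build the list-items URL for a workspace."""
    return f"{FABRIC_API}/workspaces/{workspace_id}/items"

def find_dataflow(workspace_id: str, dataflow_name: str, token: str):
    """Return the matching Dataflow item dict, or None if Fabric
    doesn't report it (the same condition the pipeline's NotFound
    error suggests)."""
    req = urllib.request.Request(
        items_url(workspace_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        items = json.load(resp).get("value", [])
    return next(
        (i for i in items
         if i.get("type") == "Dataflow"
         and i.get("displayName") == dataflow_name),
        None,
    )
```

If the dataflow shows up in the item list but the pipeline still reports NotFound, that points at a stale internal reference rather than a genuinely missing item, which is exactly the scenario the re-publish trick fixes.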
Hope this helps — happy to share more if you want to compare setups.
Gopi Krishna
Hi @samkikibaker ,
Thank you for reaching out to the Microsoft Community Forum.
Hi @jisaac , @Ugk161610 , Thank you for your prompt responses.
Hi @samkikibaker , Could you please try the proposed solutions shared by @jisaac and @Ugk161610 ? Let us know if you're still facing the same issue; we'll be happy to assist you further.
Regards,
Dinesh
Hi @samkikibaker ,
We haven't heard back from you on the last response and were just checking in to see if you have a resolution yet. If you have any further queries, do let us know.
Regards,
Dinesh
@samkikibaker I had the same issue recently, but with a Notebook instead of a Dataflow. Question: does the Dataflow run successfully when you run it by itself rather than through the Pipeline? For me, the Notebook still failed even when I ran it manually. This is probably not a solution, but the only way I was able to fix it after an hour or so was to copy everything over to a new Notebook, delete the old one, and replace it with the new one in the Pipeline schedule. Try recreating your Dataflow and see if the old one is just in some weird corrupted state. (Hint: use the advanced editor to copy all the steps at once.)