I have a Dataflow Gen 2 (CICD) item that unexpectedly stopped refreshing successfully.
The dataflow collects data from several web sources, transforms it, and loads the data into Fabric Lakehouse Tables.
Reviewing the refresh logs, I can see that the refresh failure appears to have been caused by invalid or missing credentials when the dataflow attempts to write data to the Lakehouse tables:
| Dataflow refresh status | Action | Start time | End time | Duration | Engine | Refresh status | Status details |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Failed | XXX_WriteToDataDestination | 2025-07-16T05:00:19.5439294+00:00 | 2025-07-16T05:01:03.8577795+00:00 | 0:00:44 | NA | Failed | There was a problem refreshing the dataflow: 'Data source credentials are missing or invalid. Please update the connection credentials in settings, and try again.'. Error code: 999999. (Request ID: xxxxxxx). |
I have tried:
- Updating the credentials on the existing dataflow (both editing the existing connection and creating a new one)
- Re-creating the Dataflow Gen2, copying all of the queries over, and recreating the destinations
- Updating the destinations to point at completely new tables within the Lakehouse
So far each attempt results in the Dataflow successfully previewing and saving in the builder, but failing the refresh with the same error. After changing the destination table, I did notice that the dataflow successfully creates the new tables, but they do not get populated with any data.
Some extra details:
- I have the Admin role inside of the workspace that contains the Lakehouse and Dataflow Gen2 items
- I am the Owner of both the destination Lakehouse as well as the Dataflow Gen2
Are there any other troubleshooting/diagnostic steps I can try or has anyone else had this issue and been able to resolve it?
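In the meantime, one isolation test I can think of (rough sketch only; the table and column names below are made up) is a query with no external sources at all, just a constant table, mapped to the same Lakehouse destination in the builder. If that refreshes cleanly, the destination connection and its credentials are presumably fine and the problem is somewhere upstream:

```
// Hypothetical sanity-check query: no external sources, just a constant table.
// The Lakehouse destination itself is configured in the dataflow UI, not in M.
// If this refreshes successfully with the same destination attached, the write
// path and its credentials are likely fine and the failure is upstream.
let
    Source = #table(
        type table [Id = Int64.Type, CheckedAt = DateTime.Type],
        {{1, DateTime.LocalNow()}}
    )
in
    Source
```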
Hi @dgeebs ,
Disabling “Enable staging” was a good tip, but in my case, the real issue came down to broken permissions and metadata bindings after deployment pipeline promotion.
After using pipelines, always verify Lakehouse permissions and rebind Dataflow destinations, even if everything looks fine.
Regards,
Akhil.
Hi @dgeebs ,
I hope the response was helpful in resolving your issue. If you have any further questions, please don’t hesitate to let us know; we’ll be happy to assist.
Regards,
Akhil.
Hi @dgeebs ,
Great to hear you tracked down the root cause and got it working. Thanks for sharing the detailed steps; they’ll definitely help others running into the same issue. Just curious: after reapplying the permissions and rebinding, have things stayed stable across subsequent pipeline promotions?
Regards,
Akhil.
Thanks @v-agajavelly for your help and support! Apologies for disappearing for a while; I was finally able to determine what the issue was with my dataflow.
In my case, I had a dataflow which used a function to fetch data via a REST API for each row in a source table. Although the preview was succeeding in the Power Query UI, after further digging I found that one of the data source APIs had become inaccessible due to broken credentials. Once I removed the row from the source table that was failing to collect its data from its respective data source API, the dataflow succeeded.
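To illustrate, the per-row pattern looked roughly like the sketch below (the query, function, and column names are placeholders, not my real sources). Because Power Query evaluates lazily and the preview only touches a sample of rows, the broken call only ran during the full refresh, at the point where the table gets written to the destination:

```
// Simplified sketch of the per-row fetch pattern (placeholder names and URLs).
// GetApiData is invoked once per row of SourceTable; the calls are lazy, so
// they only fully execute when the table is materialized for the Lakehouse
// write, which is why the failure surfaced on the WriteToDataDestination step.
let
    GetApiData = (endpoint as text) =>
        Json.Document(Web.Contents(endpoint)),

    Source = SourceTable,  // placeholder reference to the driving table

    WithApiData = Table.AddColumn(
        Source,
        "ApiResponse",
        each GetApiData([Endpoint])
    )
in
    WithApiData
```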
What I (and fellow readers) can take from this is that even if the error message below shows up on the XXX_WriteToDataDestination step:
There was a problem refreshing the dataflow: 'Data source credentials are missing or invalid. Please update the connection credentials in settings, and try again.'.
It may not indicate that the error lies specifically in writing your data to your destination Lakehouse; the issue may be with credentials for fetching data from an external data source somewhere along the way.
Make sure to check every step of your dataflow for errors and you'll likely find where your connection issues lie.
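If it helps anyone hitting the same thing, one way to surface failing rows during authoring (again just a sketch, reusing the placeholder names above, not my actual query) is to capture each per-row call with try and keep only the rows that errored. Errors raised at the credential layer can still fail the whole refresh, but this kind of check can make the bad source much easier to spot:

```
// Diagnostic variant (sketch): try without otherwise returns a record with
// HasError / Value / Error instead of hard-failing, so the rows whose API
// call broke can be filtered out and inspected.
let
    WithResult = Table.AddColumn(
        SourceTable,
        "ApiResult",
        each try Json.Document(Web.Contents([Endpoint]))
    ),
    FailedRows = Table.SelectRows(WithResult, each [ApiResult][HasError])
in
    FailedRows
```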
Hi @dgeebs,
If your destination is a Lakehouse, could you please check if "Enable staging" is checked for the queries in your Dataflow Gen2 CI/CD?
Kindly uncheck staging for all queries and try refreshing again.
Let me know if that resolves the issue.
Hi @dgeebs ,
I faced the same issue where my Dataflow Gen2 would preview and save correctly, but fail on refresh with a "missing or invalid credentials" error at the WriteToDataDestination step.
Here’s what finally worked for me.
Regards,
Akhil.
Still no change. Weirdly enough, a dataflow from another workspace is able to write to the target table in the Lakehouse, but the Dataflow in the same workspace cannot.
For additional context, I tried setting up a deployment pipeline, and both the Dataflow Gen2 and the Lakehouse were promoted through the Dev, Test, and Prod stages. What I'm finding is that the Dataflow in the Dev-stage workspace can write to the table in the Prod workspace, but the same Dataflow in the Prod-stage workspace cannot write to the table in the Prod workspace.
Is there a chance the Lakehouse permissions got messed up when the Lakehouse item was moved through the deployment pipeline? (I am Admin in all 3 workspaces that are a part of the pipeline)
Hi!
Is your Dataflow leveraging a Gateway by any chance? If yes, could you share which version of the Gateway you are using?
Hi miguel! No gateway is being used. Thanks!