I have a Dataflow Gen2 (CI/CD) item that unexpectedly stopped refreshing successfully.
The dataflow collects data from several web sources, transforms it, and loads the data into Fabric Lakehouse Tables.
Reviewing the refresh logs, I can see that the refresh failure appears to have been caused by invalid or missing credentials when the dataflow attempts to write data to the Lakehouse tables:
| Dataflow refresh status | Action | Start time | End time | Duration | Engine | Refresh status | Status details |
|---|---|---|---|---|---|---|---|
| Failed | XXX_WriteToDataDestination | 2025-07-16T05:00:19.5439294+00:00 | 2025-07-16T05:01:03.8577795+00:00 | 0:00:44 | NA | Failed | There was a problem refreshing the dataflow: 'Data source credentials are missing or invalid. Please update the connection credentials in settings, and try again.'. Error code: 999999. (Request ID: xxxxxxx). |
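In case it helps anyone reproduce this, the same failure details can also be pulled programmatically from the item's job history instead of reading them in the portal. A minimal sketch from a Fabric notebook, assuming the sempy client that ships with Fabric notebooks; the workspace and item IDs are placeholders:

```python
# Pull recent refresh (job) runs for the Dataflow Gen2 item via the Fabric
# REST API, using the FabricRestClient bundled with Fabric notebooks.
import sempy.fabric as fabric

client = fabric.FabricRestClient()

workspace_id = "00000000-0000-0000-0000-000000000000"  # placeholder GUID
dataflow_id = "11111111-1111-1111-1111-111111111111"   # placeholder GUID

resp = client.get(
    f"v1/workspaces/{workspace_id}/items/{dataflow_id}/jobs/instances"
)
for run in resp.json().get("value", []):
    # failureReason carries the same credential error text as the portal log
    print(run["status"], run.get("startTimeUtc"), run.get("failureReason"))
```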
I have tried:
- Updating the credentials on the existing dataflow (both editing the existing connection and creating a new one)
- Re-creating the Dataflow Gen2, copying all of the queries over, and recreating the destinations
- Pointing the destinations at completely new destination tables within the Lakehouse
So far each attempt results in the Dataflow successfully previewing and saving in the builder, but failing the refresh with the same error. After changing the destination table, I did notice that the dataflow successfully creates the new tables, but they do not get populated with any data.
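To separate the dataflow's connection from the Lakehouse itself, one more check I can run is writing to the Lakehouse directly from a notebook in the same workspace. A minimal PySpark sketch, assuming the `spark` session predefined in Fabric notebooks and a notebook attached to the destination Lakehouse; the table name is a throwaway:

```python
# Quick write test against the same Lakehouse. If this succeeds, the
# Lakehouse accepts writes from my identity and the problem is scoped to
# the dataflow's connection/destination binding.
from pyspark.sql import Row

# "credential_write_test" is a throwaway table name used only for this check.
df = spark.createDataFrame([Row(id=1, note="write test")])
df.write.format("delta").mode("overwrite").saveAsTable("credential_write_test")

spark.sql("SELECT * FROM credential_write_test").show()
```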
Some extra details:
- I have the Admin role in the workspace that contains the Lakehouse and Dataflow Gen2 items
- I am the Owner of both the destination Lakehouse as well as the Dataflow Gen2
Are there any other troubleshooting or diagnostic steps I can try, or has anyone else hit this issue and been able to resolve it?
Hi @dgeebs,
If your destination is a Lakehouse, could you please check whether "Enable staging" is turned on for the queries in your Dataflow Gen2 (CI/CD)? Staging routes the load through a hidden staging Lakehouse before the final write, and that extra hop has been a known source of this credential error.
Kindly disable staging for all queries and try refreshing again.
Let me know if that resolves the issue.
Hi @dgeebs,
I faced the same issue where my Dataflow Gen2 would preview and save correctly, but fail on refresh with a "missing or invalid credentials" error at the WriteToDataDestination step.
Here’s what finally worked for me.
Regards,
Akhil.
Still no change. Weirdly enough, a dataflow from another workspace is able to write to the target table in the Lakehouse, but the Dataflow in the same workspace cannot.
For additional context, I set up a deployment pipeline and promoted both the Dataflow Gen2 and the Lakehouse through the Dev, Test, and Prod stages. What I'm finding is that the dataflow in the Dev-stage workspace can write to the table in the Prod workspace, but the same dataflow in the Prod-stage workspace cannot.
Is there a chance the Lakehouse permissions got messed up when the Lakehouse item was moved through the deployment pipeline? (I am Admin in all 3 workspaces that are part of the pipeline)
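If anyone wants to compare what the pipeline left behind in each stage, the workspace role assignments can be listed per workspace. A minimal sketch, again assuming the sempy client in a Fabric notebook; the workspace IDs are placeholders:

```python
# Compare role assignments across the Dev/Test/Prod workspaces to see
# whether promotion left the Prod workspace with different access.
import sempy.fabric as fabric

client = fabric.FabricRestClient()

workspace_ids = {
    "dev":  "00000000-0000-0000-0000-000000000001",  # placeholder GUIDs
    "test": "00000000-0000-0000-0000-000000000002",
    "prod": "00000000-0000-0000-0000-000000000003",
}

for stage, ws_id in workspace_ids.items():
    resp = client.get(f"v1/workspaces/{ws_id}/roleAssignments")
    print(stage, resp.json())
```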
Hi!
Is your Dataflow leveraging a Gateway by any chance? If yes, could you share what version of the Gateway you are using?
Hi miguel! No gateway is being used. Thanks!