dgeebs
New Member

Dataflow Gen2 fails to refresh at '_WriteToDataDestination': 'Data source credentials are missing or invalid'

I have a Dataflow Gen2 (CI/CD) item that unexpectedly stopped refreshing successfully.

The dataflow collects data from several web sources, transforms it, and loads the data into Fabric Lakehouse Tables.


Reviewing the refresh logs, I can see that the refresh failure appears to have been caused by invalid or missing credentials when the dataflow attempts to write data to the Lakehouse tables:

Dataflow refresh status: Failed
Action: XXX_WriteToDataDestination
Start time: 2025-07-16T05:00:19.5439294+00:00
End time: 2025-07-16T05:01:03.8577795+00:00
Duration: 0:00:44
Engine: NA
Refresh status: Failed
Status details: There was a problem refreshing the dataflow: 'Data source credentials are missing or invalid. Please update the connection credentials in settings, and try again.'. Error code: 999999. (Request ID: xxxxxxx).


I have tried:

- Updating the credentials on the existing dataflow (both editing the existing connection and creating a new one)

- Re-creating the Dataflow Gen2, copying all of the queries over, and recreating the destinations

- Updating the destinations to be completely new destination tables within the Lakehouse

So far, each attempt results in the dataflow previewing and saving successfully in the builder, but failing the refresh with the same error. After changing the destination table, I did notice that the dataflow successfully creates the new tables, but they do not get populated with any data.
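For reference, one way to confirm from a Fabric notebook attached to the Lakehouse that a destination table was created but never loaded. This is a minimal sketch; the table name is a placeholder, and `spark` is the SparkSession that Fabric notebooks predefine:

```python
# Minimal sketch: run in a Fabric notebook with the destination Lakehouse attached.
# "my_destination_table" is a placeholder for one of the dataflow's output tables.
tables = spark.catalog.listTables()              # tables visible in the attached Lakehouse
print([t.name for t in tables])

df = spark.read.table("my_destination_table")
df.printSchema()                                 # schema exists if the dataflow created the table
print("row count:", df.count())                  # 0 rows means the write step never loaded data
```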

Some extra details:
 - I have the Admin role in the workspace that contains the Lakehouse and Dataflow Gen2 items

 - I am the Owner of both the destination Lakehouse and the Dataflow Gen2

 

Are there any other troubleshooting or diagnostic steps I can try, or has anyone else had this issue and been able to resolve it?

5 REPLIES
bhavya5903
Advocate I

Hi @dgeebs,

If your destination is a Lakehouse, could you please check if "Enable staging" is checked for the queries in your Dataflow Gen2 CI/CD?

  • If it is enabled, try unchecking it and then re-run the refresh. We've seen similar issues where staging being enabled causes credential-related failures during write operations to the Lakehouse.
  • Even if it's enabled for just one query, it can cause refresh failures with errors related to missing or invalid credentials when writing to the Lakehouse.


Kindly uncheck staging for all queries and try refreshing again.

Let me know if that resolves the issue.

v-agajavelly
Community Support

Hi @dgeebs ,

I faced the same issue where my Dataflow Gen2 would preview and save correctly, but fail on refresh with a "missing or invalid credentials" error at the WriteToDataDestination step.

Here’s what finally worked for me.

  1. Delete and Recreate the Lakehouse Connection
    Go to Manage connections → Delete the old Lakehouse connection → Recreate it using OAuth (Organizational Account). (See the sketch after this list for one way to list the existing connections.)
  2. Reconfigure the Destination Table in Dataflow
    Remove the current output table and re-add it with auto-create table enabled, or use a new table name.
  3. Verify Lakehouse Permissions
    Make sure your user has Build + Read/Write permissions on the Lakehouse (even if you're the owner).
  4. Click "Publish" After Changes
    After editing connections/destinations, don’t forget to publish the dataflow before refreshing.
  5. Optional: Test with a Small Flow
    Create a minimal test Dataflow writing to a new Lakehouse table to isolate the issue.
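
For step 1 above, here is a rough sketch of listing your connections programmatically so the stale Lakehouse connection can be identified before deleting it. It assumes the Fabric REST API's List Connections endpoint and a valid Microsoft Entra access token with a Fabric scope; token acquisition is omitted, and the response field names are taken from the public docs, so treat them as assumptions:

```python
import requests

# Sketch only: assumes the Fabric REST API "List Connections" endpoint
# (GET https://api.fabric.microsoft.com/v1/connections) and a valid Microsoft Entra
# access token. Token acquisition (MSAL, az cli, etc.) is omitted; the value below
# is a placeholder.
ACCESS_TOKEN = "<entra-access-token>"

resp = requests.get(
    "https://api.fabric.microsoft.com/v1/connections",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()

# Print connections whose connection type mentions Lakehouse, so the stale one can be
# spotted and then deleted/recreated in Manage connections.
for conn in resp.json().get("value", []):
    details = conn.get("connectionDetails", {})
    if "lakehouse" in str(details.get("type", "")).lower():
        print(conn.get("id"), conn.get("displayName"), details.get("path"))
```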


Regards,
Akhil.

dgeebs

Still no change. Weirdly enough, a dataflow from another workspace is able to write to the target table in the Lakehouse, but the Dataflow in the same workspace cannot.

 

For additional context, I set up a deployment pipeline and promoted both the Dataflow Gen2 and the Lakehouse through the Dev, Test, and Prod stages. What I'm finding is that the dataflow in the Dev-stage workspace can write to the table in the Prod workspace, but the same dataflow in the Prod-stage workspace cannot write to the table in the Prod workspace.

Is there a chance the Lakehouse permissions got messed up when the Lakehouse item was moved through the deployment pipeline? (I am Admin in all three workspaces that are part of the pipeline.)
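
One way to sanity-check that is to compare role assignments between the Dev and Prod workspaces. A rough sketch, assuming the Fabric REST API's List Workspace Role Assignments endpoint, a valid Microsoft Entra access token, and placeholder workspace IDs (the response field names are assumptions based on the public docs):

```python
import requests

# Sketch only: assumes GET https://api.fabric.microsoft.com/v1/workspaces/{id}/roleAssignments
# and a valid Microsoft Entra access token. All IDs and tokens below are placeholders.
ACCESS_TOKEN = "<entra-access-token>"
WORKSPACES = {
    "dev": "<dev-workspace-id>",
    "prod": "<prod-workspace-id>",
}

def role_assignments(workspace_id: str) -> list:
    """Return the raw role-assignment entries for one workspace."""
    resp = requests.get(
        f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/roleAssignments",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    )
    resp.raise_for_status()
    return resp.json().get("value", [])

# A principal that holds a role in Dev but is missing (or downgraded) in Prod would
# explain why only the Dev-stage dataflow can write to the Prod Lakehouse table.
for stage, ws_id in WORKSPACES.items():
    print(f"--- {stage} workspace ---")
    for ra in role_assignments(ws_id):
        principal = ra.get("principal", {})
        print(principal.get("displayName"), principal.get("type"), "->", ra.get("role"))
```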

miguel
Community Admin

Hi!

Is your Dataflow leveraging a Gateway by any chance? If yes, could you share what version of the Gateway you are using?

dgeebs

Hi miguel! No gateway is being used. Thanks!
