dgeebs
Regular Visitor

Dataflow Gen2 fails to refresh - '_WriteToDataDestination' - 'Data source credentials are missing or invalid'

I have a Dataflow Gen 2 (CICD) item that unexpectedly stopped refreshing successfully.

The dataflow collects data from several web sources, transforms it, and loads the data into Fabric Lakehouse Tables.


Reviewing the refresh logs, I can see that the refresh failure appears to have been caused by invalid or missing credentials when the dataflow attempts to write data to the Lakehouse tables:

Dataflow refresh status: Failed
Action: XXX_WriteToDataDestination
Start time: 2025-07-16T05:00:19.5439294+00:00
End time: 2025-07-16T05:01:03.8577795+00:00
Duration: 0:00:44
Engine: NA
Refresh status: Failed
Status details: There was a problem refreshing the dataflow: 'Data source credentials are missing or invalid. Please update the connection credentials in settings, and try again.'. Error code: 999999. (Request ID: xxxxxxx).


I have tried:

- Updating the credentials on the existing dataflow (both editing the existing connection and creating a new one)

- Re-creating the Dataflow Gen2, copying all of the queries over, and recreating the destinations

- Updating the destinations to point at completely new destination tables within the Lakehouse

So far each attempt results in the Dataflow successfully previewing and saving in the builder, but failing the refresh with the same error. After changing the destination table, I did notice that the dataflow successfully creates the new tables, but they do not get populated with any data.

Some extra details:
 - I have the Admin role inside of the workspace that contains the Lakehouse and Dataflow Gen2 items

 - I am the Owner of both the destination Lakehouse as well as the Dataflow Gen2

 

Are there any other troubleshooting/diagnostic steps I can try or has anyone else had this issue and been able to resolve it?

1 ACCEPTED SOLUTION
v-agajavelly
Community Support

Hi @dgeebs ,

Disabling “Enable staging” was a good tip, but in my case, the real issue came down to broken permissions and metadata bindings after deployment pipeline promotion.

  1. Reapplied Lakehouse permissions manually in the Prod workspace
    → Gave my user Build + Read/Write access again
  2. Rebound the destination table in the Prod-stage Dataflow
    → Removed and re-added the Lakehouse output
  3. Published and refreshed the dataflow, which then completed successfully

After using pipelines, always verify Lakehouse permissions and rebind Dataflow destinations, even if everything looks fine.

Regards,
Akhil.


9 REPLIES
v-agajavelly
Community Support

Hi @dgeebs ,

I hope the response was helpful in resolving your issue. If you have any further questions, please don't hesitate to let us know; we'll be happy to assist.

Regards,
Akhil.

v-agajavelly
Community Support

Hi @dgeebs ,

Great to hear you tracked down the root cause and got it working. Thanks for sharing the detailed steps; they'll definitely help others running into the same issue. Just curious, after reapplying the permissions and rebinding, have things stayed stable across subsequent pipeline promotions?

Regards,
Akhil.


Thanks @v-agajavelly for your help and support! Apologies for disappearing for a while; I was finally able to determine what the issue was with my dataflow.

 

In my case, I had a dataflow which used a function to fetch data via a REST API for each row in a source table. Although the preview was succeeding in the Power Query UI, after further digging I found that one of the data source APIs had become inaccessible due to broken credentials. Once I removed the row from the source table that was failing to collect its data from its respective data source API, the dataflow succeeded.

 

What I or fellow readers can take from this is that even if the error message below shows up on the XXX_WriteToDataDestination step:
There was a problem refreshing the dataflow: 'Data source credentials are missing or invalid. Please update the connection credentials in settings, and try again.'.
it may not mean the failure lies specifically in writing your data to your destination Lakehouse - the real problem may be a credential issue with fetching data from an external data source somewhere along the way.

 

Make sure to check every step of your dataflow for errors and you'll likely find where your connection issues lie.
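
For anyone hitting something similar, here's a rough sketch of how the per-row fetch can be wrapped in try so a source with broken credentials surfaces as a row-level status you can see in the preview, rather than only as the generic destination error at refresh time. All names and URLs here (FetchFromApi, SourceTable, EndpointUrl, example.com) are placeholders, not my actual queries:

    let
        // Hypothetical helper standing in for the real per-row REST call
        FetchFromApi = (url as text) as any =>
            Json.Document(Web.Contents(url)),

        // Placeholder for the real source table that drives the per-row calls
        SourceTable = #table(
            type table [Name = text, EndpointUrl = text],
            {
                {"SourceA", "https://example.com/api/a"},
                {"SourceB", "https://example.com/api/b"}
            }
        ),

        // try captures each failure on its own row, so previewing this step shows
        // exactly which source can no longer be reached
        WithStatus = Table.AddColumn(
            SourceTable,
            "FetchStatus",
            each if (try FetchFromApi([EndpointUrl]))[HasError]
                then "FAILED - check this source's connection and credentials"
                else "OK"
        )
    in
        WithStatus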

bhavya5903
Advocate II

Hi @dgeebs,

If your destination is a Lakehouse, could you please check if "Enable staging" is checked for the queries in your Dataflow Gen2 CI/CD?

  • If it is enabled, try unchecking it and then re-run the refresh. We've seen similar issues where staging being enabled causes credential-related failures during write operations to the Lakehouse.
  • Even if it's enabled for just one query, it can cause refresh failures with errors related to missing or invalid credentials when writing to the Lakehouse.


Kindly uncheck staging for all queries and try refreshing again.

Let me know if that resolves the issue.

v-agajavelly
Community Support

Hi @dgeebs ,

I faced the same issue where my Dataflow Gen2 would preview and save correctly, but fail on refresh with a "missing or invalid credentials" error at the WriteToDataDestination step.

Here’s what finally worked for me.

  1. Delete and Recreate the Lakehouse Connection
    Go to Manage connections → Delete the old Lakehouse connection → Recreate it using OAuth (Organizational Account).
  2. Reconfigure the Destination Table in Dataflow
    Remove the current output table, and re-add it with auto-create table enabled or use a new table name.
  3. Verify Lakehouse Permissions
    Make sure your user has Build + Read/Write permissions on the Lakehouse (even if you're the owner).
  4. Click "Publish" After Changes
    After editing connections/destinations, don’t forget to publish the dataflow before refreshing.
  5. Optional: Test with a Small Flow
    Create a minimal test Dataflow writing to a new Lakehouse table to isolate the issue (see the sketch below).
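
For step 5, a minimal test query along these lines (constant data only, with a placeholder name) can be pointed at a brand-new Lakehouse table in the data destination settings. If it refreshes successfully, the Lakehouse write path and its credentials are fine, and the problem is more likely in one of the real source queries:

    let
        // Tiny constant table with no external sources - map its output to a new
        // Lakehouse table in the data destination settings of the test dataflow
        TestOutput = #table(
            type table [Id = Int64.Type, Name = text],
            {
                {1, "test row one"},
                {2, "test row two"}
            }
        )
    in
        TestOutput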


Regards,
Akhil.

Still no change. Weirdly enough, a dataflow from another workspace is able to write to the target table in the Lakehouse, but the Dataflow in the same workspace cannot.

 

For additional context, I tried setting up a deployment pipeline, and the Dataflow Gen2 and Lakehouse were both promoted through the Dev, Test, and Prod stages. What I'm finding is that the dataflow in the Dev-stage workspace can write to the table in the Prod workspace, but the same dataflow in the Prod-stage workspace cannot.

Is there a chance the Lakehouse permissions got messed up when the Lakehouse item was moved through the deployment pipeline? (I am Admin in all 3 workspaces that are a part of the pipeline)

miguel
Community Admin

Hi!

Is your Dataflow leveraging a Gateway by any chance? If yes, could you share what version of the Gateway you are using?

Hi miguel! No gateway is being used. Thanks!
