
Udo_S
Helper I

Stability Issues when Writing to Lakehouse via Dataflow Gen2 (Invalid credentials, Error code 999999)

Hello everyone,


we are running several Dataflow Gen2 dataflows that write data into a Lakehouse. These dataflows are triggered by refresh schedules, either daily or multiple times per day.


We repeatedly encounter the problem that the dataflows fail for no apparent reason, from one day to the next. The issue almost always occurs in the step that writes the data into the Lakehouse.


The error message is:
"Data source credentials are missing or invalid. Please update the connection credentials in settings, and try again. Errorcode: 999999" (This error code 999999 has already been discussed several times in this forum.)


The workaround is to make a harmless modification. Typically, we open the Advanced Editor in the last step and add a comment. After that, the flow works again.
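To illustrate: the modification is nothing more than a comment appended to the query's M code in the Advanced Editor (a simplified, hypothetical query; our real queries are of course more involved):

```powerquery-m
let
    // Hypothetical source step; our actual steps read and transform real data
    Source = Lakehouse.Contents(null),
    // 2025-11-20: harmless comment added so the dataflow is re-published
    Output = Source
in
    Output
```

Saving the query after such a no-op edit is enough to make the next refresh succeed again.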


It is probably worth mentioning that we currently use user accounts for authentication. Best practice would presumably be a Service Principal, but in our tests we saw stability issues with that as well. And fundamentally, it should also work with user accounts, right?


Has anyone else experienced similar problems? And more importantly: Does anyone have an idea what’s going wrong here and how this issue can be permanently resolved?


Best regards,
Udo

1 ACCEPTED SOLUTION
rohit1991
Super User

Hi @Udo_S 

 

This is a known limitation of Dataflow Gen2 writing to a Lakehouse. Error code 999999 is misleading: the credentials are usually valid, and the failure is caused by token/session refresh problems in the backend. That is also why a small change in the Advanced Editor temporarily resolves it.

The most stable approach is to:

- use a Service Principal instead of user accounts,
- re-enter the credentials in Manage Connections to force a clean handshake, and
- avoid overlapping refreshes, which can trigger these failures.

At the moment there is no permanent fix on the customer side, but Microsoft is actively rolling out stability improvements. Until then, using a Service Principal and resetting the credentials is the best workaround.
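On the "avoid overlapping refreshes" point: a quick way to sanity-check your schedules is to compare each dataflow's refresh window (start time plus typical run duration). A minimal sketch in Python, with hypothetical schedules and durations:

```python
from datetime import datetime, timedelta

def refresh_windows_overlap(start_a, duration_a, start_b, duration_b):
    """Return True if the refresh windows [start, start + duration) intersect."""
    return start_a < start_b + duration_b and start_b < start_a + duration_a

# Hypothetical example: two dataflows scheduled 15 minutes apart,
# each typically running for about 20 minutes -> they overlap.
a_start = datetime(2025, 1, 1, 6, 0)
b_start = datetime(2025, 1, 1, 6, 15)
print(refresh_windows_overlap(a_start, timedelta(minutes=20),
                              b_start, timedelta(minutes=20)))  # True
```

If two dataflows overlap like this, staggering their schedules so one finishes before the next starts removes one known trigger for these failures.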


Did it work? ✔ Give a Kudo • Mark as Solution – help others too!


2 REPLIES
Udo_S
Helper I

Hi @rohit1991,
Thanks for your super-fast help! We'll give the Service Principal a try.
Best Regards,
Udo

