Hello everyone,
we are running several Dataflow Gen2 dataflows that write data into a Lakehouse. These dataflows are triggered via refresh schedules, either daily or multiple times per day.
We repeatedly run into the problem that the dataflows fail from one day to the next, without any apparent reason. The failure almost always occurs in the step that writes the data into the Lakehouse.
The error message is:
"Data source credentials are missing or invalid. Please update the connection credentials in settings, and try again. Errorcode: 999999" (This error code 999999 has already been discussed several times in this forum.)
The workaround is to make a harmless modification: typically, we open the Advanced Editor on the last step and add a comment. After that, the dataflow runs again.
It is probably worth mentioning that we currently use user accounts for authentication. Best practice would likely be a Service Principal, but in our tests we experienced stability issues with that as well. And fundamentally, it should work with user accounts too, right?
Has anyone else experienced similar problems? And more importantly: Does anyone have an idea what’s going wrong here and how this issue can be permanently resolved?
Best regards,
Udo
Hi @Udo_S
This issue is a known limitation of Dataflow Gen2 when writing to a Lakehouse, and error code 999999 is misleading: the credentials are usually valid, and the failure is caused by token/session refresh problems in the backend. That is also why a small change in the Advanced Editor temporarily resolves it.

The most stable approach is to:
- use a Service Principal instead of user accounts,
- re-enter the credentials in Manage Connections to force a clean handshake, and
- avoid overlapping refreshes, which can trigger these failures.

At the moment there is no permanent fix on the customer side, but Microsoft is actively rolling out stability improvements. Until then, using a Service Principal and resetting the credentials is the best workaround.
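To make the Service Principal route concrete, here is a minimal sketch (not from this thread) that triggers a dataflow refresh with app-only authentication via MSAL and the Power BI REST API. All IDs and secrets are hypothetical placeholders, and the classic dataflows endpoints and the "InProgress" status value are assumptions that may or may not apply to Gen2 items in your tenant. The Service Principal also has to be allowed to use the APIs in the tenant settings and be added to the workspace.

```python
# Minimal sketch: refresh a dataflow with a Service Principal (app-only auth).
# Assumptions: placeholder IDs/secrets below, the `msal` and `requests`
# packages, and the classic Power BI "Dataflows" REST endpoints - whether
# they cover Dataflow Gen2 items in your tenant may vary.
import msal
import requests

TENANT_ID = "<tenant-id>"            # hypothetical placeholders
CLIENT_ID = "<app-registration-id>"
CLIENT_SECRET = "<client-secret>"
WORKSPACE_ID = "<workspace-guid>"
DATAFLOW_ID = "<dataflow-guid>"

# Client-credentials flow: no user account involved, so no user-token expiry.
app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
result = app.acquire_token_for_client(
    scopes=["https://analysis.windows.net/powerbi/api/.default"]
)
token = result["access_token"]  # KeyError here means authentication failed
headers = {"Authorization": f"Bearer {token}"}

base = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/dataflows/{DATAFLOW_ID}"
)

# Avoid overlapping refreshes: skip if the latest transaction is still
# running (the "InProgress" status value is an assumption).
latest = requests.get(f"{base}/transactions", headers=headers, timeout=30)
latest.raise_for_status()
transactions = latest.json().get("value", [])
if transactions and transactions[0].get("status") == "InProgress":
    print("Refresh already in progress, skipping.")
else:
    resp = requests.post(
        f"{base}/refreshes",
        headers=headers,
        json={"notifyOption": "NoNotification"},
        timeout=30,
    )
    resp.raise_for_status()
    print("Refresh triggered.")
```

Triggering refreshes from your own scheduler like this also gives you explicit control over overlap, instead of relying on the built-in refresh rules not firing concurrently.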
Hi @rohit1991,
Thanks for your super-fast help! We will try using the Service Principal.
Best Regards,
Udo