March 31 - April 2, 2025, in Las Vegas, Nevada. Use code MSCUST for a $150 discount! Early bird discount ends December 31.
Problems running Dataflows within Fabric.
I'm trying to follow this step-by-step guide:
https://learn.microsoft.com/en-us/power-bi/fundamentals/fabric-get-started
I was able to create this Dataflow and a lakehouse.
I'm having issues when refreshing the Dataflow from MyWorkspace.
Error: Data source credentials are missing or invalid. Please update the connection credentials in settings, and try again.
When I open the Dataflow and connect with my organisational account it works, but when I go back a few moments later the credential is gone again. Any ideas or tips?
Also: when creating a Data Pipeline I'm getting an error: "Failed to load dataflows. Request failed with status code 401."
Thanks!
We believe this is an issue specific to My Workspace which occurs in either of the following cases:
1) A query that references another query which has Load enabled
2) A query that outputs to a destination and also has Load enabled
We are investigating the issue and will release a fix as quickly as feasible.
In the interim we believe it should be possible to temporarily work around these issues by using a workspace other than My Workspace.
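For reference, the first failing case above (a query that references another query which has Load enabled) can be reproduced with a minimal Power Query M sketch. The query names, column names, and sample data here are illustrative, not from the original report:

```
// Query 1: "Staging" — a simple inline table, with "Enable load" left ON.
let
    Source = #table(type table [Id = number, Name = text], {{1, "A"}, {2, "B"}})
in
    Source

// Query 2: references Staging by name and ALSO has "Enable load" ON —
// this referencing-plus-load combination is the one reported to fail
// in My Workspace.
let
    Source = Staging,
    Filtered = Table.SelectRows(Source, each [Id] > 1)
in
    Filtered
```

Disabling load on the staging query, or moving the dataflow to a workspace other than My Workspace, avoids the reported failure mode.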
Thanks.
The issues specific to "My Workspace" are now resolved. For on-premises scenarios, you need the latest version of the Gateway (3000.174.13, published on June 2nd).
Thanks.
Hi SidJay,
Thank you for your help.
I've updated the gateway as requested.
I'll keep you posted should the issue return.
KR.
I have the exact same issue, but in a workspace that is not My Workspace, so the suggested workaround does not help. It's the same scenario with the organizational logins, and the writes fail due to the token issue described. The pull from our Azure DB works just fine. Are you aware that this issue is not limited to My Workspace?
I also get the same error when pulling data from the Lakehouse with the Gen2 dataflow. I used Copy to successfully load data into the lakehouse I had just created, then attempted to use that lakehouse as the source and got the login error.
Hello @Dave_De,
would it be possible to share a screenshot of the failure you are hitting with Request/Session IDs so we can drill down into the issue?
Thanks,
Frank.
I believe I may have found the issue. If you change the name of the lakehouse, even before you have set it as a destination in the dataflow, the refresh will fail; changing it back to the original name lets it go through. Below are the screenshots:
Same problem, once I used a different workspace it worked fine 🙂
Thanks! When running my test flow in another workspace it works.
Something additional I noticed:
It seems the resources I created are (correctly) owned by my personal account.
But the auto-created ones have an owner called "My Workspace".
Could this cause the authentication issues with the credentials that aren't being saved?
Hello @FilipVDR,
thanks for reporting the issue. Would you be able to share additional information for both cases mentioned above:
- are you using a gateway to load data to the lakehouse destination?
- would you be able to share the RequestID/SessionId from the refresh operations in both cases? This will help with troubleshooting. You can find this information in the refresh history page:
Thanks,
Frank.
Hi all,
Same issue here. It seems like the credentials used to set up the data connection are not being saved.
Yes, that's right: when I go back to the dataflow and look at the connection used to write to the lakehouse, I have to sign in again every time.
Hi Frank,
- No, even if I create a dataflow and just enter raw data (one row, one column), it fails.
-
It seems I got a similar error: even when I create a new dataflow, enter data with one column and one row, and try to write it to the lakehouse, I get an error.
It seems that the dataflow can't save my connection using my Organizational Account.
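For anyone trying to reproduce this, the minimal repro described above (a single-column, single-row table written to a lakehouse destination) can be sketched in Power Query M. The column name and value here are illustrative:

```
// Minimal dataflow query: one column, one row, entered as inline data.
// Set the lakehouse as the output destination in the dataflow UI;
// the credential attached to that destination is what fails to persist.
let
    Source = #table(type table [Column1 = text], {{"test"}})
in
    Source
```

Even a trivial query like this triggers the credential error, which suggests the problem is in saving the destination connection, not in the query logic itself.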