Hi,
We've noticed issues ingesting data into both Lakehouse and Warehouse Fabric artifacts using Dataflow Gen2. We consistently get the following error during the 'write data to destination' activity:
104100 Couldn't refresh the entity because of an internal error
This happens with both SSMI connectors using organizational authentication and PostgreSQL connectors using Basic authentication.
The gateway is currently on version 3000.202.16; we upgraded from the July 23 release in hopes of remedying the issue, but we saw the same error both before and after the upgrade.
We have found that VNet gateways work; however, we would ideally like to use the on-premises gateway since we already have those deployed. Additionally, we can't use organizational authentication with the VNet gateway because of the following known issue with VNet gateways:
VNet data gateways don't support conditional access policies. When conditional access policies are enabled, Power BI shows a "DM_GWPipeline_Client_OAuthTokenLoginFailedError" error when you try to update credentials using the OAuth authentication type.
Any suggestions are greatly appreciated.
Hi @khaskett
Thanks for using Microsoft Fabric Community.
Apologies for the inconvenience that you are facing here.
When using Microsoft Fabric Dataflow Gen2 with an on-premises data gateway, you might encounter issues with the dataflow refresh process. The underlying problem occurs when the gateway is unable to connect to the dataflow staging Lakehouse in order to read the data before copying it to the desired data destination. This issue can occur regardless of the type of data destination being used.
Please refer to the documentation for additional information.
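As a quick way to check whether the gateway machine can reach the staging Lakehouse at all, you could run a simple TCP reachability probe from that machine. This is only a minimal sketch, not an official diagnostic: the host name below is a placeholder for your workspace's staging SQL endpoint, and the port and endpoint details may differ in your environment.

```python
import socket

def can_connect(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        # create_connection resolves the host and attempts the TCP handshake.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers DNS failures, refused connections, and timeouts.
        return False

if __name__ == "__main__":
    # Placeholder: substitute your workspace's actual staging endpoint here.
    host = "your-workspace.datawarehouse.fabric.microsoft.com"
    print("reachable" if can_connect(host, 1433) else "blocked")
```

If this prints "blocked" on the gateway machine but works elsewhere on the network, a firewall or proxy rule on that machine is the likely culprit.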
I hope this information helps. Please do let us know if you have any further questions.
Thanks.
Thanks for your reply. The explanation makes sense regarding the connection between the staging Lakehouse and the gateway.
We've added a firewall rule to allow the connection and now everything works as hoped.
Thank you!
Hi @khaskett
Glad that your query got resolved.
Please continue using Fabric Community for any help regarding your queries.
Hi
we're facing similar issues, but what's bizarre is that the data is actually loaded into the staging Lakehouse. So somehow the gateway can reach it, but when the dataflow tries to write from staging to the destination Lakehouse, it doesn't work. And the error messages are not helpful at all...
KR