
MS Fabric - Dataflow Gen 2 - Load On-Premise Oracle/MS SQL to Lakehouse error


I tried to load Oracle/MS SQL data through an on-premises gateway in Dataflow Gen 2. I can see the tables, load a table in the editor, and see the columns and data. But when I set the destination to Lakehouse, I always receive this error message:


An exception occurred: The given data source kind is not supported. Data source kind: Lakehouse. 


I tried a CSV file from SHP, and that worked: I was able to load the data into the Lakehouse.

I know MS Fabric is in preview, but I cannot find anywhere that this feature (on-premises Oracle/MS SQL) is not yet supported.



Status: Investigating

Hi @all,


In order to better investigate your problem, could you provide detailed reproduction steps and version information (including the version of the gateway)?


Best regards.
Community Support Team_Caitlyn

Helper II

Yes, it is weird. Only our department is testing the new Fabric features. No one was able to load on-prem data, only data from cloud sources like SHP Lists.

Responsive Resident

I am also getting the named-pipes error on refresh when trying to publish the dataflow (source is SQL Server) directly to a separate lakehouse. Loading it into the lakehouse that the dataflow automatically creates works, but the column names are not populated correctly (they revert to "Column1", "Column2", etc. instead of the names of the columns in Power Query). I am also able to create a shortcut in a separate lakehouse, but the generic column names can't be adjusted, so it is hard to work with.
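As a stop-gap for the generic names, one approach (a sketch, not an official fix) is to rename the columns after loading, mapping Column1, Column2, ... back to the names defined in the Power Query step by position. A minimal Python sketch; the sample rows and expected names below are made-up examples:

```python
# Hypothetical example: a table landed with generic column names.
rows = [
    {"Column1": 1, "Column2": "Alice"},
    {"Column1": 2, "Column2": "Bob"},
]

# The real names as defined in the Power Query step (assumed known).
expected_names = ["OrderId", "Customer"]

def rename_generic_columns(rows, expected_names):
    """Map Column1, Column2, ... back to the expected names by position."""
    mapping = {f"Column{i + 1}": name for i, name in enumerate(expected_names)}
    return [{mapping.get(k, k): v for k, v in row.items()} for row in rows]

fixed = rename_generic_columns(rows, expected_names)
```

This only works if the column order in the lakehouse table matches the order of the steps in Power Query, which is an assumption worth verifying first.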

Community Support

Hi all


Sorry for the late reply. Here is an update on this issue.



  • Refresh of a Dataflow Gen 2 with an output destination fails via a gateway. Gateway refreshes in Dataflows Gen2 write directly from the gateway to the destination (Lakehouse, Data Lake, etc.).
  • Dataflows through a gateway may fail to reach the Data Lake, Lakehouse, or other destination if the gateway is not configured to allow connections directly to that destination.


Solutions and Workarounds

There is not going to be a fix for this; it is by design. Please refer to the details below to work around the situation:

  • The gateway must be configured to pass through the firewall or proxy to reach the destination data source. If you are using a proxy server, this may require allow-listing the appropriate destination URLs, i.e. (* for Lakehouse, (* for Data Lake, etc.
  • Users with Lakehouse destinations must be running at least the May 2023 release of the gateway (the connector is not available in gateway releases prior to this one).
  • Public doc updates for this are now live.
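The gateway version requirement above can be checked mechanically. A minimal sketch assuming dotted numeric gateway versions; the minimum shown is a placeholder, not the actual May 2023 build number:

```python
def parse_version(v):
    """Parse a dotted gateway version like '3000.182.5' into a tuple of ints."""
    return tuple(int(part) for part in v.split("."))

def version_at_least(installed, minimum):
    """True if the installed gateway version meets the minimum requirement."""
    return parse_version(installed) >= parse_version(minimum)

# Placeholder minimum; substitute the real May 2023 gateway build number.
MINIMUM = "3000.0.0"
print(version_at_least("3000.182.5", MINIMUM))
```

Tuple comparison handles the dotted segments numerically, so "3000.182.5" correctly compares above "3000.9.0" even though a plain string comparison would not.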


Best Regards,
Community Support Team _ Jing

New Member

Hi @DavidPi 


Were you able to solve this issue? We have run into it as well. We are on the latest version of the gateway (3000.182.5), are not using a proxy, and the network test within the on-premises gateway indicates that we can reach all the necessary ports.

Are there any other troubleshooting steps?
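One additional low-level check worth running on the gateway machine itself is a raw TCP connect to the destination endpoint, independent of the gateway's built-in network test. A sketch; the host and port are parameters, and since the endpoint URLs in the post above are truncated, no real Fabric endpoint is hard-coded here:

```python
import socket

def can_reach(host, port, timeout=5.0):
    """Attempt a plain TCP connection; True if the endpoint accepts it."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Run this on the gateway server against each destination endpoint, e.g.:
# can_reach("<lakehouse-endpoint>", 443)
```

A successful TCP connect rules out basic firewall blocking but not TLS inspection or proxy interference, so it narrows the problem down rather than fully diagnosing it.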


Advocate II

No luck. I just attempted it again and could not connect to an on-prem SQL Server within Data Lake to ingest data.

Helper II



I am a step further but not finished yet.

I had to allow these addresses through the firewall from our data gateway server (* and * - see


Now I receive a new error saying that my gateway is too old, so I asked IT Ops to upgrade our gateway, and I am waiting for that. I hope this will help.



Helper II

We have upgraded our gateway and now receive this error message.

I really do not know how to solve this.