Hi,
I am setting up an extraction step in my medallion architecture in Fabric, where I want to get data from an on-premises SQL Server. I have installed the on-premises data gateway (latest release), and I can find the table I want to extract when I set up a new Dataflow Gen 2 artifact. I choose my lakehouse (both the destination and the Dataflow Gen 2 artifact exist in the same workspace) and publish the dataflow.
All the staging artifacts are created (DataflowStagingLakehouse, SQL endpoint etc.) and the dataflow is published correctly. However, when running the dataflow I get an error saying that the activity "WriteToDatabaseTableFrom_TransformForOutputToDatabaseTableFrom_****_*******" fails (the asterisks stand for my table name). The only error message I get is this:
"GatewayCannotAccessSqlError Couldn't refresh the entity because of an internal error"
*the message text is in Swedish
The data is stored in the DataflowStagingLakehouse but never arrives in the designated lakehouse. If I remove the destination from the dataflow, the job runs fine without errors.
What I have done so far:
Is there anyone who has any idea what the issue might be?
We have now managed to get the data into the lakehouse. Great success!
The information I got from our hosting partner was that traffic going to pbipneu5-northeurope.pbidedicated.windows.net was blocked. When this block was lifted, it started working.
New rule in firewall:
Protocol: TCP
Endpoint: pbipneu5-northeurope.pbidedicated.windows.net
Port: 1433
The pbipneu5-northeurope part depends on which region you have chosen.
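If you suspect a similar firewall block, a quick way to check from the gateway machine whether the endpoint is reachable on port 1433 is a small TCP probe (a minimal Python sketch; `Test-NetConnection` in PowerShell does the same on Windows — the endpoint below is just the one from my region, substitute your own):

```python
import socket

def can_reach(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example for the endpoint in this thread (adjust to your region):
# can_reach("pbipneu5-northeurope.pbidedicated.windows.net", 1433)
```

If this returns False from the gateway host but True from an unrestricted network, the firewall rule above is the likely fix.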
Hope this helps.
//Mikael
Hi @MikaelDavidsson ,
I did exactly the same thing a few weeks back, and I can confirm that the approach I followed for getting on-premises SQL Server data into a Lakehouse using Dataflow Gen 2 works. I tried it again a few minutes ago by bringing in and writing a new table to the Lakehouse.
I am not sure where things have gone wrong for you.
I wrote a blog post about the whole process and the configuration I followed to set this up on my end.
It's worth checking; maybe you can spot a difference:
https://datavibe.co.uk/getting-on-premises-sql-server-data-to-fabric-lakehouse/
Hi, did you ever get to the bottom of this? I've checked all the things you mentioned you'd already done and I'm out of ideas! Thanks
I'm not entirely sure we can reach a conclusion with the information provided, but given what you've described, using the latest gateway and making sure it's not a network firewall issue are the first things I'd check.
I'd highly recommend reaching out to our support team so they can take a closer look at your scenario and try to troubleshoot.
I get the same message, although I want to write to a Warehouse.
The gateway connection is working for Power BI.
Data is present in the staging lakehouse.
I've tried to write data both with and without staging.
We use Finnish_Swedish_CI_AS as the collation, and I've tried both the English and Swedish locale for the dataflow.