TK12345
Resolver II

Copy Data with VNET gateway error

Hi all, 

 

I have an issue with my SQL Server / Azure SQL Database extraction. As you can see, I connected to my source using a VNET gateway, and I do see the preview data. The table contains 37 rows, but when I try to run the copy activity in my pipeline, or a simple copy job, I get this error:

ErrorCode=LakehouseOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Lakehouse operation failed for: Operation returned an invalid status code 'InternalServerError'. Workspace: 'xxxxx'. Path: 'xxxxx/Tables/dbo'. Message: 'Internal Server Error'. TimeStamp: 'Thu, 07 May 2026 06:39:40 GMT'..,Source=Microsoft.DataTransfer.ClientLibrary,''Type=Microsoft.Azure.Storage.Data.Models.ErrorSchemaException,Message=Operation returned an invalid status code 'InternalServerError',Source=Microsoft.DataTransfer.ClientLibrary,'

 

I've tried different lakehouses as the destination, a new pipeline, running with a query, writing as Parquet, all the simple stuff, but I still get this message. Is there something I'm missing?

Hope someone could help me out.

 

[Screenshot: TK12345_0-1778136318278.png]

 

5 REPLIES
arabalca
Solution Supplier

Hi @TK12345 ,

 

Before continuing the investigation, we need to clarify a couple of things.

Are you using Azure SQL Database / Azure SQL Managed Instance in the cloud, or is it a SQL Server on a VM or on-premises environment? Depending on this, the diagnosis changes:

  • If it is on-premises, first verify that the gateway is active and working correctly, since a gateway with issues can allow the preview to work but still fail during the full pipeline run.
  • If it is Azure cloud, although using the VNET is valid and adds an extra security layer, try creating the same Copy Data activity using a direct connection without the VNET. If the pipeline works correctly with that direct connection, the problem is confirmed to be in the gateway and not in the source or destination.

In any case, verify that the connection is correctly configured and in active status from the Connections section in Fabric before running the pipeline again.

This is the quickest way to identify where the error is really coming from.

Hope this helps.

 

If my comments helped solve your question, it would be great if you could give them a like and mark this as the accepted solution. It helps others with the same issue and also motivates me to keep contributing.

 

Thanks a lot, I really appreciate it.

v-aatheeque
Community Support

Hi @TK12345 
Thanks for reaching out to fabric community forum.

From the screenshot, we can confirm that the source-side connectivity appears to be working correctly because the preview data is loading successfully from the SQL source.

This helps narrow the issue down to the Lakehouse destination/write operation rather than the SQL extraction itself.

Could you please also share a screenshot of the Destination tab configuration from the same Copy Activity?

Additionally, could you help confirm the following:

  • Which Lakehouse is selected as the destination, and how is the connection configured?
  • Are there multiple pipelines or copy activities running in parallel and writing into the same Lakehouse/path simultaneously, especially creating Parquet files?
  • Are you writing into the Tables section or the Files section of the Lakehouse?

These details will help determine whether the issue is related to destination configuration, concurrent write operations or backend behavior.

Hi, 

Thanks for your reply. I will share a screenshot of the destination. FYI, I've already tested with a new lakehouse, our normal lakehouse connection, Delta, Parquet, a new schema, dbo, and Parquet in the Files section. Besides that, this pipeline is just a duplicate of another pipeline that works fine with the exact same logic. There are no multiple pipelines running, just one, with one table.

Preferably I would write this data into Tables, but I've checked all the options, all with the same error message.

[Screenshot: TK12345_0-1778152141103.png]

 

Hi @TK12345 

Thank you for the clarification.
Since another pipeline with the same logic is working successfully, this helps narrow the issue specifically to this pipeline/activity configuration or the source table metadata rather than the VNET gateway or Lakehouse itself.

Could you please try the following:

  • Create a new Copy Activity in the same pipeline and configure the source/destination manually instead of duplicating the existing activity.
  • Test with a minimal query:

SELECT TOP 10 *
FROM YourTable

  • If schema mapping is enabled, remove and re-import the schema mapping.
  • Also verify whether the source table contains unsupported datatypes or special characters in column names.
This will help determine whether the issue is related to the specific pipeline configuration or the source table metadata.
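As a quick way to check that last point, you could run something like the query below against the source database. This is just a sketch: `dbo` and `YourTable` are placeholders for your actual schema and table names.

```sql
-- List the columns of the source table with their datatypes,
-- so unsupported types or unusual column names stand out.
SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = 'dbo'
  AND TABLE_NAME = 'YourTable'
ORDER BY ORDINAL_POSITION;
```

Column names containing spaces, dots, or other special characters, as well as types like sql_variant or geography, are worth a closer look, since they can trip up the Lakehouse write even when the preview works.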

Hi v-aatheeque,
Thanks for helping. 

I've tested your suggestions again (I had tried them before myself), but it's still the same issue. I created a new pipeline just from the UI, with a new connection and a new lakehouse, and just a simple select of the top 10 rows. I can still see the preview data, but I get the same error on the target.

