Element115
Super User

QUESTION::COPY DATA::UPSERT FAIL

The source is an on-prem SQL Server, and the destination is a lakehouse table.  I use a Copy data activity pointing to the lakehouse SQL endpoint in order to get access to the Upsert option.

 

Can anyone explain what this error message means?  It is emitted by a Copy data activity that issues a simple SELECT with a WHERE clause against an on-prem SQL Server:

 

 

ErrorCode=SqlOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=A database operation failed. Please search error to get more details.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Data.SqlClient.SqlException,Message=Global temporary tables and transient user tables in TEMPDB are not supported in this edition of SQL Server.,Source=.Net SqlClient Data Provider,SqlErrorNumber=24581,Class=16,ErrorCode=-2146232060,State=2,Errors=[{Class=16,Number=24581,State=2,Message=Global temporary tables and transient user tables in TEMPDB are not supported in this edition of SQL Server.,},],'

 

 

In the Destination tab of the Copy activity I use these settings, so that the Copy data activity performs an upsert on the rows in the lakehouse that have changed:

 

[Screenshot 2025-07-25 150941.jpg: Destination tab settings]

 

The sink is AzureSqlDatabase.

 

The source is SQLServer: Microsoft SQL Server 2017 (RTM-CU31-GDR) (KB5021126) - 14.0.3460.9 (X64)
Jan 25 2023 08:42:43
Copyright (C) 2017 Microsoft Corporation
Standard Edition (64-bit) on Windows Server 2016 Standard 10.0 <X64> (Build 14393: ) (Hypervisor)
EngineEdition: 2

11 REPLIES
supreethv
Microsoft Employee

You will not be able to use the SQL analytics endpoint of the lakehouse as the Copy activity destination, as that endpoint is read-only. Please refer to the doc: T-SQL surface area - Microsoft Fabric | Microsoft Learn

I always get confused between the SQL endpoint and the lakehouse in the dropdown list. I assumed that if a SQL endpoint is provided, then under the hood all SQL operations are emulated on delta tables, so that the user can manipulate delta tables like traditional DB tables and expect the same result as when working with a SQL DB (both for DDL and DML).

 

That does not seem to be the case, which is a pity. Especially since there is a Python API for Databricks and DeltaTable, one would think that a translation layer could be implemented under the hood to take any SQL DDL or DML as input and translate it to the equivalent Databricks/DeltaTable instruction set.
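For what it's worth, the upsert the GUI promises boils down to plain merge semantics: update rows whose key already exists in the target, insert rows whose key does not. A minimal pure-Python sketch of those semantics (the key and row shapes are made up for illustration, not the actual schema):

```python
# Sketch of upsert (merge) semantics on an in-memory "table":
# overwrite rows whose key exists, insert rows whose key is new.
# Keys and row shapes here are illustrative only.

def upsert(target: dict, incoming: list, key: str) -> dict:
    """target maps key -> row; incoming is a list of source rows."""
    for row in incoming:
        target[row[key]] = row  # overwrite if present, insert if new
    return target

target = {1: {"id": 1, "val": "old"}, 2: {"id": 2, "val": "keep"}}
incoming = [{"id": 1, "val": "new"}, {"id": 3, "val": "added"}]
result = upsert(target, incoming, "id")
```

This is exactly what the Copy activity's Upsert option advertises for the sink, just expressed without any SQL engine underneath.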

Amruthafragomen
New Member

I'm facing the same issue. I tried both a lakehouse and a warehouse, and both returned the same error. I suspect it's an issue with Azure SQL DB on the warehouse. Will enabling staging help?

I finally got a response.  Even though Upsert is a functionality that's exposed in the GUI, it's a UI bug: Upsert is not available with a lakehouse when using the SQL endpoint.  So you have to build it manually, using a Copy data activity and a SQL SELECT statement with a WHERE clause that filters for only the newest rows.
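The manual pattern described here, keeping a high-water mark and selecting only rows at or after it, can be sketched in plain Python. The `DATE` column mirrors the query later in this thread; everything else is illustrative:

```python
from datetime import date

# Sketch of watermark-based incremental extraction: remember the
# highest DATE already loaded, and pull only source rows at or
# after it. Column and variable names are illustrative.

def rows_since(rows: list, watermark: date) -> list:
    """Equivalent of SELECT * FROM v WHERE [DATE] >= @watermark."""
    return [r for r in rows if r["DATE"] >= watermark]

source = [
    {"DATE": date(2024, 7, 1), "val": 10},
    {"DATE": date(2024, 7, 2), "val": 20},
    {"DATE": date(2024, 7, 3), "val": 30},
]
watermark = date(2024, 7, 2)
new_rows = rows_since(source, watermark)

# The next run's watermark is the max DATE among the rows just loaded.
next_watermark = max(r["DATE"] for r in new_rows)
```

In the pipeline this corresponds to storing the watermark in a variable (like the `latest_date_...` variable used later in this thread) and substituting it into the source query's WHERE clause.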

No.  I have had a support ticket open for a month now.  The product team is still debugging and hasn't come back with a fix yet.

Element115
Super User

Ticket #2407230040010740

frithjof_v
Super User

My guess is that the error message you are seeing ("edition of SQL Server") is talking about the destination (Lakehouse SQL analytics endpoint) and not about the source (on-prem SQL Server).

 

It seems you are trying to write to the Lakehouse SQL Analytics Endpoint. Isn't the Lakehouse SQL Analytics Endpoint read-only? 

 

What happens if you try to use a Data Warehouse table as destination instead of a Lakehouse SQL Analytics Endpoint table?

 

It seems to me you are trying to use the Azure SQL Database connector for the destination (because the Azure SQL Database connector has the Upsert option), but you are providing the connection string to a Lakehouse SQL Analytics Endpoint as the destination. Is this what you are wanting to do (i.e. upsert data into a Lakehouse table)? 

 

I am not sure if that is supposed to work in Fabric Data Pipeline
(same for using a Data Warehouse connection string with Azure SQL Database connector, I am not sure if that is supposed to work also).

 

It could be interesting to experiment and see what happens, although I don't know if it is supported.

 

 

I guess if you want to do an upsert into a Fabric Lakehouse table, you can use a notebook (as shown in this exercise: https://microsoftlearning.github.io/mslearn-fabric/Instructions/Labs/03b-medallion-lakehouse.html)
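A notebook-based upsert could look roughly like this. This is a sketch, assuming the Delta Lake merge API available in a Fabric Spark notebook; `Target`, `source_df`, and the join key `id` are placeholders, not names from this thread:

```python
# Sketch: upsert into a Lakehouse delta table from a Fabric notebook.
# Requires a Spark runtime; the `spark` session is provided by the
# notebook. Table and column names below are placeholders.
from delta.tables import DeltaTable

target = DeltaTable.forName(spark, "Target")

(target.alias("t")
    .merge(source_df.alias("s"), "t.id = s.id")  # source_df: new/changed rows
    .whenMatchedUpdateAll()     # update rows whose key already exists
    .whenNotMatchedInsertAll()  # insert rows that are new
    .execute())
```

Unlike the Copy activity's Upsert option, this works directly against the delta table rather than going through the read-only SQL analytics endpoint.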

That's a good question.  Since the SQL endpoint is available in the dropdown list for Connection in the Destination tab, I don't see any reason not to use it.  Now, if it is read-only even from a Copy data activity, then the dev team responsible for the product should remove the SQL endpoint as a Connection choice from the list, because as long as it is there, the implicit message is that we can use it for Insert or Upsert (according to the GUI)!

 

It seems to me you are trying to use the Azure SQL Database connector for the destination (because the Azure SQL Database connector has the Upsert option), but you are providing the connection string to a Lakehouse SQL Analytics Endpoint as the destination. Is this what you are wanting to do (i.e. upsert data into a Lakehouse table)? 

 Correct, yes.

 

I am not sure if that is supposed to work in Fabric Data Pipeline
(same for using a Data Warehouse connection string with Azure SQL Database connector, I am not sure if that is supposed to work also).

Well, Microsoft made it available, therefore I assume it can be used.  Unless, of course, it's a bug. 😁

Anonymous
Not applicable

Hi @Element115 ,

Based on the information you have provided, you are seeing the following error message:

Global temporary tables and transient user tables in TEMPDB are not supported in this edition of SQL Server.


You can follow the steps below to test and troubleshoot the problem:

  1. First, verify the edition of SQL Server you are using, as some editions have limitations on certain features.
  2. If the query is using global temporary tables, consider modifying it to instead use local temporary tables (prefixed with #), which are supported by all versions of SQL Server.

If there is still a problem, could you please provide some more information, such as what you did before the error occurred, and whether the problem has occurred before?

 

Best Regards,

Ada Wang

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

 

Ticket #2407230040010740

1. Where is the documentation page that mentions which SQL Server version is not compatible with the pipeline Copy data Upsert? 

 

This is the version of the on-prem server:


Microsoft SQL Server 2017 (RTM-CU31-GDR) (KB5021126) - 14.0.3460.9 (X64)
Jan 25 2023 08:42:43
Copyright (C) 2017 Microsoft Corporation
Standard Edition (64-bit) on Windows Server 2016 Standard 10.0 <X64> (Build 14393: ) (Hypervisor)
EngineEdition: 2

 

2. The query in the Source tab of the Copy data activity DOES NOT explicitly use any temp tables, global or otherwise; here is the query:

SELECT *
FROM 
	vIVDS_TX_10CM_HEIGHT_BINS
WHERE 
	[DATE] >= '@{variables('latest_date_ivds_tx_10cm_height_bins')}'

 

So in your opinion, what causes this issue?
