Error: Specified Cast is not valid / Data Pipeline Copy Activity
Source: SQL endpoint (the SQL endpoint of another Lakehouse)
Destination: Lakehouse
A key column and a Modified_datetime column were created in order to use the Upsert option of the data pipeline Copy Activity.
The Append and Overwrite operations work normally, but a 'Specified cast is not valid' error occurs when Upsert is performed.
I configured the column mapping, and the same error occurs even when I change the data types in different ways.
I can't find any details about the error in Fabric.
Is this feature not yet stable because it's in preview?
[Error original]
ErrorCode=FailedToUpsertDataIntoDeltaTable,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Hit an error when upsert data to table in Lakehouse. Error message: Specified cast is not valid.,Source=Microsoft.DataTransfer.Connectors.LakehouseTableConnector,''Type=System.InvalidCastException,Message=Specified cast is not valid.,Source=Microsoft.DataTransfer.Connectors.LakehouseTableConnector,'
Hi @RRASUK9900 , Thank you for reaching out to the Microsoft Community Forum.
This error usually means the source and Lakehouse table schemas don’t fully match, most often in key_column or Modified_datetime. Even if the types look correct, hidden issues like nulls or invalid formats (especially in dates) can break Fabric’s internal casting during Upsert.
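To illustrate how values that "look correct" can still break a cast, here is a minimal plain-Python sketch. The column names key_column and Modified_datetime come from this thread; the sample rows and the helper function are made up for illustration, and a real check against the SQL endpoint would use a query rather than Python, but the failure pattern (empty strings and off-format dates passing visual inspection yet failing the cast) is the same:

```python
from datetime import datetime

# Hypothetical rows from the source. The Modified_datetime values look
# plausible at a glance, but only some parse cleanly; this mirrors how
# hidden format issues can surface only when the pipeline casts during Upsert.
rows = [
    {"key_column": 1, "Modified_datetime": "2025-06-11 10:00:00"},
    {"key_column": 2, "Modified_datetime": ""},           # empty string, not NULL
    {"key_column": 3, "Modified_datetime": "11/06/2025"}, # unexpected format
]

def try_cast(value):
    """Return a datetime, or None when the cast would fail."""
    try:
        return datetime.strptime(value, "%Y-%m-%d %H:%M:%S")
    except (TypeError, ValueError):
        return None

# Keys of the rows whose datetime would break the cast.
bad_keys = [r["key_column"] for r in rows if try_cast(r["Modified_datetime"]) is None]
print(bad_keys)  # → [2, 3]
```

Running a check like this (as a SQL query in your case) against the key and datetime columns before the copy runs will usually surface the offending rows.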
Since the Upsert feature in Fabric is still in preview, its error messages are vague, and unlike Append or Overwrite it uses merge logic that is more fragile. That's why those modes work while Upsert fails: the problem likely occurs during internal comparisons or casts in the merge.
The best thing to do right now is to avoid Upsert in the Copy Activity. Instead, load your data into a staging Lakehouse table with Append mode, then run a manual MERGE in a Spark Notebook. This gives you full control, better visibility, and clearer errors. Just be sure to cast key columns explicitly in your source query to match the destination types, and test with a small dataset first to catch edge cases early.
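The upsert semantics that the manual MERGE implements can be sketched in plain Python. In the notebook you would express this as a Delta `MERGE INTO` statement over the staging table; here dictionaries stand in for the tables, the column key_column comes from this thread, and the other column names and values are made up for illustration:

```python
# Target table keyed by key_column; "value" is a hypothetical payload column.
target = {
    1: {"key_column": 1, "value": "old"},
    2: {"key_column": 2, "value": "old"},
}

# Staging rows loaded via Append mode.
staging = [
    {"key_column": 2, "value": "new"},  # matched key     -> update
    {"key_column": 3, "value": "new"},  # unmatched key   -> insert
]

def merge(target, staging):
    """WHEN MATCHED THEN UPDATE, WHEN NOT MATCHED THEN INSERT."""
    for row in staging:
        target[row["key_column"]] = row
    return target

merged = merge(target, staging)
print(sorted(merged))  # → [1, 2, 3]
```

Because you write the MERGE yourself, any type mismatch between the staging and target tables surfaces as a readable Spark error at that step, instead of the opaque cast failure inside the Copy Activity.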
If this helped solve the issue, please consider marking it "Accept as Solution" so others with similar queries may find it more easily. If not, please share the details; I'm always happy to help.
Thank you.