
amaaiia
Continued Contributor

I can preview data from an on-prem SQL Server stored procedure, but I get an error when trying to write it

I have some stored procedures in a SQL Server database. I ingest their data with a Data Pipeline COPY activity, using an on-premises data gateway to access the source. For most of the stored procedures I can ingest the data and write it into a Lakehouse table (the destination). However, one of them fails: I can't write its output into a Lakehouse table.

 

I can PREVIEW this problematic stored procedure; that is, in the SOURCE tab, I click Preview and can see the data at the source. However, when I save and run the Data Pipeline, I get an error:


 

ErrorCode=SourceColumnIsNotDefinedInDeltaMetadata,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Source column is not defined in delta metadata. Column name: Id, file name: .,Source=Microsoft.DataTransfer.ClientLibrary,'

 

The destination table IS NEW. It's not an old table. I'm creating the table from scratch, so it doesn't exist at the beginning.

Any ideas about why this is happening?

1 ACCEPTED SOLUTION
amaaiia
Continued Contributor

I think the problem is that my stored procedure returns one schema or another depending on which parameter value is passed. My understanding is that the Data Pipeline infers a schema without looking at the parameter actually being passed, and then receives the data with the other schema.

 

I believe this is what's happening, because I modified the stored procedure so that only one parameter value can be passed, and it works.

 

It's quite frustrating. The Data Pipeline should check which parameter is being sent in order to define the target schema...
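A minimal T-SQL sketch of this pattern (all procedure, table, and column names below are hypothetical, not from the poster's database): first a procedure whose result-set shape branches on a parameter, then one possible workaround that returns a single fixed shape for every parameter value.

```sql
-- Hypothetical procedure: the shape of the result set depends on @Mode,
-- so a client that infers the schema statically (without the runtime
-- parameter value) can predict one column list and receive the other.
CREATE OR ALTER PROCEDURE dbo.GetData
    @Mode int
AS
BEGIN
    IF @Mode = 1
        SELECT Id, Name FROM dbo.SourceA;            -- shape A: Id, Name
    ELSE
        SELECT Code, Name, Amount FROM dbo.SourceB;  -- shape B: Code, Name, Amount
END;
GO

-- One workaround: return the same fixed shape regardless of @Mode,
-- padding the missing column with NULL. Assumes Code is type-compatible
-- with Id; cast explicitly if it is not.
CREATE OR ALTER PROCEDURE dbo.GetDataFixed
    @Mode int
AS
BEGIN
    SELECT Id, Name, CAST(NULL AS decimal(18, 2)) AS Amount
    FROM dbo.SourceA
    WHERE @Mode = 1
    UNION ALL
    SELECT Code AS Id, Name, Amount
    FROM dbo.SourceB
    WHERE @Mode <> 1;
END;
```

With a single fixed shape, whatever schema the pipeline infers up front will match what the procedure returns at run time.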

View solution in original post

7 REPLIES
v-nuoc-msft
Community Support

Hi @amaaiia 

 

You seem to have a metadata mismatch problem.

 

Please try the following:

 

When creating the new Lakehouse table, explicitly define the schema to match the source data. This can help avoid any automatic schema inference issues.

 

In the COPY activity, ensure that you have correctly mapped the source columns to the destination columns. Sometimes automatic mapping doesn't work as expected, especially if there are discrepancies in column names or data types.

 

Consider using a Data Flow to handle the transformation and loading of data.

 

How to copy data using copy activity - Microsoft Fabric | Microsoft Learn

 

Regards,

Nono Chen

If this post helps, then please consider Accept it as the solution to help the other members find it more quickly.

 

 

amaaiia
Continued Contributor

Hi @v-nuoc-msft ,

I think Dataflows are not available for stored procedures.

With the Mapping option it works; however, I can't use it because the table ingestion runs inside a loop, so I can't specify the schema of each table I read. That is, I don't have a COPY activity per table; I have a loop with a single COPY activity inside that ingests all the tables through parameters...

 

Is this mismatch caused by the data at the source, or is it a problem on the Fabric side? I'd like to fix the issue if it turns out to be something about the source.

 

Hi @amaaiia 

 

The error message "Source column is not defined in delta metadata" indicates that the column Id in the source data does not match the expected schema of the target table.

 

Because you can preview the data, you can verify that the column names and data types match the expected schema.

 

Make sure there are no special characters or spaces in the column names that could cause problems.

 

Regards,

Nono Chen

If this post helps, then please consider Accept it as the solution to help the other members find it more quickly.

amaaiia
Continued Contributor

But I don't understand why Id at the source wouldn't match Id at the target: the target table doesn't exist yet, so there is no target schema for it to mismatch.

Hi @amaaiia 

 

Even though the target table doesn’t exist initially, the Data Pipeline might be inferring the schema based on the first few rows of data. 
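For context on why static inference can go wrong here: SQL Server exposes the predicted shape of a batch's first result set through the system procedure `sys.sp_describe_first_result_set`, and clients that infer schemas typically rely on this kind of static analysis rather than on executing the procedure with the runtime parameter value. You can inspect the prediction yourself (the procedure name below is hypothetical):

```sql
-- Ask SQL Server which columns it predicts for the first result set.
-- For a procedure that branches on a parameter, this static analysis may
-- return one branch's shape, or report that the metadata could not be
-- determined, either of which can differ from what is actually returned
-- at run time.
EXEC sys.sp_describe_first_result_set
    @tsql = N'EXEC dbo.GetData @Mode = 1';
-- Output: one row per predicted column (name, system_type_name, is_nullable, ...)
```

If this call errors or returns a shape different from what the procedure actually emits for your parameters, that is consistent with the schema-inference mismatch described in this thread.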

amaaiia
Continued Contributor

I think the problem is that my stored procedure returns one schema or another depending on which parameter value is passed. My understanding is that the Data Pipeline infers a schema without looking at the parameter actually being passed, and then receives the data with the other schema.

 

I believe this is what's happening, because I modified the stored procedure so that only one parameter value can be passed, and it works.

 

It's quite frustrating. The Data Pipeline should check which parameter is being sent in order to define the target schema...

Hi @amaaiia 

 

That's a good idea.

 

Can you tell me if your problem is solved? If so, please accept your reply as the solution.

 

Regards,

Nono Chen
