amaaiia
Super User

Overwrite of SQL Server table into Lakehouse fails

Hi,

I have a Data Pipeline in production that copies SQL Server tables into Lakehouse tables with a Copy activity. The pipeline overwrites each table on every run. All the tables work fine except one.
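For context, the Overwrite table action on the Copy activity replaces the whole Lakehouse Delta table on each run. A rough PySpark equivalent, run from a Fabric notebook, is sketched below; the server, database, credentials and table names are placeholders, not the actual pipeline settings.

```python
# Rough PySpark equivalent of the Copy activity's "Overwrite" behaviour,
# run from a Fabric notebook. All connection details and names below are
# placeholders, not the actual pipeline settings.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Read the source table from SQL Server over JDBC.
src = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://<server>:1433;databaseName=<database>")
    .option("dbtable", "dbo.SourceTable")   # placeholder source table
    .option("user", "<user>")
    .option("password", "<password>")
    .load()
)

# Replace the Lakehouse Delta table (data and schema) on every run.
(
    src.write.format("delta")
    .mode("overwrite")
    .option("overwriteSchema", "true")
    .saveAsTable("SourceTable")             # placeholder Lakehouse table
)
```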

 

1. I count the rows with Preview data:

[screenshot: amaaiia_0-1741248597130.png]

2. I configure the source and preview the data correctly:

[screenshot: amaaiia_1-1741248647979.png]

3. I configure the destination:

[screenshot: amaaiia_2-1741248668863.png]

4. I run the pipeline and it fails:

[screenshot: amaaiia_3-1741248704787.png]

 

I've done some testing:

  • If I run a slightly different query that returns the same number of records, it works:

    [screenshot: amaaiia_4-1741249144561.png]
    [screenshot: amaaiia_5-1741249174409.png]

  • This query also fails:

    [screenshot: amaaiia_6-1741249235353.png]

  • This query also fails:

    [screenshot: amaaiia_7-1741249273785.png]

 

If I preview the data it works, but when I run the pipeline it fails, always with the same error. The only way I've gotten it to work is by filtering the query on a single field value. I don't think it's a column-name issue, because the query that works selects all the columns.

 

1 ACCEPTED SOLUTION

In the end, I asked the SQL Server admin to recreate the same source table in a different way. We don't know what caused the failure, but after changing how the table was created, it now works.


7 REPLIES
v-karpurapud
Community Support

Hi @amaaiia 

We are following up to see if your query has been resolved. Should you have identified a solution, we kindly request you to share it with the community to assist others facing similar issues.

If our response was helpful, please mark it as the accepted solution and give kudos, as this helps other members of the community.

 

Thank you

In the end, I asked the SQL Server admin to recreate the same source table in a different way. We don't know what caused the failure, but after changing how the table was created, it now works.

ahmetyilmaz
Advocate II

Try doing your own mapping manually with + New mapping. You already have access to column names and data types.

I ingest some tables inside a loop. I can't specify the schema because it's different for each table. I use the same Copy activity for all the tables, setting table_name as a variable in the Copy activity.

Can you try what I said just to test the problem? At least we can determine which column is the problem.
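If the mapping UI won't cooperate, the same column list can also be pulled straight from SQL Server. A minimal Python/pyodbc sketch querying INFORMATION_SCHEMA.COLUMNS is shown below; the connection string, schema and table name are placeholders.

```python
# Pull the source table's column names and data types directly from SQL
# Server, to spot a column the Copy activity might choke on (odd name,
# unusual type, etc.). Connection string, schema and table are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<server>;DATABASE=<database>;UID=<user>;PWD=<password>"
)

query = """
    SELECT COLUMN_NAME, DATA_TYPE, IS_NULLABLE, CHARACTER_MAXIMUM_LENGTH
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_SCHEMA = ? AND TABLE_NAME = ?
    ORDER BY ORDINAL_POSITION
"""

for row in conn.cursor().execute(query, ("dbo", "SourceTable")):
    print(row.COLUMN_NAME, row.DATA_TYPE, row.IS_NULLABLE,
          row.CHARACTER_MAXIMUM_LENGTH)
```

Any column with an empty or unusual name, or a type that doesn't map cleanly to Delta, would be a likely candidate to check first.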

ahmetyilmaz
Advocate II

Hello, have you tried selecting the Import schemas option in the Mapping section?
If there is an empty column, maybe you can see it there.

[screenshot: ahmetyilmaz_0-1741250710362.png]


 

I'm unable to import the schema. When I click Import schemas it says Loading, but it never imports anything and shows no error message.
