stiggrr87
Helper I

Working with copy job and creating duplicates

Hi!

 

I am trying to use a Copy job with the overwrite setting, but when the data sinks to the lakehouse it is still being duplicated, as if the append setting were in effect.

What should I do? Should I drop the Copy job and use Dataflow Gen2 instead?

1 ACCEPTED SOLUTION
v-aatheeque
Community Support

Hi @stiggrr87,
Thanks for raising this query in the Microsoft Fabric Community Forum.

The overwrite option in a Copy job with a Lakehouse sink may not fully truncate the existing Delta table and can behave like append in some scenarios.

 

  • As a workaround, you can use a pre-copy script (TRUNCATE/DELETE) to clear the table before ingestion, which ensures no duplication. Alternatively, you can load into a staging table and use INSERT OVERWRITE logic.
  • Dataflow Gen2 is only recommended if you need transformation logic; otherwise, a Copy job with an explicit truncate works best.
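For a warehouse or SQL database sink, the pre-copy script mentioned above can be a short T-SQL fragment along these lines (a sketch; the table name dbo.SalesStaging is a placeholder, not from the thread — and note that a Lakehouse's SQL analytics endpoint is read-only, so for a Lakehouse table the clearing step has to run from a Spark notebook instead):

```sql
-- Hypothetical pre-copy script: clear the target before the Copy job loads it.
-- TRUNCATE is fast but requires ALTER permission on the table;
-- DELETE works where TRUNCATE is not permitted.
TRUNCATE TABLE dbo.SalesStaging;
-- or, with the same end state:
-- DELETE FROM dbo.SalesStaging;
```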

Reference: What is Copy job in Data Factory - Microsoft Fabric | Microsoft Learn

 

Hope this helps!

Thank You.


2 REPLIES
tayloramy
Super User

Hi @stiggrr87,

Is this a new copy job or an older one? 

Previously only appending records was supported, but truncating is now supported as well, so if you create a new copy job it should be able to truncate the target table.

https://learn.microsoft.com/en-us/fabric/data-factory/what-is-copy-job#automatic-table-creation-and-...

 

What I've done as a quick workaround is to run the copy job from a pipeline and, before the copy job executes, run either a notebook (for a lakehouse) or a script (SQL) to truncate the table first.
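For the lakehouse case, that notebook step can be a single Spark SQL cell that the pipeline runs before the Copy activity (a sketch; MyLakehouse.MyTable is a placeholder name):

```sql
-- Hypothetical notebook cell, executed by the pipeline before the Copy job.
-- Empties the Delta table so the subsequent load cannot duplicate rows.
TRUNCATE TABLE MyLakehouse.MyTable;
-- If TRUNCATE is unavailable on your runtime, DELETE gives the same end state:
-- DELETE FROM MyLakehouse.MyTable;
```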










