Hi,
In ADF, one of our data flows failed partway through execution due to a transient error. The data flow performs insert, update, and delete operations for an SCD2 scenario. To handle transient failures we enabled the retry option, but by the second attempt some data had already been loaded into the sink table by the first attempt, so the retry produces duplicate records on the insert path. Is there any option to restart the job from the point of failure on the retry?
I have not implemented CDC at the source, which seems to be why I don't see any checkpoint option in the sink settings to avoid duplicate data.
Please suggest.
Thanks in advance.
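(Illustration only, not the accepted answer: one generic way to tolerate a retry, independent of any ADF-specific checkpointing, is to make the insert path idempotent so the sink skips rows that the failed first attempt already committed; inside a mapping data flow this is typically done with an Exists check or a key-based upsert. The sketch below shows the same idea outside ADF, assuming a SQL Server sink reached via pyodbc; the server, table, and column names are all hypothetical.)

```python
# Illustrative sketch: make inserts idempotent so a retried run does not
# re-insert rows the failed first attempt already committed.
# Server, table, and column names are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=mydb;"
    "UID=myuser;PWD=mypassword"
)

def insert_if_missing(rows):
    """Insert SCD2 rows only when no current row with the same business key
    and effective date already exists in the sink table."""
    sql = """
        INSERT INTO dim_customer (customer_id, attribute_value, effective_date, is_current)
        SELECT ?, ?, ?, 1
        WHERE NOT EXISTS (
            SELECT 1 FROM dim_customer
            WHERE customer_id = ? AND effective_date = ? AND is_current = 1
        )
    """
    cur = conn.cursor()
    for r in rows:
        cur.execute(
            sql,
            r["customer_id"], r["attribute_value"], r["effective_date"],
            r["customer_id"], r["effective_date"],
        )
    conn.commit()

# Re-running with the same rows inserts nothing the second time.
insert_if_missing([
    {"customer_id": 42, "attribute_value": "Gold", "effective_date": "2024-01-01"},
])
```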
Hi @Raj_030 ,
Thanks for using Fabric Community.
At this time we handle only Fabric-related issues. Since this question concerns Azure Data Factory, please open a new thread here - Ask a question - Microsoft Q&A
You will get better help by posting on the platform mentioned above.
Hope this is helpful.