AwadFabric
Regular Visitor

Copy job to on-prem Oracle fails if source has more rows than batch size

Hello everybody,

 

We are transitioning our data pipelines from Synapse to Fabric and have encountered an error that wasn't present in Synapse.

 

The pipeline we implemented in Fabric uses a copy activity with ADLS Gen2 as the source and an on-prem Oracle database as the destination. If the source data contains more rows than the value specified in the "Write batch size" option, the copy activity fails with:

Failure happened on 'destination' side. ErrorCode=OracleTableNotExistError,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The specified table <OUR_TABLE_NAME> doesn't exist.,Source=Microsoft.DataTransfer.Connectors.OracleV2Core,'

However, the copy activity actually writes one batch of rows (the number specified as the batch size) to the Oracle table before raising the error.
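For context on the behavior being described: the "Write batch size" setting only controls how the connector chunks source rows before each bulk insert, so a 100-row source with a batch size of 15 should produce seven writes, not a failure after the first. A minimal sketch of that chunking (the helper name `split_into_batches` is hypothetical, not part of any Fabric API):

```python
def split_into_batches(rows, batch_size):
    """Yield successive chunks of at most batch_size rows,
    mirroring how a copy activity splits writes to the sink."""
    for start in range(0, len(rows), batch_size):
        yield rows[start:start + batch_size]

# 100 source rows with a batch size of 15, as in the test data below:
rows = list(range(100))
batches = list(split_into_batches(rows, 15))
print(len(batches))       # 7 batches: six of 15 rows and a final one of 10
```

The observed behavior (first batch lands, then `OracleTableNotExistError`) suggests the failure happens between batches, not during the initial write.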

 

Since we used the same table as the destination in our Synapse pipeline, we can assume the error is not on the database side.

 

Here are some screenshots with test data that was used to recreate this behavior. The test data has 100 rows of data:

 

1) Working pipeline when the batch size is larger than the number of source rows:

 

AwadFabric_0-1729780926177.png

AwadFabric_1-1729780968172.png

AwadFabric_2-1729781004234.png

 

2) Error when the batch size is less than the number of rows in the source dataset:

AwadFabric_3-1729781065584.png

 

AwadFabric_0-1729781226266.png

 

AwadFabric_5-1729781123514.png

 

 

Attempted solutions include:

  • Enabling staging in the copy activity
  • Limiting the number of max concurrent connections to 1
  • Setting "Degree of copy parallelism" to 1
  • Trying other source datasets and source data types (JSON, CSV)

All of these attempted solutions gave the same error.

 

As a temporary workaround, we set the batch size to its maximum. However, this batch size seems to be capped at around one million rows, so datasets larger than that threshold still fail with the same error.

 

Does anyone have insights on why this error occurs or how to fix it? Any help would be greatly appreciated!

Thank you!

5 REPLIES
IntegrateGuru
Advocate I

Have you tried setting a value for the 'write batch timeout' on your destination config?

I don't see a default value documented anywhere, so I'm not sure how it behaves after it has written one batch and is waiting to write the next. Maybe it isn't waiting at all and closes the connection before the entire transfer is complete?

Try setting it to something like 00:01:00 for small batch sizes, or however long you think it could reasonably take to write the number of rows in your batch.
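For reference, in the ADF-style copy activity definition the timeout sits alongside the batch size on the sink. A sketch of roughly where these properties live (the `OracleSink` type name and exact property names come from the Azure Data Factory Oracle connector docs; the definition Fabric generates may differ, especially since the error mentions `OracleV2Core`):

```json
"sink": {
    "type": "OracleSink",
    "writeBatchSize": 15,
    "writeBatchTimeout": "00:01:00",
    "maxConcurrentConnections": 1
}
```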

Hello @IntegrateGuru, thank you for your idea. I tried again using different values for the write batch timeout on the destination: 1 second, 10 seconds, 30 seconds, 1 minute, and 10 minutes, all with a batch size of 15 rows. However, in every run the original error occurred a few seconds after the pipeline started.

Anonymous
Not applicable

Hi @AwadFabric,

I'd suggest taking a look at the following document about Data Factory feature limitations to see whether any of them apply to your scenario:

Data Factory limitations overview - Microsoft Fabric | Microsoft Learn

Regards,

Xiaoxin Sheng

Hello @Anonymous, thank you for your suggestion. Unfortunately, I cannot see a solution to the problem in there. 

Anonymous
Not applicable

Hi @AwadFabric,

Perhaps you can take a look at the following link, which discusses a similar issue, to see if it helps with your scenario:

Copy activity successfully loads more rows than Write Batch Size in Azure pipeline - Microsoft Q&A

Regards,

Xiaoxin Sheng
