
mhafizzul
Regular Visitor

Error code 2200: BigQuery timeout after 5 min

Hi,

 

I have a pipeline, and in one of its copy activities I get a timeout error, as shown below.

 

 

Failure happened on 'Source' side. 'Type=Microsoft.DI.Connector.GoogleBigQuery.ExceptionUtils.GoogleBigQueryConnectorException,Message=Query timed out, ErrorCode: InternalError,Source=Microsoft.DI.Connector.GoogleBigQuery,''Type=System.TimeoutException,Message=Query timed out,Source=Microsoft.DI.Connector.GoogleBigQuery,'

 

 

The problem is, it always times out after 5 minutes. There are several tables that I know need more than 5 minutes to return their data. When I check the timeout setting for the activity, it is already at the default of 12 hours (refer to the screenshot below).

 

(screenshot: mhafizzul_0-1724122724916.png)

 

I also set a 10-minute timeout in both "Connection timeout duration" and "Command timeout duration" in the connection, but it still times out after 5 minutes.

 

(screenshot: mhafizzul_1-1724122833467.png)

 

After doing all of the above, my copy data activity still times out after 5 minutes (refer to the screenshot below). Is there any other place where I should change the timeout setting?

(screenshot: mhafizzul_2-1724122980178.png)
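For anyone checking the same setting: the activity-level timeout mentioned above lives in the copy activity's JSON definition under "policy". A sketch of what that section typically looks like (standard ADF/Fabric activity policy properties; the 12-hour value matches the default shown in the screenshot):

```json
"policy": {
    "timeout": "0.12:00:00",
    "retry": 0,
    "retryIntervalInSeconds": 30,
    "secureOutput": false,
    "secureInput": false
}
```

The "timeout" value uses the d.hh:mm:ss format, so "0.12:00:00" is 12 hours. Note this is the activity's overall timeout, which is separate from any timeout enforced by the source connector itself.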

 

Thanks a lot!!!

 

3 REPLIES
shedstar
New Member

Did you ever resolve this? I've got exactly the same issue on a copy activity in Synapse between BigQuery and ADLS Gen2. There's clearly a 5-minute limit somewhere, but I can't for the life of me find it.

mhafizzul
Regular Visitor

Hi @Anonymous ,

 

Thanks for the answer. We managed to optimize the query, so we no longer hit the timeout. However, just for knowledge, is there any way to increase the BQ timeout setting? (It currently seems to time out whenever the BQ query does not complete within 5 minutes.)
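For context: BigQuery's own jobs API does accept a job-level timeout, via the jobTimeoutMs field in the job configuration (a string of milliseconds), though I don't know whether the Fabric/ADF connector exposes it. A jobs.insert REST request body would look roughly like this (the "SELECT ..." is just a placeholder for the actual query):

```json
{
  "configuration": {
    "query": {
      "query": "SELECT ...",
      "useLegacySql": false
    },
    "jobTimeoutMs": "600000"
  }
}
```

Here "600000" ms is 10 minutes; if the job is still running when that elapses, BigQuery attempts to cancel it.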

 

Thanks.

Anonymous
Not applicable

Hi, @mhafizzul 

 

See if the following reasons fit your situation:

  • When you create an Excel dataset and import the schema from the connection/store, preview data, or list or refresh worksheets, you may hit a timeout error if the Excel file is large.

  • When you use a copy activity to copy data from a large Excel file (>= 100 MB) into another data store, you may experience slow performance or an OOM issue.

Resolution:

  • For importing the schema, you can generate a smaller sample file (a subset of the original file) and choose "import schema from sample file" instead of "import schema from connection/store".

  • For listing worksheets, in the worksheet dropdown you can click "Edit" and input the sheet name/index instead.

  • To copy a large Excel file (>100 MB) into another store, you can use the Data Flow Excel source, which supports streaming read and performs better.

You can learn more about it in this document: Troubleshoot copy activity performance - Azure Data Factory & Azure Synapse | Microsoft Learn

 

Best Regards,

Community Support Team _Charlotte

If this post helps, please consider accepting it as the solution to help other members find it more quickly.
