Hi,
I have a pipeline, and in one of its copy activities I get a timeout error, as shown below.
Failure happened on 'Source' side. 'Type=Microsoft.DI.Connector.GoogleBigQuery.ExceptionUtils.GoogleBigQueryConnectorException,Message=Query timed out, ErrorCode: InternalError,Source=Microsoft.DI.Connector.GoogleBigQuery,''Type=System.TimeoutException,Message=Query timed out,Source=Microsoft.DI.Connector.GoogleBigQuery,'
The problem is, it always times out after 5 minutes. There are several tables that I know need more than 5 minutes to return their data. When I checked the timeout setting for the activity, it was already at its default of 12 hours (refer to the screenshot below).
I also set a 10-minute timeout in both "Connection timeout duration" and "Command timeout duration" on the connection, but it still times out after 5 minutes.
After doing all of the above, my copy data activity still times out after 5 minutes (refer to the screenshot below). Is there anywhere else I should change the timeout setting?
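As a way to reason about the symptom: when several timeouts are layered, the shortest one wins, so a 5-minute limit buried in the connector or driver will fire long before the activity's 12-hour timeout is ever reached. A minimal generic Python sketch of that behaviour (purely illustrative; the constants and function names are made up and this is not the Fabric/ADF or BigQuery API):

```python
# Illustrative sketch of layered timeouts: the innermost deadline fires
# first, no matter how generous the outer one is. The small "driver"
# timeout stands in for the connector's 5-minute limit; the large
# "activity" timeout stands in for the activity's 12-hour setting.
import concurrent.futures
import time

DRIVER_TIMEOUT_S = 0.2     # stand-in for the connector's 5-minute limit
ACTIVITY_TIMEOUT_S = 60.0  # stand-in for the activity's 12-hour limit

def slow_query():
    time.sleep(1.0)        # a query that outlives the driver timeout
    return "rows"

def run_with_layered_timeouts():
    deadline = time.monotonic() + ACTIVITY_TIMEOUT_S
    with concurrent.futures.ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(slow_query)
        try:
            # The wait is bounded by whichever deadline is nearer --
            # here, always the driver-level one.
            remaining = deadline - time.monotonic()
            return future.result(timeout=min(DRIVER_TIMEOUT_S, remaining))
        except concurrent.futures.TimeoutError:
            return "driver timeout"

print(run_with_layered_timeouts())  # prints "driver timeout"
```

This is why raising the activity-level timeout alone has no visible effect: the limit that actually trips is a lower-level one.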
Thanks a lot!!!
Did you ever resolve this? I've got exactly the same issue on a copy activity in Synapse between BigQuery and ADLS Gen2. There's clearly a 5-minute limit... somewhere. But I can't for the life of me find it.
Hi @Anonymous ,
Thanks for the answer. Somehow we managed to optimize the query so we no longer hit the timeout. However, just for knowledge, is there any way to increase the BQ timeout setting? (Right now it seems to time out when the BQ query does not complete within 5 minutes.)
Thanks.
Hi, @mhafizzul
See if the following reasons fit your situation:
When you create an Excel dataset and import its schema from the connection/store, preview data, or list or refresh worksheets, you may hit a timeout error if the Excel file is large.
When you use a copy activity to copy data from a large Excel file (>= 100 MB) into another data store, you may experience slow performance or an out-of-memory (OOM) issue.
Resolution:
For importing the schema, you can generate a smaller sample file (a subset of the original file) and choose "Import schema from sample file" instead of "Import schema from connection/store".
For listing worksheets, you can click "Edit" in the worksheet dropdown and enter the sheet name/index instead.
To copy a large Excel file (>= 100 MB) into another store, you can use the Data Flow Excel source, which supports streaming read and performs better.
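The streaming-read advice above comes down to a general principle: process the file in fixed-size chunks instead of materialising all of it in memory at once. A minimal generic Python sketch of the idea (illustrative only; this is plain stdlib code, not the Data Flow engine, and the buffer sizes are arbitrary):

```python
# Streaming copy: peak memory stays around one chunk_size regardless of
# how large the source file is, which is what avoids the OOM issue.
import io

def copy_streaming(src, dst, chunk_size=64 * 1024):
    """Copy src to dst chunk by chunk; return total bytes copied."""
    total = 0
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            break
        dst.write(chunk)   # only chunk_size bytes held in memory at once
        total += len(chunk)
    return total

source = io.BytesIO(b"x" * 300_000)  # stand-in for a large source file
sink = io.BytesIO()
print(copy_streaming(source, sink))  # prints 300000
```

A whole-file read would instead allocate a buffer as large as the file itself, which is where a 100 MB+ Excel file starts to hurt.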
You can learn more about it in this document: Troubleshoot copy activity performance - Azure Data Factory & Azure Synapse | Microsoft Learn
Best Regards,
Community Support Team _Charlotte
If this post helps, please consider accepting it as the solution to help other members find it more quickly.