Hello Team,
We are using Snowflake as our data warehousing tool, with a report that queries a 51-million-record table. While the report refreshes successfully in Power BI Desktop, it consistently fails in Power BI Service with the following error:
Data source error: We're sorry, an error occurred during evaluation.;[Snowflake] arrow/ipc: could not read message body: read tcp TCP_ADDRESSS->TCP_ADDRESSS: wsarecv: An existing connection was forcibly closed by the remote host. [Snowflake] arrow/ipc: could not read message body: read tcp TCP_ADDRESSS->TCP_ADDRESSS: wsarecv: An existing connection was forcibly closed by the remote host. [Snowflake] arrow/ipc: could not read message body: read tcp TCP_ADDRESSS->TCP_ADDRESSS: wsarecv: An existing connection was forcibly closed by the remote host.. The exception was raised by the IDataReader interface. Please review the error message and provider documentation for further information and corrective action. Table: TABLE_NAME.
Current Observations:
The Snowflake gateway connection timeout is set to a sufficiently high value.
Despite this, the error persists.
Question:
What could be causing this connection interruption, and how can we resolve it?
Next Steps Requested:
Are there additional Snowflake-side settings (e.g., session timeouts, network policies) that could override the gateway timeout?
Could this be related to Power BI Service limitations (e.g., query execution timeouts, data volume thresholds)?
Are there known issues with large dataset refreshes in Power BI Service when using Snowflake’s Arrow format?
Thank you for your support in investigating this issue.
Hello,
We are suffering from the same problem:
Error Message:
[Snowflake] arrow/ipc: could not read message body: read tcp : wsarecv: An existing connection was forcibly closed by the remote host.
Stack Trace:
Microsoft.Mashup.Host.Document.SerializedException
It occurs when we use a custom query in Power BI against the source table. However, if we fetch all the fields directly, the problem disappears, without any change to session parameters or network configuration. Do you have any idea why this could be happening? Thanks in advance.
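For reference, here is a rough sketch of the two patterns in M (the server, warehouse, database, schema, table, and column names are placeholders, not our real ones):

```
// Query 1: custom (native) query – this is the pattern that fails for us.
// Value.NativeQuery with EnableFolding pushes the SQL to Snowflake, but the
// full result still streams back over a single long-lived connection, which
// may be more sensitive to the timeouts discussed in this thread.
let
    Source   = Snowflake.Databases("myaccount.snowflakecomputing.com", "COMPUTE_WH"),
    Database = Source{[Name = "MY_DB", Kind = "Database"]}[Data],
    Custom   = Value.NativeQuery(
                   Database,
                   "SELECT COL_A, COL_B FROM MY_SCHEMA.TABLE_NAME",
                   null,
                   [EnableFolding = true]
               )
in
    Custom

// Query 2: plain navigation (fetch all fields) – this refreshes fine for us.
let
    Source   = Snowflake.Databases("myaccount.snowflakecomputing.com", "COMPUTE_WH"),
    Database = Source{[Name = "MY_DB", Kind = "Database"]}[Data],
    Schema   = Database{[Name = "MY_SCHEMA", Kind = "Schema"]}[Data],
    BigTable = Schema{[Name = "TABLE_NAME", Kind = "Table"]}[Data]
in
    BigTable
```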
Hi @RajeshKannanS ,
Just wanted to check if you had the opportunity to review the suggestions provided?
If the response has addressed your query, please Accept it as a solution so other members can easily find it.
Thank You
Hi @RajeshKannanS ,
Just wanted to check if you had the opportunity to review the suggestions provided?
If the response has addressed your query, please Accept it as a solution and give a 'Kudos' so other members can easily find it.
Thank You
The error you're encountering, where Power BI Service fails to refresh a large table from Snowflake while Power BI Desktop succeeds, typically points to a combination of network constraints, service timeouts, and possibly Snowflake-side configuration. While your Snowflake gateway timeout is set sufficiently high, Power BI Service enforces its own timeout and resource limits, which can differ from Power BI Desktop.

In this case, the error message suggests that the connection was forcibly closed during transmission of a large result set, possibly due to an interruption in the Arrow IPC stream used for data exchange. This could be caused by Power BI Service's timeout limits (which generally cap at 2 hours for Premium and shorter for shared capacity), the sheer size of the dataset being returned, or Snowflake's own settings such as `STATEMENT_TIMEOUT_IN_SECONDS`, network policies, or idle-session termination. Power BI Service can also struggle with very large result sets when using Snowflake's Arrow format, which is efficient but sensitive to connection instability over long durations.

As next steps, consider optimizing your Snowflake query to reduce the dataset size, either through incremental loading, query folding, or filtering in Power Query. Review Snowflake's network policies and session timeout parameters that might override the gateway settings. If you're using Power BI Premium, check whether the query is exceeding capacity limits and adjust the workload settings accordingly. Also check whether switching the connection method, for example from the Arrow-based path to ODBC or the native connector, affects stability. Resolving this will likely require coordination between Snowflake configuration tuning and Power BI Service optimization to support large-scale data loads reliably.
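To make the "reduce the dataset" advice concrete, here is a minimal sketch of a source query that keeps the filter foldable so Snowflake returns fewer rows per refresh. All server, warehouse, database, schema, table, and column names are placeholders, and `RangeStart`/`RangeEnd` are the standard datetime parameters you define yourself when configuring incremental refresh:

```
let
    // Navigate to the large table through the Snowflake connector (placeholder names).
    Source   = Snowflake.Databases("myaccount.snowflakecomputing.com", "COMPUTE_WH"),
    Database = Source{[Name = "MY_DB", Kind = "Database"]}[Data],
    Schema   = Database{[Name = "MY_SCHEMA", Kind = "Schema"]}[Data],
    BigTable = Schema{[Name = "TABLE_NAME", Kind = "Table"]}[Data],

    // Filter on a datetime column using RangeStart/RangeEnd so incremental refresh
    // only pulls one partition's worth of rows per query. A simple comparison like
    // this folds back to Snowflake, so the WHERE clause runs in the warehouse
    // instead of streaming all 51 million rows over one connection.
    Filtered = Table.SelectRows(
                   BigTable,
                   each [LOAD_TIMESTAMP] >= RangeStart and [LOAD_TIMESTAMP] < RangeEnd
               )
in
    Filtered
```

After publishing, incremental refresh still has to be enabled on the table in the dataset settings; the M query above only ensures the per-partition filter is pushed down to Snowflake.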
Hi @RajeshKannanS ,
The error usually occurs because the Snowflake connection is forcibly closed due to session timeouts, network interruptions, or Power BI Service query execution limits. Even if your gateway timeout is high, Snowflake or network policies might close idle or long-running sessions.
Snowflake-side settings to check:
- `STATEMENT_TIMEOUT_IN_SECONDS` at the account, warehouse, or session level, which can cancel long-running queries regardless of the gateway timeout.
- Idle-session termination settings that may close a connection while a large result set is still streaming.
- Network policies or firewalls between Snowflake and the gateway that drop long-lived connections.
Power BI Service considerations:
- The Service enforces its own refresh and query execution timeouts (roughly 2 hours on Premium, shorter on shared capacity), independent of the gateway setting.
- Very large result sets are more likely to hit these limits in the Service than in Desktop.
Recommendations:
- Reduce the data returned per query through filtering, query folding, or incremental refresh.
- Review the Snowflake session and network settings above with your Snowflake admin.
- Check the gateway and connector-level connection/command timeouts (see the sketch after this list).
- Try the updated Snowflake connector, as noted next.
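As a rough sketch of the connector-level timeout point: the `ConnectionTimeout` and `CommandTimeout` options below are meant to correspond to the Snowflake connector's "Connection timeout in seconds" and "Command timeout in seconds" advanced options. I'm quoting the option names and duration types from memory, so please confirm them against the connector documentation linked below; server and warehouse names are placeholders.

```
let
    // Placeholder account and warehouse. ConnectionTimeout / CommandTimeout values
    // are M durations (days, hours, minutes, seconds); verify the exact option
    // names and types in the Snowflake connector docs before relying on them.
    Source = Snowflake.Databases(
                 "myaccount.snowflakecomputing.com",
                 "COMPUTE_WH",
                 [
                     ConnectionTimeout = #duration(0, 0, 10, 0),  // 10-minute connect timeout
                     CommandTimeout    = #duration(0, 2, 0, 0)    // 2-hour query timeout
                 ]
             ),
    Database = Source{[Name = "MY_DB", Kind = "Database"]}[Data]
in
    Database
```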
Can you please have a look at the latest Snowflake connector improvements, described in the article below, to see if they resolve your issue?
Power Query Snowflake connector - Power Query | Microsoft Learn
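If the improvements referred to above include the connector's newer implementation, my understanding is that you can opt in from M via the `Implementation` option; treat this as an assumption and confirm it against the linked article. Names are placeholders.

```
let
    // Opt in to the newer Snowflake connector implementation (placeholder names;
    // check the linked connector docs for the current status of this option).
    Source   = Snowflake.Databases(
                   "myaccount.snowflakecomputing.com",
                   "COMPUTE_WH",
                   [Implementation = "2.0"]
               ),
    Database = Source{[Name = "MY_DB", Kind = "Database"]}[Data]
in
    Database
```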