RajeshKannanS
New Member

Power BI data refresh issue

Hello Team,

 

We are using Snowflake as our data warehousing tool, with a report that queries a 51-million-record table. While the report refreshes successfully in Power BI Desktop, it consistently fails in Power BI Service with the following error:

Data source error: We're sorry, an error occurred during evaluation.
[Snowflake] arrow/ipc: could not read message body: read tcp TCP_ADDRESSS->TCP_ADDRESSS: wsarecv: An existing connection was forcibly closed by the remote host. (message repeated three times)
The exception was raised by the IDataReader interface. Please review the error message and provider documentation for further information and corrective action. Table: TABLE_NAME.

 

Current Observations:

 

The Snowflake gateway connection timeout is set to a sufficiently high value.

 

Despite this, the error persists.

 

Question:

What could be causing this connection interruption, and how can we resolve it?

 

Next Steps Requested:

 

Are there additional Snowflake-side settings (e.g., session timeouts, network policies) that could override the gateway timeout?

 

Could this be related to Power BI Service limitations (e.g., query execution timeouts, data volume thresholds)?

 

Are there known issues with large dataset refreshes in Power BI Service when using Snowflake’s Arrow format?

 

Thank you for your support in investigating this issue.

 

7 REPLIES
fparis
New Member

Hello,

We are suffering from the same problem:

Error Message:
[Snowflake] arrow/ipc: could not read message body: read tcp : wsarecv: An existing connection was forcibly closed by the remote host.
Stack Trace:
Microsoft.Mashup.Host.Document.SerializedException

It occurs when we use a custom query in Power BI against the source table. However, if we fetch all the fields directly, the problem disappears, without the need to modify any session parameters or network configuration. Do you know why this could be happening? Thanks in advance.
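For what it's worth, the symptom described above (custom query fails, plain table fetch succeeds) is consistent with a hand-written SQL statement preventing query folding, so the full result set streams through the mashup engine instead of being reduced server-side. A hedged sketch in Power Query M of the two access patterns; the server, warehouse, database, and schema names here are placeholders, not from this thread:

```m
let
    // Navigation-based access: the Snowflake connector can fold later
    // steps (column selection, filters) back to Snowflake, so only the
    // needed data travels over the wire.
    Source = Snowflake.Databases("myaccount.snowflakecomputing.com", "MY_WH"),
    DB = Source{[Name = "MY_DB"]}[Data],
    Schema = DB{[Name = "PUBLIC"]}[Data],
    BigTable = Schema{[Name = "TABLE_NAME"]}[Data],

    // A custom SQL statement may also fold if wrapped with
    // EnableFolding = true (supported by the Snowflake connector);
    // without it, the entire result set is pulled and processed locally.
    Custom = Value.NativeQuery(
        DB,
        "SELECT * FROM PUBLIC.TABLE_NAME",
        null,
        [EnableFolding = true]
    )
in
    BigTable
```

If the navigation-based version succeeds where the custom query fails, comparing the generated queries in Snowflake's query history should confirm whether folding is the difference.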

v-sdhruv
Community Support

Hi @RajeshKannanS ,
Just wanted to check if you had the opportunity to review the suggestions provided?
If the response has addressed your query, please Accept it as a solution so other members can easily find it.
Thank You


Poojara_D12
Super User

Hi @RajeshKannanS 

The error you're encountering, where Power BI Service fails to refresh a large table from Snowflake while Power BI Desktop succeeds, typically points to a combination of network constraints, service timeouts, and possibly Snowflake-side configuration. While your Snowflake gateway timeout is set sufficiently high, Power BI Service enforces its own timeout and resource limits, which can differ from Power BI Desktop.

The error message suggests the connection was forcibly closed during transmission of a large result set, possibly due to an interruption in the Arrow IPC stream used for data exchange. Likely contributing factors include Power BI Service's refresh timeout limits (two hours on shared capacity, five hours on Premium), the sheer size of the result set being returned, or Snowflake-side settings such as `STATEMENT_TIMEOUT_IN_SECONDS`, network policies, or idle-session termination. Power BI Service can also struggle with very large result sets delivered via Snowflake's Arrow format, which is efficient but sensitive to connection instability over long durations.

As next steps, consider reducing the data volume transferred per refresh, through incremental refresh, query folding, or filtering in Power Query. Review Snowflake's network policies and session timeout parameters that might override the gateway settings. If you're on Power BI Premium, check the capacity workload settings and whether the query is exceeding capacity limits. Finally, test whether a different connector or driver version improves stability. Resolving this will likely require coordinated tuning on both the Snowflake and Power BI sides to support large-scale data loads reliably.

 

Did I answer your question? Mark my post as a solution, this will help others!
If my response(s) assisted you in any way, don't forget to drop me a "Kudos"

Kind Regards,
Poojara - Proud to be a Super User
Data Analyst | MSBI Developer | Power BI Consultant
Consider Subscribing my YouTube for Beginners/Advance Concepts: https://youtube.com/@biconcepts?si=04iw9SYI2HN80HKS
BhavinVyas3003
Solution Sage

Hi @RajeshKannanS ,

 

The error usually occurs because the Snowflake connection is forcibly closed due to session timeouts, network interruptions, or Power BI Service query execution limits. Even if your gateway timeout is high, Snowflake or network policies might close idle or long-running sessions.

Snowflake-side settings to check:

  • Increase Snowflake CLIENT_SESSION_KEEP_ALIVE to prevent session timeouts during long queries.
  • Review and adjust network policies or firewall rules that might terminate idle TCP connections.
  • Check Snowflake’s query timeout and warehouse auto-suspend settings to ensure the warehouse stays active.

Power BI Service considerations:

  • Power BI Service imposes a maximum refresh timeout of 2 hours on shared capacity (5 hours on Premium); refreshes that take longer will fail.
  • Large datasets (51 million rows) can lead to heavy resource consumption; consider incremental refresh to reduce load.
  • Snowflake’s Arrow format is usually stable, but if errors persist, try switching to DirectQuery or using a different connector/driver version.
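On the Power BI side, the Snowflake connector also exposes optional connection and command timeouts that can be set in the query itself. A minimal sketch in Power Query M; the server and warehouse names are placeholders, and the durations are illustrative values you would tune to your workload:

```m
let
    // ConnectionTimeout and CommandTimeout are optional parameters of
    // Snowflake.Databases; raising CommandTimeout can help long-running
    // extracts that would otherwise be cut off mid-stream.
    Source = Snowflake.Databases(
        "myaccount.snowflakecomputing.com",
        "MY_WH",
        [
            ConnectionTimeout = #duration(0, 0, 10, 0),  // up to 10 minutes to connect
            CommandTimeout    = #duration(0, 2, 0, 0)    // allow queries up to 2 hours
        ]
    )
in
    Source
```

Note that these settings govern the connector's own patience; they cannot extend the overall refresh timeout that Power BI Service imposes.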

Recommendations:

  • Enable incremental refresh in Power BI to reduce dataset size per refresh.
  • Coordinate with your Snowflake admin to adjust session keep-alive and network settings.
  • Monitor and optimize your Snowflake warehouse size and query performance.
  • If possible, test refreshes with smaller data subsets to isolate issues.
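The incremental refresh recommendation hinges on a date filter that folds back to Snowflake, so each partition pulls only its slice of the 51 million rows. A sketch of the Power Query side, assuming a hypothetical TIMESTAMP column named LOAD_DATE and the reserved RangeStart/RangeEnd datetime parameters that incremental refresh requires:

```m
let
    Source = Snowflake.Databases("myaccount.snowflakecomputing.com", "MY_WH"),
    DB = Source{[Name = "MY_DB"]}[Data],
    Schema = DB{[Name = "PUBLIC"]}[Data],
    BigTable = Schema{[Name = "TABLE_NAME"]}[Data],

    // RangeStart and RangeEnd are the reserved parameters that the
    // incremental refresh policy binds at refresh time. This filter
    // should fold to a WHERE clause in the Snowflake query, so only
    // one partition's worth of rows crosses the network per query.
    Filtered = Table.SelectRows(
        BigTable,
        each [LOAD_DATE] >= RangeStart and [LOAD_DATE] < RangeEnd
    )
in
    Filtered
```

After publishing, verify in Snowflake's query history that the generated SQL contains the expected WHERE clause; if it does not, the filter is not folding and the full table is still being scanned.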

 


Thanks,
Bhavin
Problem solved? Hit “Accept as Solution” and high-five me with a Kudos! Others will thank you later!
GilbertQ
Super User

Hi @RajeshKannanS 

 

Can you please have a look at the latest Snowflake driver improvements to see if they resolve your issue?

Power Query Snowflake connector - Power Query | Microsoft Learn

 





Did I answer your question? Mark my post as a solution!

Proud to be a Super User!






