If one uses pyodbc to write to and read from several Fabric SQL DB tables, and to read from several Lakehouse "tables" via the SQL Endpoint, all from within a single notebook, are there any caveats or pointers, specifically around connections? I am wary of using a connection pool. Certainly, I can see a single connection being reused for queries to reduce load on SQL Server, perhaps with a with statement.
Hi @P_work ,
Just checking in to see if your query is resolved and if any responses were helpful.
Otherwise, feel free to reach out for further assistance.
Thank you.
One issue when not using global connections is that the SQL Analytics Endpoint appears to require upwards of four seconds to connect. Any specifics regarding the dropping of idle sessions? i.e., opening a global connection and having it close when the notebook session ends would alleviate this issue.
Hi @P_work ,
Thanks for the follow up.
As @Vinodh247 already mentioned, keeping long-running global connections is not recommended because the SQL analytics endpoint can become idle and drop sessions in the background. The Microsoft documentation notes that the endpoint can be suspended after inactivity:
https://learn.microsoft.com/en-us/fabric/data-warehouse/sql-analytics-endpoint-performance
In that case a reused pyodbc connection may fail since pyodbc does not automatically recover when the backend session is dropped.
There are similar reports in the community where idle connections to the lakehouse SQL endpoint resulted in errors:
Solved: Failed to access/connect to Lakehouse SQL Endpoint... - Microsoft Fabric Community
To reduce repeated reconnect times you can keep related queries inside the same connection block and optionally add a simple retry approach for transient disconnects.
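For illustration, here is a minimal sketch of that idea (not official code; the connection string, table names, and retry count are placeholders to adapt to your environment): related reads share one short-lived connection, and the whole batch is retried once if the endpoint has dropped the session.

```python
import time
from contextlib import closing

import pyodbc

# Placeholder connection string for the Lakehouse SQL analytics endpoint;
# replace server/database/authentication with values from your workspace.
LAKEHOUSE_CONN_STR = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-endpoint>.datawarehouse.fabric.microsoft.com;"
    "Database=<your_lakehouse>;"
    "Authentication=ActiveDirectoryInteractive;"  # assumption: adjust to your auth method
    "Encrypt=yes;"
)

def run_batch(queries, retries=1, backoff_seconds=3):
    """Run related queries on one short-lived connection; retry the batch
    once if the endpoint dropped the session (transient disconnect)."""
    for attempt in range(retries + 1):
        try:
            # contextlib.closing guarantees close(); pyodbc's own context
            # manager only commits on exit, it does not close the connection.
            with closing(pyodbc.connect(LAKEHOUSE_CONN_STR)) as conn:
                cursor = conn.cursor()
                return [cursor.execute(q).fetchall() for q in queries]
        except (pyodbc.OperationalError, pyodbc.InterfaceError):
            if attempt == retries:
                raise  # not transient, surface the error
            time.sleep(backoff_seconds)  # brief pause before reconnecting

# Hypothetical table names, for illustration only.
results = run_batch([
    "SELECT TOP 10 * FROM dbo.sales_orders",
    "SELECT COUNT(*) FROM dbo.customers",
])
```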
Hope this helps. Please reach out for further assistance.
Thank you.
Hi @P_work ,
Just wanted to check if the response provided was helpful. If further assistance is needed, please reach out.
Thank you.
pyodbc does not give you a native connection pool, and you do not want one inside Fabric notebooks anyway. Use one connection per endpoint (one for the Fabric SQL DB, one for the Lakehouse SQL endpoint). Do not pool. Wrap every operation in a with block so the connection opens, executes, and closes cleanly. Avoid long-lived or global connections because Fabric endpoints drop idle sessions and pyodbc does not auto-recover well.
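A rough sketch of that pattern, assuming placeholder connection strings and hypothetical table names (adjust driver, server, database, and authentication to your environment):

```python
from contextlib import closing

import pyodbc

# Placeholder connection strings; replace server/database/auth with your own values.
SQLDB_CONN_STR = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-sqldb-server>.database.fabric.microsoft.com;"
    "Database=<your_sql_db>;"
    "Authentication=ActiveDirectoryInteractive;"  # assumption: adjust to your auth method
    "Encrypt=yes;"
)
LAKEHOUSE_CONN_STR = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-lakehouse-endpoint>.datawarehouse.fabric.microsoft.com;"
    "Database=<your_lakehouse>;"
    "Authentication=ActiveDirectoryInteractive;"
    "Encrypt=yes;"
)

# Write to the Fabric SQL DB on its own short-lived connection.
# contextlib.closing ensures close(); pyodbc's bare "with conn:" only commits.
with closing(pyodbc.connect(SQLDB_CONN_STR)) as conn:
    cursor = conn.cursor()
    cursor.execute(
        "INSERT INTO dbo.run_log (run_time) VALUES (SYSUTCDATETIME())"  # hypothetical table
    )
    conn.commit()

# Read from the Lakehouse SQL analytics endpoint (read-only) on a separate connection.
with closing(pyodbc.connect(LAKEHOUSE_CONN_STR)) as conn:
    cursor = conn.cursor()
    cursor.execute("SELECT TOP 100 * FROM dbo.my_lakehouse_table")  # hypothetical table
    rows = cursor.fetchall()
```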