I'm trying out the example https://learn.microsoft.com/en-us/fabric/data-factory/apache-airflow-jobs-dbt-fabric
but I'm stuck: the Airflow dbt job fails with
('08S01', '[08S01] [Microsoft][ODBC Driver 18 for SQL Server]TCP Provider: Error code 0x2746 (10054) (SQLDriverConnect)')
which, according to e.g. https://stackoverflow.com/questions/74708033/error-code-0x2746-10054-when-trying-to-connect-to-sql-s..., seems to be related to certificate trust.
Does anyone know how to proceed?
[2024-11-26, 10:23:26 UTC] {subprocess.py:90} INFO - [0m10:23:26 Finished running in 0 hours 0 minutes and 3.69 seconds (3.69s).
[2024-11-26, 10:23:26 UTC] {subprocess.py:90} INFO - [0m10:23:26 Encountered an error:
[2024-11-26, 10:23:26 UTC] {subprocess.py:90} INFO - Database Error
[2024-11-26, 10:23:26 UTC] {subprocess.py:90} INFO - ('08S01', '[08S01] [Microsoft][ODBC Driver 18 for SQL Server]TCP Provider: Error code 0x2746 (10054) (SQLDriverConnect)')
[2024-11-26, 10:23:26 UTC] {subprocess.py:94} INFO - Command exited with return code 2
[2024-11-26, 10:23:26 UTC] {taskinstance.py:1824} ERROR - Task failed with exception
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.8/site-packages/cosmos/operators/local.py", line 277, in execute
result = self.build_and_run_cmd(context=context)
File "/home/airflow/.local/lib/python3.8/site-packages/cosmos/operators/local.py", line 189, in build_and_run_cmd
return self.run_command(cmd=dbt_cmd, env=env, context=context)
File "/home/airflow/.local/lib/python3.8/site-packages/cosmos/operators/local.py", line 180, in run_command
self.exception_handling(result)
File "/home/airflow/.local/lib/python3.8/site-packages/cosmos/operators/local.py", line 76, in exception_handling
raise AirflowException(
airflow.exceptions.AirflowException: ('dbt command failed. The command returned a non-zero exit code 2. Details: ', '\x1b[0m10:23:22 Running with dbt=1.5.11', '\x1b[0m10:23:22 Registered adapter: fabric=1.5.0', '\x1b[0m10:23:22 Found 1 model, 0 tests, 0 snapshots, 0 analyses, 347 macros, 0 operations, 0 seed files, 0 sources, 0 exposures, 0 metrics, 0 groups', '\x1b[0m10:23:22', '\x1b[0m10:23:26', '\x1b[0m10:23:26 Finished running in 0 hours 0 minutes and 3.69 seconds (3.69s).', '\x1b[0m10:23:26 Encountered an error:', 'Database Error', " ('08S01', '[08S01] [Microsoft][ODBC Driver 18 for SQL Server]TCP Provider: Error code 0x2746 (10054) (SQLDriverConnect)')")
[2024-11-26, 10:23:26 UTC] {taskinstance.py:1345} INFO - Marking task as FAILED. dag_id=dbt_fabric_dag, task_id=nyc_trip_count.nyc_trip_count_run, execution_date=20241125T000000, start_date=20241126T102313, end_date=20241126T102326
[2024-11-26, 10:23:26 UTC] {standard_task_runner.py:104} ERROR - Failed to execute job 21 for task nyc_trip_count.nyc_trip_count_run (('dbt command failed. The command returned a non-zero exit code 2. Details: ', '\x1b[0m10:23:22 Running with dbt=1.5.11', '\x1b[0m10:23:22 Registered adapter: fabric=1.5.0', '\x1b[0m10:23:22 Found 1 model, 0 tests, 0 snapshots, 0 analyses, 347 macros, 0 operations, 0 seed files, 0 sources, 0 exposures, 0 metrics, 0 groups', '\x1b[0m10:23:22', '\x1b[0m10:23:26', '\x1b[0m10:23:26 Finished running in 0 hours 0 minutes and 3.69 seconds (3.69s).', '\x1b[0m10:23:26 Encountered an error:', 'Database Error', " ('08S01', '[08S01] [Microsoft][ODBC Driver 18 for SQL Server]TCP Provider: Error code 0x2746 (10054) (SQLDriverConnect)')"); 69)
[2024-11-26, 10:23:26 UTC] {local_task_job_runner.py:225} INFO - Task exited with return code 1
[2024-11-26, 10:23:26 UTC] {taskinstance.py:2653} INFO - 0 downstream tasks scheduled from follow-on schedule check
Stuck on the same issue. Has anyone been able to resolve it and complete the tutorial without errors?
Hi @jawa
Here are some steps for your reference:
1. Update the TLS settings: ensure that both the client and the server are using TLS 1.2 or higher. You can update the TLS settings on your client machine to enforce TLS 1.2.
2. Verify that your SQL Server is configured to accept TLS 1.2 connections. You can check this in SQL Server Configuration Manager and confirm that TLS 1.2 is enabled.
3. Update the ODBC driver: make sure you have the latest version of ODBC Driver 18 for SQL Server installed. Sometimes updating the driver resolves compatibility issues.
4. Check for network issues that might cause the connection to be forcibly closed; ensure that no firewalls or network policies are blocking the connection.
5. Review your connection string to ensure it includes the correct parameters for TLS. You might need to add `TrustServerCertificate=True` to your connection string if you're using a self-signed certificate.
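The connection-string advice above can be sketched as follows. This is a minimal illustration, not a verified fix for this thread; the server and database names are placeholders. Note that ODBC Driver 18 defaults to `Encrypt=yes`, which is why an untrusted certificate surfaces as exactly this 0x2746/10054 handshake failure:

```python
# Build an ODBC Driver 18 connection string that keeps encryption on but
# skips CA validation of the server certificate (useful for self-signed certs).
# Server and database values below are placeholders, not real endpoints.
params = {
    "Driver": "{ODBC Driver 18 for SQL Server}",
    "Server": "your-endpoint.datawarehouse.fabric.microsoft.com",  # placeholder
    "Database": "your_database",                                   # placeholder
    "Encrypt": "yes",                 # Driver 18 encrypts by default
    "TrustServerCertificate": "yes",  # do not validate the server's certificate chain
}
conn_str = ";".join(f"{k}={v}" for k, v in params.items())
print(conn_str)
```

The resulting string would then be passed to e.g. `pyodbc.connect(conn_str)`.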
An existing connection was forcibly closed (OS error 10054) - SQL Server | Microsoft Learn
Best Regards
Zhengdong Xu
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
@Anonymous I can replicate the error using the "Entra Service Principal" login option in SSMS, which is similar to the way Airflow tries to connect.
So this thread is probably more about how to access the Fabric analytics SQL endpoint using an Entra service principal.
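For reference, the usual way an Entra (AAD) service principal token is handed to the SQL Server ODBC driver is via the pre-connect attribute `SQL_COPT_SS_ACCESS_TOKEN` (1256). The sketch below shows only the token encoding the driver expects; in a real run the token string would come from something like `azure.identity.ClientSecretCredential(...).get_token("https://database.windows.net/.default").token`, which is assumed here and replaced by a dummy value so the encoding can be shown without any Azure dependency:

```python
import struct

# msodbcsql pre-connect attribute for passing an AAD access token.
SQL_COPT_SS_ACCESS_TOKEN = 1256

def encode_access_token(token: str) -> bytes:
    """Encode a bearer token the way the SQL Server ODBC driver expects:
    a 4-byte little-endian length prefix followed by the UTF-16-LE token bytes."""
    raw = token.encode("utf-16-le")
    return struct.pack("<I", len(raw)) + raw

# Dummy token stands in for a real Entra access token.
token_bytes = encode_access_token("dummy-token")

# In a real connection this would be passed as:
# pyodbc.connect(conn_str, attrs_before={SQL_COPT_SS_ACCESS_TOKEN: token_bytes})
print(len(token_bytes))
```

Whether the Fabric-hosted Airflow environment lets you wire this in is a separate question; the snippet only illustrates the client-side mechanism.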
Hi @Anonymous
the steps you have proposed are reasonable, but I think you're missing that both the sending and the receiving ends of the ODBC connection are hosted in Fabric and out of my control.
So basically the Linux VM that hosts the Fabric Airflow job probably needs to be configured so that it can handle SQL Server ODBC connections with driver version 18, and I can't do that.
I was following the tutorial here https://learn.microsoft.com/en-us/fabric/data-factory/apache-airflow-jobs-dbt-fabric when the error occurred.