I'm encountering a problem accessing a Fabric Warehouse from a Notebook (in the same workspace) via the ODBC Driver using a service principal (SP).
The SP has been created and granted the necessary API permissions.
The SP has also been added to the workspace under Manage Access with the Member role.
The notebook code is below.
from azure.identity import ClientSecretCredential
import pyodbc
import struct

# Configuration - replace these with your values
tenant_id = "your-azure-ad-tenant-id"
client_id = "your-service-principal-client-id"
client_secret = "your-service-principal-secret"
server_name = "your-server-name.fabric.microsoft.com"  # e.g., "xxxxxx.fabric.microsoft.com"
database_name = "your-warehouse-name"

# 1. Get an access token using service principal credentials
credential = ClientSecretCredential(
    tenant_id=tenant_id,
    client_id=client_id,
    client_secret=client_secret,
)

# The scope for Fabric SQL analytics endpoints
scope = "https://api.fabric.microsoft.com/.default"

# Get the access token
access_token = credential.get_token(scope).token

# 2. Connect to the warehouse using pyodbc with the access token.
# The token attribute expects a 4-byte little-endian length prefix that
# counts the UTF-16-LE *bytes* (two per character), not the string length.
SQL_COPT_SS_ACCESS_TOKEN = 1256  # pre-connect attribute for AAD access tokens
token_bytes = access_token.encode("utf-16-le")
token_struct = struct.pack(f"<I{len(token_bytes)}s", len(token_bytes), token_bytes)

try:
    # Connect using the access token
    conn = pyodbc.connect(
        f"Driver={{ODBC Driver 17 for SQL Server}};"
        f"Server={server_name};"
        f"Database={database_name};",
        attrs_before={SQL_COPT_SS_ACCESS_TOKEN: token_struct},
    )

    # Create a cursor and execute a query
    cursor = conn.cursor()
    cursor.execute("SELECT 1 AS test")

    # Fetch and print results
    rows = cursor.fetchall()
    for row in rows:
        print(row)
except Exception as e:
    print(f"Error: {e}")
finally:
    if "conn" in locals():
        conn.close()
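One subtle pitfall with the `attrs_before` token attribute (1256, `SQL_COPT_SS_ACCESS_TOKEN`) is the length prefix: it must count the UTF-16-LE bytes (two per character), not the Python string length. A minimal sketch of the expected layout, using only the standard library:

```python
import struct

SQL_COPT_SS_ACCESS_TOKEN = 1256  # pyodbc pre-connect attribute for AAD tokens

def pack_access_token(token: str) -> bytes:
    """Pack a token as a 4-byte little-endian byte-length prefix
    followed by the UTF-16-LE encoded token."""
    token_bytes = token.encode("utf-16-le")
    return struct.pack(f"<I{len(token_bytes)}s", len(token_bytes), token_bytes)

# Example: a 3-character token encodes to 6 bytes, so the prefix is 6, not 3.
packed = pack_access_token("abc")
print(len(packed))  # 4-byte prefix + 6 token bytes = 10
```

If the prefix holds the character count instead, the driver reads only half the token, which can surface as a generic login failure.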
I have verified that all the values are correct, but I always get an authentication failure. I did some research and tried the solutions I found online, but I'm still stuck on this problem...
Connection error: ('28000', '[28000] [Microsoft][ODBC Driver 18 for SQL Server][SQL Server]Could not login because the authentication failed. (18456) (SQLDriverConnect)')
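When error 18456 comes back despite valid-looking credentials, it can help to inspect the token that was actually acquired, e.g. its `aud` (audience) and `appid` claims. A minimal, standard-library-only debugging sketch for decoding the unverified JWT payload (the claim names shown are just the common AAD ones):

```python
import base64
import json

def jwt_claims(token: str) -> dict:
    """Decode a JWT's payload WITHOUT verifying its signature,
    purely to inspect claims such as 'aud' while debugging."""
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))

# claims = jwt_claims(access_token)
# print(claims.get("aud"), claims.get("appid"))
```

If the `aud` claim does not match the resource you expect, the token was requested with the wrong scope.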
There is a way to execute UPDATE or INSERT statements on Microsoft Fabric Data Warehouse (formerly Synapse Data Warehouse) directly from a notebook — as long as both the notebook and warehouse are in the same Fabric workspace.
Option 1: Use the %%sql magic. If you're working in a Lakehouse or Warehouse notebook in Fabric, you can run:
%%sql
UPDATE my_table
SET column1 = 'new_value'
%%sql
INSERT INTO my_table (id, name)
VALUES (101, 'Alice');
Option 2: Use spark.sql() (if supported by context)
spark.sql("""
UPDATE my_table
SET column1 = 'new_value'
WHERE id = 123
""")
This only works if:
The notebook is connected to the warehouse.
You're using the SQL engine context, not Spark-only tables.
Make sure the table is a SQL Warehouse table, not a Delta/Lakehouse table if you're doing standard SQL.
Use USE WAREHOUSE or %sql USE WAREHOUSE if needed to switch context.
Security: Ensure you have the necessary permissions to UPDATE or INSERT.
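If you instead go the pyodbc route from the same notebook, parameterized statements are safer than concatenating values into SQL. A small sketch of building a placeholder-based INSERT for `cursor.executemany` (the table and column names are illustrative):

```python
def insert_stmt(table: str, columns: list[str]) -> str:
    """Build a parameterized INSERT with one '?' placeholder per column."""
    cols = ", ".join(columns)
    placeholders = ", ".join("?" for _ in columns)
    return f"INSERT INTO {table} ({cols}) VALUES ({placeholders})"

stmt = insert_stmt("my_table", ["id", "name"])
print(stmt)  # INSERT INTO my_table (id, name) VALUES (?, ?)

# With an open pyodbc connection you would then run, for example:
# cursor.executemany(stmt, [(101, "Alice"), (102, "Bob")])
# conn.commit()
```

Placeholders let the driver handle quoting and types, which also avoids SQL injection from untrusted values.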
Hi @DoraAtRice, Microsoft Fabric Warehouses currently only support user-based AAD authentication (interactive or delegated tokens). That means:
Service Principals (client credentials flow) are not yet supported when connecting via SQL endpoints (ODBC, JDBC, etc.).
Even if your SP has:
API permissions
Workspace Member role
Valid access token
You will still get authentication failure when using it via ODBC.
Thanks,
Prashanth Are
MS Fabric community support
If this post helps, please consider accepting it as the solution to help other members find it more quickly, and give kudos if it helped you resolve your query.
Hi,
Thank you for the response.
Is there a way or solution that I can execute the SQL query (UPDATE/INSERT) to update Fabric DW via Notebook? Since the notebook and DW are in the same workspace, is there any direct way to do so?
Thank you for the help!