server_name = "<MY_SERVER>"      # SQL endpoint of the target database
database_name = "<MY_DATABASE>"
jdbc_url = f"jdbc:sqlserver://{server_name}:1433;database={database_name};loginTimeout=30;"
access_token = notebookutils.credentials.getToken("pbi")
connection_properties = {
    "accessToken": access_token
}
# Read:
df = spark.read.jdbc(url=jdbc_url, table="<MY_TABLE>", properties=connection_properties)
# Write:
df.write.jdbc(url=jdbc_url, table="<MY_TABLE>", mode="append", properties=connection_properties)
Hi @P_work,
Thank you for reaching out to the Microsoft Fabric Forum Community, and special thanks to @dlevy and @spaceman127 for their prompt and helpful responses.
Just checking in: have you had a chance to open a support ticket, as suggested by @dlevy? If so, we'd love to hear the current status or any updates from it.
If the issue was resolved through the support ticket, it would be great if you could share the solution here as well. It could really help other community members find answers more quickly.
Warm regards,
Prasanna Kumar
Hi @P_work
I would like to try to reproduce the problem.
Can you tell me more about your setup?
Several factors could be involved, and it would help to know which ones apply in your case.
Best regards
I investigated the token payload. The token expires in an hour, and my notebook job completes in well under an hour. I have three instances of this notebook running; each notebook's token is distinct, and all three instances also finish well within the hour.
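For reference, here is a minimal sketch of the kind of payload check described above. It assumes the token returned by notebookutils.credentials.getToken (pre-loaded in Fabric notebooks, as in the earlier snippet) is a standard JWT whose payload carries an "exp" claim in Unix epoch seconds:

import base64
import json
import time

access_token = notebookutils.credentials.getToken("pbi")

# A JWT is three base64url segments: header.payload.signature
payload_b64 = access_token.split(".")[1]
payload_b64 += "=" * (-len(payload_b64) % 4)  # restore the padding JWTs strip off
payload = json.loads(base64.urlsafe_b64decode(payload_b64))

minutes_left = (payload["exp"] - time.time()) / 60
print(f"Token expires in {minutes_left:.1f} minutes")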
@P_work - well, guess that rules out the easy fix. 😊
Let's get a support case open so we can have people dig into logs and see what is going on. You can create a new support case by following these instructions: Create a Fabric and Power BI Support Ticket - Power BI | Microsoft Learn
Hi @P_work - I'm Dave from the SQL drivers team.
Any chance this is happening when the notebook takes longer than an hour to run?
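If long runs do turn out to be the trigger, one possible mitigation (a sketch only, not a confirmed fix; jdbc_url and the table placeholder mirror the snippet earlier in the thread) is to request a fresh token immediately before each JDBC read or write instead of reusing one fetched at the start of the run:

def fresh_connection_properties():
    # Fetch a new token right before use so it cannot expire mid-operation.
    return {"accessToken": notebookutils.credentials.getToken("pbi")}

df = spark.read.jdbc(url=jdbc_url, table="<MY_TABLE>", properties=fresh_connection_properties())
df.write.jdbc(url=jdbc_url, table="<MY_TABLE>", mode="append", properties=fresh_connection_properties())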