I am trying to run DML, including DELETE statements, from a notebook. I have tried a few options:
1. JDBC works fine for insert operations, but DELETE throws an error (needs credentials).
2. ODBC times out frequently (needs credentials).
3. The Java driver works, but it is too slow: about 5 million rows takes 13 hours to process (needs credentials).
4. I tried the access-token method, which doesn't need a password, but the token expires when Spark distributes work to the workers.
5. Managed identity: I couldn't get this to work even after multiple attempts.
Isn't there any way that works seamlessly?
Hi @malimahesh,
You're running into a few known gotchas with notebook + SQL Server authentication in Fabric. Some things to try:

For writes, the Spark SQL Server connector with Managed Identity authentication acquires tokens per task, which sidesteps the worker-level token expiry you hit:
(
df.write
.format("sqlserver")
.option("host", "<server>.database.windows.net")
.option("port", "1433")
.option("database", "<db>")
.option("dbtable", "dbo.Target")
.option("authentication", "ActiveDirectoryMSI") # tokens handled per task
.mode("append")
.save()
)

For DELETE, a single driver-side JDBC connection avoids the distributed-token problem entirely, since the statement never leaves the driver:

jdbcUrl = (
"jdbc:sqlserver://<server>.database.windows.net:1433;"
"database=<db>;"
"encrypt=true;"
"hostNameInCertificate=*.database.windows.net;"
"loginTimeout=30;"
"authentication=ActiveDirectoryMSI"
)
driver = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
spark._sc._jvm.Class.forName(driver)
conn = spark._sc._jvm.java.sql.DriverManager.getConnection(jdbcUrl)
stmt = conn.createStatement()
stmt.executeUpdate("DELETE FROM dbo.Target WHERE LoadDate < '2025-01-01';")  # or EXEC dbo.DeleteOldRows @cutoff = ...
stmt.close(); conn.close()
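On the "5 million rows in 13 hours" point: if that time comes from one giant DELETE, chunking it usually helps, because `DELETE TOP (n)` in a loop keeps each statement and the transaction log small. A minimal sketch under that assumption (table name and predicate are placeholders from this thread, and `run_update` stands in for whatever execute wrapper you use, e.g. `stmt.executeUpdate` from the JDBC connection above):

```python
def batched_delete_sql(table: str, predicate: str, batch_size: int = 50_000) -> str:
    """Build a DELETE TOP (n) statement for one batch."""
    return f"DELETE TOP ({batch_size}) FROM {table} WHERE {predicate};"

def delete_in_batches(run_update, table: str, predicate: str,
                      batch_size: int = 50_000) -> int:
    """Call run_update(sql) repeatedly until a batch comes back short.

    run_update must return the affected row count, as
    java.sql.Statement.executeUpdate does. Returns total rows deleted.
    """
    total = 0
    while True:
        affected = run_update(batched_delete_sql(table, predicate, batch_size))
        total += affected
        if affected < batch_size:  # last (partial) batch -> done
            break
    return total
```

With the JDBC snippet above you would pass `stmt.executeUpdate` as `run_update`; tune `batch_size` to what your log and locks tolerate.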
If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.
Proud to be a Super User!
Hi @malimahesh ,
Thank you for the update. I completely understand your need for a secure, passwordless connection, especially for large-scale archival operations across both Delta and SQL tables.
From what you've described, Managed Identity (Workspace Identity) would indeed be the most seamless and secure option once it is set up correctly, since it avoids both credential exposure and token expiry. I'd recommend reviewing the steps in the documentation below to make sure the configuration is aligned end to end:
https://learn.microsoft.com/en-us/fabric/security/workspace-identity
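On the database side of that setup, the identity typically has to be created as a contained database user and granted rights before an MSI connection can run DML. A hedged sketch of the T-SQL involved (the identity name is a placeholder; an admin runs these against the target database, and `db_datawriter` already covers INSERT/UPDATE/DELETE):

```python
def grant_statements(identity_name: str) -> list:
    """Return the T-SQL to create a contained user for a workspace/managed
    identity and let it read and delete rows. Adjust roles to your needs."""
    quoted = identity_name.replace("]", "]]")  # escape ] inside brackets
    return [
        f"CREATE USER [{quoted}] FROM EXTERNAL PROVIDER;",
        f"ALTER ROLE db_datareader ADD MEMBER [{quoted}];",
        f"ALTER ROLE db_datawriter ADD MEMBER [{quoted}];",  # incl. DELETE
    ]

for sql in grant_statements("<workspace-identity-name>"):
    print(sql)
```

You could feed these strings to the same `stmt.executeUpdate(...)` pattern shown earlier in the thread, or just run them in a SQL client.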
Best Regards,
Community Support Team
Hi @malimahesh ,
I hope the details shared above help you fix the issue. If you still have any questions or need more help, feel free to reach out; we're always here to support you.
If I misunderstood your needs or you still have problems, please feel free to let us know.
Best Regards,
Community Support Team
Hi @v-menakakota, I went through the suggestions shared. For now we have decided to go ahead with the Key Vault approach, but in future we want to explore the managed identity route with AD. I couldn't find detailed steps, as the information is scattered, so I guess I'll have to do some kind of POC before rolling it out to production. For now I can conclude this thread.
What are you hoping to do once you have connected to the SQL Server database? There may be more practical alternatives.
I am developing an archival utility that will archive Delta tables as well as SQL tables directly from a single notebook. For the SQL archival I want a passwordless option so there is no need to expose a password; even with Key Vault, the secret can still be printed at code level. I liked the access-token and managed identity options, but the access token expires early, and managed identity seems to need a complex setup. So I am weighing my options, but I am open to any suggestion, as I have at least a few million rows to process the first time in SQL.
Hi @malimahesh ,
Thanks for reaching out to the Microsoft Fabric community forum.
Please go through the documents below, which may help you resolve the issue:
Run T-SQL code in Fabric Python notebooks - Microsoft Fabric | Microsoft Learn
Use Python experience on Notebook - Microsoft Fabric | Microsoft Learn
https://learn.microsoft.com/en-us/fabric/data-engineering/notebook-utilities
Connect to Your SQL Database - Microsoft Fabric | Microsoft Learn
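If you revisit the access-token route, note that the expiry problem mainly bites when tokens are shipped to Spark workers; a single driver-side pyodbc connection will usually finish well within the token's lifetime. The SQL Server ODBC driver expects the token as a length-prefixed UTF-16-LE byte structure passed via pre-connect attribute 1256 (SQL_COPT_SS_ACCESS_TOKEN). A sketch under the assumption that you obtain `token` from your notebook environment (for example via `mssparkutils.credentials.getToken(...)` in Fabric):

```python
import struct

SQL_COPT_SS_ACCESS_TOKEN = 1256  # ODBC driver pre-connect attribute

def pack_access_token(token: str) -> bytes:
    """Encode an AAD access token the way the SQL Server ODBC driver expects:
    a 4-byte little-endian length followed by the UTF-16-LE token bytes."""
    raw = token.encode("utf-16-le")
    return struct.pack(f"<I{len(raw)}s", len(raw), raw)

def connect(server: str, database: str, token: str):
    """Open a driver-side, passwordless pyodbc connection using the token."""
    import pyodbc  # available in the Fabric notebook runtime
    connstr = (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={server},1433;Database={database};Encrypt=yes;"
    )
    return pyodbc.connect(
        connstr,
        attrs_before={SQL_COPT_SS_ACCESS_TOKEN: pack_access_token(token)},
    )
```

Because the connection lives only on the driver, nothing here depends on worker-side token refresh; for multi-hour jobs you would still re-acquire the token between batches.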
If I misunderstand your needs or you still have problems on it, please feel free to let us know.
Best Regards,
Community Support Team.