How to write data into a Synapse dedicated SQL pool from a Fabric notebook
I want to save a Spark DataFrame from a Fabric notebook into an Azure Synapse dedicated SQL pool.
Is this doable? If yes, can anyone please share the code, along with any configuration settings needed at the dedicated SQL pool level?
Hi @SriThiru ,
I see your question has remained unanswered for a long time.
Yes, you can definitely write data from a Spark DataFrame in a Fabric Notebook to an Azure Synapse dedicated SQL pool.
You’ll just need to use the .write method with the jdbc format. Here's a simple example to get you started:
```python
df.write \
    .format("jdbc") \
    .option("url", "jdbc:sqlserver://<your-server-name>.sql.azuresynapse.net:1433;database=<your-database-name>") \
    .option("dbtable", "<your-schema>.<your-table-name>") \
    .option("user", "<your-username>") \
    .option("password", "<your-password>") \
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver") \
    .mode("append") \
    .save()
```

Make sure your dedicated SQL pool allows Azure services to connect (you can set this in the firewall settings).
The table you're writing to should already exist in Synapse with the right schema.
If you're using managed identity instead of SQL auth, you’ll need to handle authentication a bit differently (happy to help with that if needed).
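For the managed identity route, here's a minimal sketch. It assumes the code runs inside a Fabric notebook (where `mssparkutils` is provided by the runtime) and that your pool accepts Microsoft Entra token authentication; the server, database, and table names are placeholders, and the `synapse_jdbc_url` helper is just for illustration. Verify the token audience string against your environment before relying on it.

```python
# Sketch: token-based (managed identity / Microsoft Entra ID) auth to a
# Synapse dedicated SQL pool, instead of SQL user/password.

def synapse_jdbc_url(server: str, database: str) -> str:
    """Build the JDBC URL for a Synapse dedicated SQL pool endpoint."""
    return (f"jdbc:sqlserver://{server}.sql.azuresynapse.net:1433;"
            f"database={database};encrypt=true")

def write_with_token(df, server: str, database: str, table: str) -> None:
    # mssparkutils is available only inside the Fabric notebook runtime.
    from notebookutils import mssparkutils

    # Acquire an access token for Azure SQL / Synapse. The audience below
    # is the standard Azure SQL resource URI; confirm it for your tenant.
    token = mssparkutils.credentials.getToken("https://database.windows.net/")

    (df.write
        .format("jdbc")
        .option("url", synapse_jdbc_url(server, database))
        .option("dbtable", table)
        .option("accessToken", token)  # token replaces user/password options
        .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
        .mode("append")
        .save())
```

The `accessToken` connection property is supported by the Microsoft JDBC Driver for SQL Server; when you use it, omit the `user` and `password` options entirely, since mixing the two will cause a connection error.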
Regards,