
tan_thiamhuat
Post Patron

PySpark to run Stored Procedure inside Fabric

I have run the Stored Procedure directly inside SQL Server Management Studio, and there is no error:

EXEC dbo.Upsert_Dim_Channel 'temp_Dim_Channel'

 

However, I need to run the same stored procedure inside Fabric with PySpark. I have the connection set up as below:

jdbc_url = "jdbc:sqlserver://myserver.database.windows.net:1433;database=mydatabase"
connection_properties = {
    "user": "whoami",
    "password": "seeifyouknowme",
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver"
}
upsert_query = "EXEC dbo.Upsert_Dim_Channel 'temp_Dim_Channel'"
spark.read.format("jdbc").options(
    url=jdbc_url,
    query=upsert_query,
    user=connection_properties["user"],
    password=connection_properties["password"],
    driver=connection_properties["driver"],
).load()
The above fails with the error below:
Py4JJavaError: An error occurred while calling o12036.load. : com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near the keyword 'EXEC'.
Any help would be much appreciated.
1 ACCEPTED SOLUTION
v-karpurapud
Community Support

Hi @tan_thiamhuat 


Thank you for reaching out to the Microsoft Fabric Community Forum.

 

The error Incorrect syntax near the keyword 'EXEC' occurs because PySpark’s JDBC connector does not directly support executing stored procedures using the EXEC or EXECUTE statement in the query option.
 

The spark.read.jdbc method’s query option is designed for SQL queries that return a result set. When you pass EXEC dbo.Upsert_Dim_Channel 'temp_Dim_Channel', Spark wraps it in a subquery like SELECT * FROM (EXEC ...) internally, which is invalid SQL syntax for SQL Server, leading to the Incorrect syntax near 'EXEC' error.
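To make the wrapping concrete, here is a small illustration. The subquery alias SPARK_GEN_SUBQ_0 is an assumption about the connector's generated name; the exact alias may differ, but the shape of the generated statement is the point:

```python
# Illustration only: how the JDBC reader turns the "query" option into a
# derived-table subquery before sending it to SQL Server.
upsert_query = "EXEC dbo.Upsert_Dim_Channel 'temp_Dim_Channel'"

# Roughly the statement the connector generates (alias name is illustrative):
wrapped = f"SELECT * FROM ({upsert_query}) SPARK_GEN_SUBQ_0"
print(wrapped)
# SQL Server rejects EXEC inside a derived table, hence
# "Incorrect syntax near the keyword 'EXEC'".
```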

 

The stored procedure dbo.Upsert_Dim_Channel works well in SQL Server Management Studio (SSMS) because SSMS understands the EXEC syntax. However, in PySpark, the JDBC driver expects a tabular result, which can cause syntax errors for non-query statements like EXEC. Also, the stored procedure might not return a result set, which PySpark’s spark.read.jdbc expects by default.

 

To execute the stored procedure dbo.Upsert_Dim_Channel in Microsoft Fabric using PySpark, you need to bypass the spark.read.jdbc limitation by using the JDBC driver’s native Java SQL capabilities (via java.sql.DriverManager) to execute the stored procedure directly.

If this response resolves your query, kindly mark it as Accepted Solution to help other community members. A Kudos is also appreciated if you found the response helpful.

 

Thank You!


tan_thiamhuat
Post Patron

Thank you very much v-karpurapud. Your solution of using java.sql.DriverManager works exactly as I need, and the stored procedure now runs. I provide the code below for whoever may find it useful:

 

# JDBC connection properties
jdbc_url = "jdbc:sqlserver://yourserver.database.windows.net:1433;database=YourDatabase"
jdbc_user = "whoami"
jdbc_password = "WhyUneed2know?"
jdbc_driver = "com.microsoft.sqlserver.jdbc.SQLServerDriver"

# Load the driver class
java_class = sc._jvm.java.lang.Class
java_class.forName(jdbc_driver)

# Get a connection using DriverManager
conn = sc._jvm.java.sql.DriverManager.getConnection(jdbc_url, jdbc_user, jdbc_password)

# Prepare the stored procedure call with a single parameter
callable_stmt = conn.prepareCall("{call dbo.Upsert_Dim_Channel(?)}")

# Set the parameter value
callable_stmt.setString(1, "temp_Dim_Channel")

# Execute the stored procedure
callable_stmt.execute()

# Clean up
callable_stmt.close()
conn.close()
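If the procedure name or parameter count varies, the JDBC escape-syntax string passed to prepareCall can be built with a small helper. This is a hypothetical convenience function, not part of the JDBC driver:

```python
def jdbc_call_syntax(proc_name: str, n_params: int) -> str:
    """Build the JDBC escape syntax for calling a stored procedure,
    e.g. "{call dbo.Upsert_Dim_Channel(?)}" for one parameter."""
    placeholders = ", ".join(["?"] * n_params)
    return f"{{call {proc_name}({placeholders})}}"

# The statement used above:
print(jdbc_call_syntax("dbo.Upsert_Dim_Channel", 1))  # {call dbo.Upsert_Dim_Channel(?)}
```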

 

