mrojze
Helper II

How to dynamically generate a spark.sql query using parameters?

I am new to Spark.

I am trying to insert a row into a Delta table in my lakehouse with parameters sent by an ADF pipeline.

I am having a few issues, but the main one is that I can't figure out how to insert a row into a Delta table using dynamic SQL.

 

(screenshot)

 

I'm not sure how to use the parameters either....
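
For context, roughly what I have in mind is the sketch below, with placeholder names rather than my real table and columns. As far as I understand, a cell toggled as the notebook's parameter cell receives the values the pipeline sends, and the INSERT would then have to be built from those variables somehow:

# Parameter cell (toggled as a parameter cell in the notebook).
# Defaults only; the pipeline's base parameters should overwrite them at run time.
# All names here are placeholders.
table_name = "my_delta_table"
column1 = "value1"
column2 = "value2"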

1 ACCEPTED SOLUTION
Anonymous
Not applicable

Hi @mrojze,
To insert a row into a Delta table using dynamic SQL, you can use the spark.sql method in Scala or Python. Here's an example in Scala: 

// Table name and values to insert; in practice these could come from pipeline parameters
val myTable = "myDeltaTable"
val column1 = "value1"
val column2 = "value2"
// Build the INSERT statement with string interpolation and run it
val sqlQuery = s"INSERT INTO $myTable VALUES ('$column1', '$column2')"
spark.sql(sqlQuery)

In this example, we define the name of the Delta table myTable and the values for the two columns column1 and column2. We then use string interpolation to dynamically generate an SQL query that inserts a row into the Delta table. Finally, we pass the SQL query to the spark.sql method to execute it.

 

 Here's an example in Python:

 

# Table name and values to insert; in practice these could come from pipeline parameters
myTable = "myDeltaTable"
column1 = "value1"
column2 = "value2"
# Build the INSERT statement with str.format and run it
sqlQuery = "INSERT INTO {} VALUES ('{}', '{}')".format(myTable, column1, column2)
spark.sql(sqlQuery)


In this example, we define the name of the Delta table myTable and the values for the two columns column1 and column2. We then use placeholders {} in the SQL query and pass the parameter values as arguments to the format method. Finally, we pass the SQL query to the spark.sql method to execute it.
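
One caveat worth noting: plain string interpolation works for trusted values, but it breaks if a value contains a single quote. If your environment runs Spark 3.4 or later (an assumption about your runtime), spark.sql can also take the values as named parameters instead; a minimal sketch:

# Minimal sketch, assuming Spark 3.4+ where spark.sql accepts named parameter markers
myTable = "myDeltaTable"
# Only literal values can be parameter markers; the table name still needs interpolation
sqlQuery = f"INSERT INTO {myTable} VALUES (:c1, :c2)"
spark.sql(sqlQuery, args={"c1": "value1", "c2": "value2"})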

 

 

I have tried inserting data into my table named test_table_3 using pyspark commands. You can refer to the screenshots below.

 

(screenshots of the notebook cells)

 

Hope this helps. Please let us know if you have any further issues.
Thank you.

 


2 REPLIES 2
mrojze
Helper II

Thanks! This is so helpful.

I ended up creating a dataframe and applying a merge. Your solution works great, but I still need to learn the basics of PySpark. If I want to do a conditional update, I need to learn how to do an IF statement 🙂

Your solution doesn't work with a MERGE, though, does it?
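
That said, maybe something like the rough, untested sketch below would work? (Placeholder table and column names, not my real schema; the WHEN MATCHED AND condition would play the role of the IF.)

# Rough, untested sketch: a dynamically built MERGE run via spark.sql.
# my_delta_table, id and value are placeholder names.
target = "my_delta_table"
new_id = 1
new_value = "updated"

merge_sql = f"""
MERGE INTO {target} AS t
USING (SELECT {new_id} AS id, '{new_value}' AS value) AS s
  ON t.id = s.id
WHEN MATCHED AND t.value <> s.value THEN UPDATE SET t.value = s.value
WHEN NOT MATCHED THEN INSERT (id, value) VALUES (s.id, s.value)
"""
spark.sql(merge_sql)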

 

(screenshot)

 
