I am new to Spark.
I am trying to insert a row into a Delta table in my lakehouse with parameters sent by an ADF pipeline.
I am having a few issues, but the main one is that I can't figure out how to insert a row into a Delta table using dynamic SQL.
I'm not sure how to use the parameters either.
Hi @mrojze ,
To insert a row into a Delta table using dynamic SQL, you can use the spark.sql method in Scala or Python. Here's an example in Scala:
val myTable = "myDeltaTable"
val column1 = "value1"
val column2 = "value2"
val sqlQuery = s"INSERT INTO $myTable VALUES ('$column1', '$column2')"
spark.sql(sqlQuery)
In this example, we define the name of the Delta table myTable and the values for the two columns column1 and column2. We then use string interpolation to dynamically generate an SQL query that inserts a row into the Delta table. Finally, we pass the SQL query to the spark.sql method to execute it.
Here's an example in Python:
myTable = "myDeltaTable"
column1 = "value1"
column2 = "value2"
sqlQuery = "INSERT INTO {} VALUES ('{}', '{}')".format(myTable, column1, column2)
spark.sql(sqlQuery)
In this example, we define the name of the Delta table myTable and the values for the two columns column1 and column2. We then use placeholders {} in the SQL query and pass the parameter values as arguments to the format method. Finally, we pass the SQL query to the spark.sql method to execute it.
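If the values come from the pipeline, one option in a Fabric notebook is to declare them in a parameters cell; the base parameters set on the pipeline's Notebook activity then override those defaults at run time. A minimal sketch, assuming the pipeline sends parameters named myTable, column1 and column2 (these names are placeholders and must match what the pipeline actually passes):

# Parameters cell (toggle this cell as "parameters" in the notebook).
# Values sent by the pipeline's Notebook activity replace these defaults at run time.
myTable = "myDeltaTable"
column1 = "value1"
column2 = "value2"

# Later cells can then use the received values exactly as in the example above.
spark.sql("INSERT INTO {} VALUES ('{}', '{}')".format(myTable, column1, column2))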
I have tried inserting data into my table named test_table_3 using pyspark commands. You can refer to the screenshots below.
Hope this helps. Please let us know if you have any further issues.
Thank you.
Thanks! This is so helpful.
I ended up creating a DataFrame and applying a merge. Your solution works great, but I still need to learn the basics of pyspark. If I want to do a conditional update I need to learn how to write an IF statement 🙂
Your solution doesn't work with a MERGE, does it?
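For reference, the upsert I ended up with looks roughly like this sketch with the Delta Lake Python API. The table name, key column and value columns here are placeholders, and the new row would really come from the pipeline parameters:

from delta.tables import DeltaTable

# Source DataFrame holding the incoming row (placeholder column names).
new_row = spark.createDataFrame([(1, "value1", "value2")], ["id", "col1", "col2"])

# Target Delta table in the lakehouse.
target = DeltaTable.forName(spark, "myDeltaTable")

# Update the row when the key already exists, otherwise insert it.
(target.alias("t")
    .merge(new_row.alias("s"), "t.id = s.id")
    .whenMatchedUpdate(set={"col1": "s.col1", "col2": "s.col2"})
    .whenNotMatchedInsertAll()
    .execute())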