Hello
I am trying to execute the following code in a Fabric notebook (PySpark cell):
sql_string = """
INSERT INTO dummy_Table (Schema_Name, Table_Name, SQL_Statement)
('schema1', 'table1', "SELECT col1, col2, col3,col4,col5,col6, col5 || ''_'' || col6 as con_col FROM schema1.table1"))
spark.sql(sql_string)
but I am getting an error around con_col. I have tried a single quote, two single quotes, three single quotes, and one double quote, but none of them works.
Any suggestions or workarounds are highly appreciated.
Thanks
Try either of these:
sql_string = """
INSERT INTO dummy_Table (Schema_Name, Table_Name, SQL_Statement)
VALUES ('schema1', 'table1', "SELECT col1, col2, col3, col4, col5, col6, col5 || '_' || col6 as con_col FROM schema1.table1")"""
spark.sql(sql_string)
sql_string = """
INSERT INTO dummy_Table (Schema_Name, Table_Name, SQL_Statement)
VALUES ('schema1', 'table1', 'SELECT col1, col2, col3, col4, col5, col6, col5 || ''_'' || col6 as con_col FROM schema1.table1')"""
spark.sql(sql_string)
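If the embedded statement changes often, manually doubling every single quote gets error-prone. A minimal sketch of escaping it programmatically before building the INSERT (plain Python; the `sql_quote` helper is hypothetical, not part of any library):

```python
def sql_quote(value: str) -> str:
    # Wrap the value in single quotes, doubling any embedded single
    # quotes so the result is a valid SQL string literal.
    return "'" + value.replace("'", "''") + "'"

# The statement to store, written naturally with ordinary quotes.
stmt = ("SELECT col1, col2, col3, col4, col5, col6, "
        "col5 || '_' || col6 as con_col FROM schema1.table1")

sql_string = (
    "INSERT INTO dummy_Table (Schema_Name, Table_Name, SQL_Statement) "
    f"VALUES ('schema1', 'table1', {sql_quote(stmt)})"
)
# In the notebook: spark.sql(sql_string)
```

This keeps the quoting logic in one place instead of spread through hand-edited literals.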
Hi @mkjit256 ,
Thanks to SachinNandanwar for the reply.
Please try this code; it works fine for me:
Replace the schema name, table name, and column name with your own.
sql_string = """
INSERT INTO dummy_Table (Schema_Name, Table_Name, SQL_Statement)
VALUES ('dbo', 'products', "SELECT ProductID, ProductName, Category, ListPrice, Date, Month, Date || '_' || Month as con_col FROM dbo.products")
"""
spark.sql(sql_string)
The VALUES keyword is used instead of inserting the values directly.
If you have any other questions please feel free to contact me.
Best Regards,
Yang
Community Support Team
If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!