Hello,
I am trying to execute the following code in a Fabric notebook (PySpark cell):
sql_string = """
INSERT INTO dummy_Table (Schema_Name, Table_Name, SQL_Statement)
('schema1', 'table1', "SELECT col1, col2, col3,col4,col5,col6, col5 || ''_'' || col6 as con_col FROM schema1.table1"))
spark.sql(sql_string)
but I am getting an error around con_col. I tried single quotes, doubled single quotes, triple quotes, and a double quote, but none of them works.
Any suggestions or workarounds are highly appreciated.
Thanks
Try either of these:
sql_string = """
INSERT INTO dummy_Table (Schema_Name, Table_Name, SQL_Statement)
VALUES ('schema1', 'table1', "SELECT col1, col2, col3, col4, col5, col6, col5 || '_' || col6 as con_col FROM schema1.table1")"""
spark.sql(sql_string)
sql_string = """
INSERT INTO dummy_Table (Schema_Name, Table_Name, SQL_Statement)
VALUES ('schema1', 'table1', 'SELECT col1, col2, col3, col4, col5, col6, col5 || ''_'' || col6 as con_col FROM schema1.table1')"""
spark.sql(sql_string)
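If you prefer not to hand-count quotes, a variation on the first option is to build the inner SELECT in its own Python variable and interpolate it with an f-string. Because the stored statement ends up wrapped in double quotes on the Spark SQL side (treated as a string literal under the default parser settings), the single quotes around '_' need no escaping at all. A minimal sketch, reusing the placeholder table and column names from the question:

# Sketch only: dummy_Table, schema1 and table1 are the placeholders from the question.
inner_sql = (
    "SELECT col1, col2, col3, col4, col5, col6, "
    "col5 || '_' || col6 AS con_col FROM schema1.table1"
)

# The interpolated statement is wrapped in double quotes in the SQL text,
# so the single quotes inside it don't conflict with the Python quoting.
sql_string = f"""
INSERT INTO dummy_Table (Schema_Name, Table_Name, SQL_Statement)
VALUES ('schema1', 'table1', "{inner_sql}")
"""
spark.sql(sql_string)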
Hi @mkjit256,
Thanks for the reply from SachinNandanwar.
Please try this code; it works for me:
Replace the schema name, table name, and column name with your own.
sql_string = """
INSERT INTO dummy_Table (Schema_Name, Table_Name, SQL_Statement)
VALUES ('dbo', 'products', "SELECT ProductID, ProductName, Category, ListPrice, Date, Month, Date || '_' || Month as con_col FROM dbo.products")
"""
spark.sql(sql_string)
Note that the VALUES keyword is required here; the row of values can't follow the column list directly.
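Another option, if your notebook is on Fabric Runtime 1.3 (Spark 3.5) or later, is to sidestep the quoting problem entirely with spark.sql's parameter markers, where the statement text is bound as a value instead of being spliced into the SQL string. The sketch below is only illustrative: inner_sql and the parameter names are ones I made up, and you should verify that parameter markers are accepted in a VALUES list on your runtime.

# Sketch: binds the inner SELECT as a parameter, so no quote escaping is needed.
# dummy_Table / schema1 / table1 are the placeholders from the question.
inner_sql = "SELECT col1, col2, col5 || '_' || col6 AS con_col FROM schema1.table1"

spark.sql(
    """
    INSERT INTO dummy_Table (Schema_Name, Table_Name, SQL_Statement)
    VALUES (:schema_name, :table_name, :sql_stmt)
    """,
    args={"schema_name": "schema1", "table_name": "table1", "sql_stmt": inner_sql},
)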
If you have any other questions please feel free to contact me.
Best Regards,
Yang
Community Support Team
If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!