I created a Lakehouse table using a dataflow, and it retains the camel casing I used for the table and column names. But if I try to do the same thing myself in a notebook using a CREATE TABLE statement, all the names end up lowercase. Why is that?
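For example, a statement like this (the table and column names here are only illustrative):

# Run in a Fabric notebook cell; 'spark' is the built-in SparkSession.
spark.sql("""
    CREATE TABLE SalesOrders (
        OrderId INT,
        CustomerName STRING
    ) USING DELTA
""")

# What I observe: the names come back as salesorders / orderid / customername.
spark.sql("DESCRIBE TABLE SalesOrders").show(truncate=False)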
Hi @aarongiust , @govindarajan_d
Spark is not case sensitive by default.
You can handle this by setting a Spark config on the SparkSession object named spark:
spark.conf.set('spark.sql.caseSensitive', True)
By default it is False (see the sketch below).
Hope this is helpful. Please let me know in case of further queries.
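A minimal sketch of how this could look in a notebook cell, assuming the built-in spark session; the CustomerSales table and its columns are placeholder names, and whether the casing is preserved end to end may still depend on the Lakehouse catalog:

# Enable case-sensitive identifier handling (the default is False).
spark.conf.set('spark.sql.caseSensitive', True)

# Create the table with the desired casing; names here are just examples.
spark.sql("""
    CREATE TABLE IF NOT EXISTS CustomerSales (
        OrderId INT,
        CustomerName STRING,
        OrderTotal DOUBLE
    ) USING DELTA
""")

# Check how the column names were stored.
spark.sql("DESCRIBE TABLE CustomerSales").show(truncate=False)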
Hi @aarongiust,
I tried it, but it produces sentence case for me.
Is there any other specific config that you are passing?