All of a sudden, as of almost two weeks ago, our Spark SQL notebooks have been failing due to case sensitivity in table and field names. This always worked fine in Synapse, so I'm not sure whether Fabric configs were changed.
I'm getting errors like "Table or view not found: customers" when I run a Spark SQL query on the customers table.
Hi @12angrymentiger
Apologies for the issue you have been facing. At this time, we are reaching out to the internal team to get some help on this. We will update you once we hear back from them.
A simple workaround to fix this issue is to run the below piece of code before executing the main code.
spark.conf.set('spark.sql.caseSensitive', False)
Hope this helps. Please let me know if you have any further questions. Glad to help.
Hi. I am experiencing a similar issue when running a SQL query via the SQL endpoint on some tables I have in a lakehouse within Microsoft Fabric. To my knowledge SQL isn't a case-sensitive language, so I'm not sure why I'm having issues on the Fabric endpoint when my queries run fine in apps such as SSMS.
Can anyone help?
@v-nikhilan-msft I wanted to check with you regarding this issue.
Please let me know if you have heard of any fixes for this.
Thanks
Hi @12angrymentiger
Thanks for using Fabric Community. Apologies for the issue you have been facing.
Can you please confirm which Spark version you are using?
I have confirmed with the internal team that no changes were made to case sensitivity. Did you set any Spark configurations? If yes, can you please share the details?
Thanks.
Hi @12angrymentiger ,
I tried to repro the scenario, but did not get any error. I have attached the screenshot.
Please check the Spark runtime version you are using.
Make sure the workspace uses Runtime 1.2.
Hope this helps. Please let me know if you have any further queries.
By the way this is still in My Workspace and under the Fabric Trial, but I did change the workspace to use Runtime 1.2 and I'm still getting the error table or view not found.
We figured out that when you call another notebook and then use Spark SQL further down in the main notebook, spark.sql.caseSensitive is set back to True.
We moved this line to run after the second notebook call, to make sure Spark SQL in the main notebook stays case-insensitive:
spark.conf.set('spark.sql.caseSensitive', False)