Why does the Lakehouse Explorer display table names in all lowercase, while the SQL Analytics Endpoint Explorer shows them in the intended letter case? Whenever I create notebooks that use the spark.catalog function, the table names are returned in lowercase instead of their original mixed-case format. How can I fix this? Any assistance would be greatly appreciated. Thank you in advance.
Solved! Go to Solution.
Hello @Jher12
I replicated your scenario and I'm seeing the same behavior. Spark internally normalizes table identifiers to lowercase unless you quote them explicitly (with backticks), because Spark handles table identifiers internally. I would recommend creating the tables with quoted names; after that you can use Spark SQL to query them.
I believe this behavior is not specific to Microsoft Fabric; it applies across Spark environments in general.
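To make the quoting rule concrete, here is a minimal pure-Python illustration (not Spark itself) of how unquoted identifiers get folded to lowercase while backtick-quoted ones keep their casing; `normalize_identifier` is a hypothetical helper written for this sketch:

```python
def normalize_identifier(name: str) -> str:
    """Mimic how an unquoted identifier is folded to lowercase,
    while a backtick-quoted identifier keeps its original casing."""
    if len(name) >= 2 and name.startswith("`") and name.endswith("`"):
        return name[1:-1]  # quoted: casing preserved, quotes stripped
    return name.lower()    # unquoted: normalized to lowercase

print(normalize_identifier("SalesData"))    # salesdata
print(normalize_identifier("`SalesData`"))  # SalesData
```

This is why a table created as `SalesData` without quotes shows up as `salesdata` in the Lakehouse Explorer and in `spark.catalog` results.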
Please let me know if it helps you.
Thank you!!
Did I answer your question? Mark my post as a solution!
Proud to be a Super User!
Hi @Jher12,
Thank you for reaching out to the Microsoft Fabric Community Forum. Also, thanks to @suparnababu8 for his inputs on this thread.
Has your issue been resolved? If the response provided by the community member @suparnababu8 addressed your query, could you please confirm? It helps us ensure that the solutions provided are effective and beneficial for everyone.
Hope this helps clarify things. Let me know what you find after giving these steps a try; happy to help you investigate further.
Thank you for using the Microsoft Community Forum.
Yes
Hi,
Thank you for the response. Is there any way to rename the tables without dropping and recreating them with quotes?
I don't think there is a way to rename them in place. But don't drop your existing table; create a dummy table first and try it once.
Thank you!!
Did I answer your question? Mark my post as a solution!
Proud to be a Super User!
Thank you @suparnababu8, your quick response helped guide me to a solution. I was able to resolve the issue by recreating the tables, loading them from another lakehouse, and ensuring the table names retained their original casing by using the code below.
Code:
# Write the DataFrame out as Delta files, then register the table with a
# backtick-quoted name so its original casing is preserved in the catalog
df.write.format("delta").mode("overwrite").save(path)
spark.sql(f"CREATE TABLE `{table_name}` USING DELTA LOCATION '{path}'")
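For anyone adapting this pattern, the key detail is building the DDL string with the table name backtick-quoted. A small sketch of that step, with `register_delta_table_sql` as a hypothetical helper (it also escapes any embedded backticks, a standard precaution when quoting identifiers):

```python
def register_delta_table_sql(table_name: str, path: str) -> str:
    """Build a CREATE TABLE statement with a backtick-quoted name,
    so the table's original mixed casing is kept."""
    escaped = table_name.replace("`", "``")  # escape embedded backticks
    return f"CREATE TABLE `{escaped}` USING DELTA LOCATION '{path}'"

ddl = register_delta_table_sql("SalesByRegion", "Tables/SalesByRegion")
print(ddl)
# CREATE TABLE `SalesByRegion` USING DELTA LOCATION 'Tables/SalesByRegion'
```

You would then pass the resulting string to `spark.sql(ddl)` exactly as in the snippet above.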