nrogers
New Member

Spark SQL can't read lakehouse tables in a schema with an uppercase name

We created a lakehouse with schemas enabled and noticed that we were getting exceptions in a Notebook while trying to read from some schemas but not others.  All of the schemas worked normally elsewhere, and the notebook could read from any schema if we used spark.read instead of spark.sql.

 

The pattern seems to be that only schemas with lowercase names work with spark.sql. I verified this by creating two schema shortcuts to the same schema, with the only difference being that one shortcut was named in all uppercase and the other in all lowercase. The former gave exceptions from spark.sql while the latter worked normally.
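For illustration, here is roughly the comparison (a minimal sketch: the table name comes from the traceback below, the path-based read assumes Faculty_Success is the notebook's default lakehouse, and the Tables/<schema>/<table> folder layout is my assumption about how the schema appears in OneLake):

# Path-based read: bypasses the SQL catalog lookup and works regardless of the schema name's casing.
df = spark.read.format("delta").load("Tables/ODS_UC/TTSV_ACAD_PERIOD_CURRENT_LOV")
display(df.limit(1000))

# Catalog-based read via Spark SQL: this is the call that fails with TABLE_OR_VIEW_NOT_FOUND
# when the schema name is uppercase.
df = spark.sql("SELECT * FROM Faculty_Success.ODS_UC.TTSV_ACAD_PERIOD_CURRENT_LOV LIMIT 1000")
display(df)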

 

This is what the exceptions look like, using the code snippet generated by dragging a table from the left pane into the notebook.

AnalysisException                         Traceback (most recent call last)
Cell In[8], line 1
----> 1 df = spark.sql('SELECT * FROM Faculty_Success.ODS_UC.TTSV_ACAD_PERIOD_CURRENT_LOV LIMIT 1000')
      2 display(df)

File /opt/spark/python/lib/pyspark.zip/pyspark/sql/session.py:1631, in SparkSession.sql(self, sqlQuery, args, **kwargs)
   1627         assert self._jvm is not None
   1628         litArgs = self._jvm.PythonUtils.toArray(
   1629             [_to_java_column(lit(v)) for v in (args or [])]
   1630         )
-> 1631     return DataFrame(self._jsparkSession.sql(sqlQuery, litArgs), self)
   1632 finally:
   1633     if len(kwargs) > 0:

File ~/cluster-env/trident_env/lib/python3.11/site-packages/py4j/java_gateway.py:1322, in JavaMember.__call__(self, *args)
   1316 command = proto.CALL_COMMAND_NAME +\
   1317     self.command_header +\
   1318     args_command +\
   1319     proto.END_COMMAND_PART
   1321 answer = self.gateway_client.send_command(command)
-> 1322 return_value = get_return_value(
   1323     answer, self.gateway_client, self.target_id, self.name)
   1325 for temp_arg in temp_args:
   1326     if hasattr(temp_arg, "_detach"):

File /opt/spark/python/lib/pyspark.zip/pyspark/errors/exceptions/captured.py:185, in capture_sql_exception.<locals>.deco(*a, **kw)
    181 converted = convert_exception(e.java_exception)
    182 if not isinstance(converted, UnknownException):
    183     # Hide where the exception came from that shows a non-Pythonic
    184     # JVM exception message.
--> 185     raise converted from None
    186 else:
    187     raise

AnalysisException: [TABLE_OR_VIEW_NOT_FOUND] The table or view `Faculty_Success`.`ODS_UC`.`TTSV_ACAD_PERIOD_CURRENT_LOV` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog.
To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS.; line 1 pos 14;
'GlobalLimit 1000
+- 'LocalLimit 1000
   +- 'Project [*]
      +- 'UnresolvedRelation [Faculty_Success, ODS_UC, TTSV_ACAD_PERIOD_CURRENT_LOV], [], false

 

1 ACCEPTED SOLUTION
logmil
Regular Visitor

Put this at the top of the notebook:

 

spark.conf.set("spark.sql.caseSensitive", "true")
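For context, a quick sketch of how this fits with the failing query from the original traceback (run the setting before the query that fails):

# Make the analyzer resolve identifiers case-sensitively, so the uppercase schema name is found.
spark.conf.set("spark.sql.caseSensitive", "true")

# The same query from the original post should now resolve.
df = spark.sql("SELECT * FROM Faculty_Success.ODS_UC.TTSV_ACAD_PERIOD_CURRENT_LOV LIMIT 1000")
display(df)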


4 REPLIES
Kesahli
Frequent Visitor

I know it's been a while, but I have just come across this same issue, and logmil's suggestion of adding the statement below to the notebook worked.

 

spark.conf.set("spark.sql.caseSensitive", "true")

Thank you for the feedback! Thanks to @logmil for providing the suggestion!

 

logmil
Regular Visitor

Put this at the top of the notebook:

 

spark.conf.set("spark.sql.caseSensitive", "true")

v-jingzhan-msft
Community Support

Hi @nrogers,

 

I couldn't reproduce the same result; an uppercase schema name works well in my testing. I tested with a new schema, a new schema shortcut, and a new lakehouse, and all worked well.


What's your service version and data region?

My version is 13.0.24611.34 and my data region is UK South. Sometimes an issue like this occurs only in specific regions or on specific versions; if there is a fix, it may take some time to be deployed to all regions.

 

Best Regards,
Jing
Community Support Team
