dbeavon3
Memorable Member

Cannot use spark.catalog.listTables in notebook

I am getting errors when calling spark.catalog.listTables() in Fabric, with a default lakehouse connected.

The error is like so:

[PARSE_SYNTAX_ERROR] Syntax error at or near end of input.(line 1, pos 0)

== SQL ==

^^^

Runtime 1.3
Spark 3.5
Delta 3.2
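For reference, the failing call is just the no-argument form (a sketch; spark is the SparkSession object the Fabric runtime provides, and nothing else runs in the cell):

# Fabric notebook cell with the default lakehouse attached.
# This call alone raises the PARSE_SYNTAX_ERROR shown above.
spark.catalog.listTables()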

I did not enroll in any preview features, as far as I'm aware. Here is the full screenshot.

dbeavon3_1-1754078513187.png

According to the docs, this functionality is supposed to be GA.
Why is there such a basic and incoherent error when listing tables?

dbeavon3_2-1754078603637.png

https://learn.microsoft.com/en-us/fabric/data-engineering/runtime

7 REPLIES
v-pnaroju-msft
Community Support

Hi dbeavon3,

We are following up to see if what we shared solved your issue. If you need more support, please reach out to the Microsoft Fabric community.

Thank you.

v-pnaroju-msft
Community Support

Hi dbeavon3,

We would like to follow up and see whether the details we shared have resolved your problem.
If you need any more assistance, please feel free to connect with the Microsoft Fabric community.

Thank you.

v-pnaroju-msft
Community Support

Thank you, @BhaveshPatel, for your response.

Hi dbeavon3,

We appreciate your inquiry submitted through the Microsoft Fabric Community Forum.

As per my understanding, the error “[PARSE_SYNTAX_ERROR] Syntax error at or near end of input” may occur if the default Lakehouse is not attached, if the Lakehouse is schema-enabled (a known limitation with certain Spark catalog APIs), or due to the parsing behavior of Python 3.11 and Delta 3.2 generating invalid SQL for spark.catalog.listTables().

Kindly follow the steps below to resolve the issue (a combined sketch follows the list):

  1. Check the default database using spark.catalog.currentDatabase(). If this returns an error or shows "default" with no tables, you will need to specify your Lakehouse name explicitly.
  2. List the available databases using spark.catalog.listDatabases(). Confirm that your Lakehouse name is present in the list.
  3. Explicitly list tables from the Lakehouse using spark.catalog.listTables("YourLakehouseName"), replacing YourLakehouseName with the actual name of your Lakehouse.
  4. Ensure that the notebook has the correct default Lakehouse attached.
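A minimal sketch combining steps 1-3 (YourLakehouseName is a placeholder for the actual Lakehouse name):

# Step 1: confirm which database the session currently points at.
print(spark.catalog.currentDatabase())

# Step 2: confirm the Lakehouse appears among the available databases.
for db in spark.catalog.listDatabases():
    print(db.name)

# Step 3: list tables from that Lakehouse explicitly.
for t in spark.catalog.listTables("YourLakehouseName"):
    print(t.name, t.tableType)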

Please find the attached screenshot regarding the schema-enabled Lakehouse limitation:

vpnarojumsft_0-1754419641361.png

Additionally, kindly refer to the following link for more information:
Lakehouse schemas (Preview) - Microsoft Fabric | Microsoft Learn

We hope that the above information will assist in resolving the issue. Should you have any further queries, please feel free to contact the Microsoft Fabric Community.

Thank you.

BhaveshPatel
Community Champion

The tables are still visible in my Spark environment, so this looks like a permissions issue on your side.

Use PySpark and notebooks together.

BhaveshPatel_0-1754294080925.png

Thanks & Regards,
Bhavesh

Love the Self Service BI.
Please use the 'Mark as answer' link to mark a post that answers your question. If you find a reply helpful, please remember to give Kudos.
BhaveshPatel
Community Champion

There is definitely a permissions issue in your Spark environment. I can see the tables in spark.catalog.listTables(). It has been available since Delta Lake 2.2.0 and is part of the Spark catalog API.
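For example, this loop simply prints the tables in my notebook, as in the screenshot below (a sketch; the attribute names come from PySpark's Table objects):

# In a working environment, listTables() returns Table objects
# rather than raising PARSE_SYNTAX_ERROR.
for t in spark.catalog.listTables():
    print(t.name, t.tableType, t.isTemporary)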

BhaveshPatel_0-1754213353697.png

Thanks & Regards,
Bhavesh

Love the Self Service BI.
Please use the 'Mark as answer' link to mark a post that answers your question. If you find a reply helpful, please remember to give Kudos.

dbeavon3
Memorable Member

The error message does not say anything about permissions. It says:

PARSE_SYNTAX_ERROR

This bug was reported in the community before. The last time it was reported, users were told to upgrade to a GA version of Delta. I'm on GA for all versions.

If you have a command to prove or disprove your theory about permissions, please let me know. There are no clues on my end to support that theory. I can use my default lakehouse fine, and I can run almost every type of Spark operation, except that certain "catalog" operations from PySpark fail, as the sketch below illustrates.
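To illustrate the asymmetry, a sketch (the table name is a hypothetical stand-in for one of my lakehouse tables):

spark.sql("SELECT 1").show()            # plain Spark SQL works
spark.read.table("my_table").count()    # reading lakehouse tables works ("my_table" is a stand-in)
spark.catalog.listTables()              # this catalog call fails with PARSE_SYNTAX_ERROR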

This may be a problem in the Delta project. I found the following:

https://github.com/delta-io/delta/issues/2610

 
