When creating a new lakehouse, check the "Lakehouse schemas" option.
Create a new PySpark notebook attached to that lakehouse.
From the notebook, how do I access a Delta table in another lakehouse (both lakehouses are in the same workspace)?
The following code used to work fine, but once the default lakehouse has the "schemas" preview feature enabled, it no longer works:
df = spark.table("another_lakehouse_name.table_name")
Found out the cause: the schemas feature also needs to be turned on for the other lakehouse.
I did some testing.
My simplified conclusion: when a notebook is connected to multiple lakehouses (some with schemas, some without), make the DEFAULT lakehouse one that has schemas enabled.
Then I can refer to all the others as workspace_name.lakehouse_name.schema_name.table_name.
Note that lakehouses without schemas use dbo as a placeholder for their schema, but it still seems to be required at the moment.
Setting the default lakehouse to one without schemas is what causes all the trouble.
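In case it helps, a minimal sketch of that setup (the workspace, lakehouse, schema, and table names here are placeholders):
# Default lakehouse attached to the notebook has schemas enabled.
# Schema-enabled lakehouse: four-part name workspace.lakehouse.schema.table
df_a = spark.table("my_workspace.schema_lh.my_schema.my_table")
# Schema-less lakehouse: dbo acts as the placeholder schema but is still required.
df_b = spark.table("my_workspace.plain_lh.dbo.my_table")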
Hello @nilendraFabric
df = spark.table("your_workspace_name.another_lakehouse_name.schema_name.table_name")
I got the error: AnalysisException: [REQUIRES_SINGLE_PART_NAMESPACE] spark_catalog requires a single-part namespace.
Hello @yongshao
To access Delta tables across lakehouses when the schemas feature is enabled in Fabric notebooks, you must use fully qualified four-part names:
df = spark.table("your_workspace_name.another_lakehouse_name.schema_name.table_name")
If this helps, please accept the answer.