When creating a new lakehouse, check the "Lakehouse schemas" option, then create a new PySpark notebook with that lakehouse attached as the default. From the notebook, how do I access a Delta table in another lakehouse (both lakehouses are in the same workspace)?
The following code used to work fine, but once the default lakehouse has the "schemas" preview feature enabled, it no longer works:
df = spark.table("another_lakehouse_name.table_name")
Found the cause: the schemas feature must also be turned on in the other lakehouse.
I did some testing. My final, simplified conclusion: when a notebook is connected to multiple lakehouses (some with schemas, some without), make the DEFAULT lakehouse one that has schemas enabled. Then I can refer to all the others as WSNAME.LHNAME.SCHEMA.TABLE (see the sketch below). Note that lakehouses without schemas use dbo as a kind of placeholder schema, but it still seems to be required at the moment. Setting the DEFAULT lakehouse to one without schemas is what causes all the trouble.
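A minimal sketch of that setup; the workspace, lakehouse, schema, and table names below are hypothetical, and spark is the session the Fabric notebook runtime already provides:
# Default lakehouse is schema-enabled, so multi-part names resolve
df_a = spark.table("my_workspace.schema_lakehouse.my_schema.my_table")
# Lakehouse without schemas enabled: use dbo as the placeholder schema
df_b = spark.table("my_workspace.plain_lakehouse.dbo.my_table")
df_a.show()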
Hello @nilendraFabric,
df = spark.table("your_workspace_name.another_lakehouse_name.schema_name.table_name")
I tried this, but got the error: AnalysisException: [REQUIRES_SINGLE_PART_NAMESPACE] spark_catalog requires a single-part namespace.
Hello @yongshao,
To access Delta tables across lakehouses with the schemas feature enabled in Fabric notebooks, you must use fully qualified namespace references.
df = spark.table("your_workspace_name.another_lakehouse_name.schema_name.table_name")
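If you prefer SQL, the same fully qualified name should work through spark.sql as well; this is a sketch reusing the placeholder names above, and it assumes the default lakehouse is schema-enabled:
df = spark.sql("SELECT * FROM your_workspace_name.another_lakehouse_name.schema_name.table_name")
df.show()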
If this helps, please accept the answer.