I have a notebook (NB) attached to a lakehouse, and I am executing the following, which works fine:
%%pyspark
from delta.tables import DeltaTable

tbl_name = "fact_sales"
tbl_path = "Tables/" + tbl_name  # relative path, resolved against the default lakehouse
delta_table = DeltaTable.forPath(spark, tbl_path)
When I do Save as Copy and then try to run the copied NB, it fails.
It shows:
AnalysisException: Tables/fact_sales is not a Delta table, which is absolutely false.
Why does it fail on the copied notebook (the copy still shows the NB is attached to the same LH as the original) when it executes perfectly on the original?
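One way to check whether the copy actually resolves the relative path is to look for the table under the default lakehouse mount. A minimal diagnostic sketch, assuming Fabric's standard /lakehouse/default mount point:

import os

# The attached default lakehouse is mounted at /lakehouse/default.
# If these print False, the copied notebook has no usable default lakehouse,
# so the relative path "Tables/fact_sales" cannot resolve.
tbl_dir = "/lakehouse/default/Tables/fact_sales"
print(os.path.exists(tbl_dir))                               # table folder present?
print(os.path.exists(os.path.join(tbl_dir, "_delta_log")))   # Delta transaction log present?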
This worked:
from delta.tables import DeltaTable

tbl_name = "fact_sales"
tbl_name_path = "Tables/" + tbl_name
# delta_table = DeltaTable.forPath(spark, tbl_name_path)   # failed in the copied NB
delta_table = DeltaTable.forName(spark, tbl_name)           # resolve the table by name instead of by path
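If forPath is still required, one option is to bypass the default-lakehouse relative path entirely and point at the table's absolute OneLake URI. A sketch, where <workspace> and <lakehouse> are placeholders for your own workspace and lakehouse names:

from delta.tables import DeltaTable

# Absolute OneLake path; does not depend on which lakehouse is pinned as default.
abfss_path = ("abfss://<workspace>@onelake.dfs.fabric.microsoft.com"
              "/<lakehouse>.Lakehouse/Tables/fact_sales")
delta_table = DeltaTable.forPath(spark, abfss_path)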
Hi @smpa01,
How did you configure the default lakehouse and environment settings? Can you please share more detail about your operations?
I copied your code and used the 'Save as copy' feature to create a second notebook, and both notebooks work well. (I pinned the default lakehouse and confirmed that the delta table exists at the Tables level.)
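To confirm the pinned default lakehouse is what the copied notebook sees, a quick sketch that lists the tables Spark can resolve by name (fact_sales should appear if the lakehouse is attached correctly):

# Tables registered for the current default lakehouse; forName relies on this catalog.
for t in spark.catalog.listTables():
    print(t.name)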
Regards,
Xiaoxin Sheng