I have a notebook (NB) attached to a lakehouse, and I am executing the following, which works fine:
%%pyspark
from delta.tables import *
tbl_name = "fact_sales"
tbl_path = "Tables/"+tbl_name
delta_table = DeltaTable.forPath(spark, tbl_path)
When I use Save as Copy and then try to run the copied NB, it fails.
It shows:
AnalysisException: Tables/fact_sales is not a Delta table
which is absolutely false. Why does it fail in the copied notebook (the copy still shows the NB is attached to the same lakehouse as the original) when it executes perfectly in the original?
This worked:
tbl_name = "fact_sales"
tbl_name_path = "Tables/"+tbl_name
#delta_table = DeltaTable.forPath(spark, tbl_name_path)
delta_table = DeltaTable.forName(spark, tbl_name)
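The switch from `forPath` to `forName` works because a name-based lookup goes through the metastore, while a relative `Tables/...` path is resolved against whatever base the Spark session supplies (the pinned default lakehouse), which may differ in a copied notebook. A minimal plain-Python sketch of the two lookup styles (the dict-based "metastore" and the base paths are illustrative assumptions, not Fabric internals):

```python
from pathlib import PurePosixPath

# Illustrative stand-in for a metastore: table name -> registered location.
metastore = {"fact_sales": "/lakehouse/default/Tables/fact_sales"}

def for_name(tbl_name):
    # Name-based lookup: independent of any working directory or mount.
    return metastore[tbl_name]

def for_path(path, base):
    # Path-based lookup: a relative path is resolved against a base that
    # the session supplies, so the result changes if the base changes.
    return str(PurePosixPath(base) / path)

rel = "Tables/fact_sales"
print(for_name("fact_sales"))               # always the registered location
print(for_path(rel, "/lakehouse/default"))  # matches only if the base is right
print(for_path(rel, "/some/other/mount"))   # a copied session's base may differ
```

The same relative path yields different locations under different bases, which is consistent with the copied notebook raising "not a Delta table" even though the table exists.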
Hi @smpa01,
How did you configure the default lakehouse and environment settings? Can you share some more detail about your operations?
I copied your code and used the 'Save as copy' feature to create a notebook, and both notebooks work well. (I pinned the default lakehouse and confirmed the delta table exists at the Tables level.)
Regards,
Xiaoxin Sheng