I have a notebook attached to a lakehouse, and the following code executes fine:
%%pyspark
from delta.tables import DeltaTable

tbl_name = "fact_sales"
tbl_path = "Tables/" + tbl_name
# Load the table via its relative path under the attached lakehouse
delta_table = DeltaTable.forPath(spark, tbl_path)
When I use Save as copy and then try to run the copied notebook, it fails with:
AnalysisException: Tables/fact_sales is not a Delta table
which is absolutely false. Why does it fail on the copied notebook (the copy still shows it is attached to the same lakehouse as the original) when it executes perfectly on the original?
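One way to narrow this down is to check whether the relative path actually resolves against the copied notebook's default lakehouse. A minimal diagnostic sketch, using the table and path names from the question:

%%pyspark
from delta.tables import DeltaTable

tbl_name = "fact_sales"
tbl_path = "Tables/" + tbl_name

# Does the relative path resolve to a Delta table from this notebook?
print(DeltaTable.isDeltaTable(spark, tbl_path))

# Is the table registered in the catalog of the default lakehouse?
print(spark.catalog.tableExists(tbl_name))

If the first check prints False while the second prints True, the copied notebook is not resolving relative paths against the lakehouse you expect, even though the UI shows the same attachment.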
This worked:

tbl_name = "fact_sales"
tbl_name_path = "Tables/" + tbl_name
# delta_table = DeltaTable.forPath(spark, tbl_name_path)  # failed on the copied notebook
delta_table = DeltaTable.forName(spark, tbl_name)
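DeltaTable.forName resolves the table through the catalog of the pinned default lakehouse rather than through a relative file path, which is why it survives the copy. If you do need forPath (for example, to read a table from a lakehouse that is not the default), an absolute OneLake path should also work. A sketch, where <workspace> and <lakehouse> are placeholders you would substitute with your own names:

%%pyspark
from delta.tables import DeltaTable

# <workspace> and <lakehouse> are placeholders, not values from this thread
abs_path = (
    "abfss://<workspace>@onelake.dfs.fabric.microsoft.com/"
    "<lakehouse>.Lakehouse/Tables/fact_sales"
)
delta_table = DeltaTable.forPath(spark, abs_path)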
Hi @smpa01,
How did you configure the default lakehouse and environment settings? Can you share some more detail about your operations?
I copied your code and used the 'Save as copy' feature to create a second notebook, and both notebooks worked fine. (I pinned the default lakehouse and confirmed that the delta table exists at the Tables level.)
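To confirm which lakehouse a session is actually pinned to, you can inspect the Spark session configs. A sketch assuming the Fabric runtime's trident.* config keys (these key names are an assumption, not something confirmed in this thread):

%%pyspark
# NOTE: the config keys below are assumed names for the Fabric runtime;
# adjust if your runtime exposes different keys.
print(spark.conf.get("trident.lakehouse.name", "not set"))
print(spark.conf.get("trident.workspace.id", "not set"))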
Regards,
Xiaoxin Sheng