Changing the default lakehouse in a notebook doesn't seem possible; it requires re-initializing the PySpark session.
(This seems to be a frustrating issue that plenty of other users have discussed in the past, so I won't revisit it here.)
What I'm trying to do is avoid the need for a "default lakehouse" altogether. In the code below I'm saving a parquet file to the "Files" storage of a lakehouse, and also trying to register a "Table" for it in the "Tables" list. However, this doesn't appear to be possible: the parquet file is stored properly (under "Files"), but the request to register the table is ignored unless I run the notebook with a default lakehouse configured.
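Here is a minimal sketch of what I'm doing; the workspace/lakehouse names and the dataframe `df` are placeholders:

```python
# Placeholder workspace/lakehouse names; df is an existing Spark dataframe.
files_path = (
    "abfss://MyWorkspace@onelake.dfs.fabric.microsoft.com/"
    "MyLakehouse.Lakehouse/Files/my_data"
)

# Writing the parquet file to "Files" with an explicit abfss path works fine.
df.write.mode("overwrite").parquet(files_path)

# Registering a table for it is what fails: saveAsTable() only takes a table
# name, not an abfss location, so without a default lakehouse attached to the
# notebook nothing shows up under "Tables".
df.write.mode("overwrite").format("parquet").saveAsTable("my_table")
```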
Can someone tell me if there is a way to get this to work? Why doesn't "saveAsTable" allow me to specify the abfss location of the lakehouse? That would obviate the need to set a "default lakehouse".
Is there another mechanism to register an "external table" through a more direct API that is native to this Fabric environment? I think the problem is that "saveAsTable" is an API defined by Apache Spark, and Microsoft is trying to implement it in a way that is true to the spirit of Spark. But perhaps there is an entirely different API that simply tells the lakehouse about my parquet file and registers the table under "Tables". This really shouldn't be so hard.
@dbeavon3 In the Fabric architecture, a notebook must be bound to a lakehouse, so there is a default lakehouse attached to a notebook. As you correctly noticed, .saveAsTable() does not allow you to specify an abfss path to the table on OneLake, because it works with managed tables that are bound to the default lakehouse. If you have another lakehouse in your workspace, or in a different workspace, you can still save your data in Delta table format, but you'll need to use .save() with an abfss path, which will create an external table in the target lakehouse.
In the code example below you can see that I have two lakehouses in a workspace, with Lakehouselab set as the default for the notebook. However, I can successfully save a dataframe into another, non-default lakehouse with this technique.
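A minimal sketch of that write, assuming an existing dataframe `df` and illustrative workspace/lakehouse names:

```python
# Write a Delta table into a NON-default lakehouse by giving .save() the full
# abfss path. Only .save() accepts an explicit OneLake location;
# .saveAsTable() does not.
target_path = (
    "abfss://MyWorkspace@onelake.dfs.fabric.microsoft.com/"
    "SecondLakehouse.Lakehouse/Tables/sales"
)

df.write.format("delta").mode("overwrite").save(target_path)
```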
Unfortunately, you won't be able to use this external table in a lakehouse because external tables are currently not supported in Fabric (see Solved: Re: External Tables in Fabric - Microsoft Fabric Community), so essentially this entire exercise is useless. In a nutshell, external Delta tables are not needed in Fabric because they can be replaced with a shortcut to an external data source (a rough sketch of creating one programmatically follows below).
However, your original question was about the default lakehouse in a notebook, and the answer is yes, a default lakehouse is absolutely necessary in Fabric. The default lakehouse can be reconfigured in a deployment pipeline when a notebook is moved to another workspace.
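If you go the shortcut route, a shortcut can be created from the lakehouse UI, or programmatically via the Fabric REST API's Create Shortcut endpoint. Here is a rough sketch; the IDs, token audience, and request-body shape are assumptions to verify against the current OneLake Shortcuts documentation:

```python
import requests
import notebookutils  # built into Fabric notebooks

# Assumed IDs and payload -- check the "OneLake Shortcuts - Create Shortcut" docs.
workspace_id = "<target-workspace-guid>"   # workspace of the lakehouse receiving the shortcut
lakehouse_id = "<target-lakehouse-guid>"
token = notebookutils.credentials.getToken("pbi")  # assumed audience for api.fabric.microsoft.com

resp = requests.post(
    f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}"
    f"/items/{lakehouse_id}/shortcuts",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "path": "Tables",      # where the shortcut appears in the lakehouse
        "name": "my_table",    # shortcut name under "Tables"
        "target": {
            "oneLake": {
                "workspaceId": "<source-workspace-guid>",
                "itemId": "<source-lakehouse-guid>",
                "path": "Tables/my_table",
            }
        },
    },
)
resp.raise_for_status()
```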
I'll check out the shortcuts. Thanks for the pointer.
Hi @dbeavon3
I wanted to check if you had the opportunity to review the information provided by the user above. Please feel free to contact us if you have any further questions.
Here is the type of error we see from a notebook when a default lakehouse isn't specified:
AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Spark SQL queries are only possible in the context of a lakehouse. Please attach a lakehouse to proceed.)