
dbeavon3
Memorable Member

Using "Save As Table" Without having a default lakehouse

Changing the default lakehouse in a notebook doesn't seem possible; it requires re-initializing the PySpark session.

(This seems like a frustrating issue that lots of other users have discussed in the past, so I won't revisit that.)

 

What I'm trying to do is avoid the need for a "default lakehouse". In the code below I'm saving a parquet file to the "Files" storage in a lakehouse, and trying to register a "Table" for it as well, in the "Tables" list. However, this doesn't appear to be possible; my request to register the table is ignored. I find that the parquet file is stored properly (in "Files"), but the "Table" operation is ignored unless I run the notebook with a default lakehouse configured:

 

[Screenshot: notebook code writing a parquet file to the lakehouse "Files" area and attempting saveAsTable]
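In rough outline, the code is the equivalent of the following (the workspace, lakehouse, and table names here are placeholders):

```python
# Hypothetical OneLake path into the lakehouse "Files" area
files_path = ("abfss://MyWorkspace@onelake.dfs.fabric.microsoft.com/"
              "MyLakehouse.Lakehouse/Files/staging/my_data")

df = spark.range(10)  # any dataframe

# The parquet write to "Files" succeeds whether or not a default
# lakehouse is attached
df.write.mode("overwrite").parquet(files_path)

# The table registration is ignored (or errors out) unless the notebook
# has a default lakehouse configured
df.write.mode("overwrite").option("path", files_path).saveAsTable("my_table")
```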

 

Can someone tell me if there is a way to get this to work? Why doesn't "saveAsTable" allow me to specify the abfss location of the lakehouse? That would obviate the need to set a "default lakehouse".

Is there another mechanism to register an "external table" with a more direct API that is native to this Fabric environment? I think the problem is that "saveAsTable" is an API interface defined by Apache Spark, and Microsoft is trying to implement it in a way that is true to the spirit of Spark. But perhaps there is a totally different API to simply tell the lakehouse about my parquet file and get it to register the table under "Tables". This really shouldn't be so hard.

1 ACCEPTED SOLUTION
apturlov
Advocate I

@dbeavon3 In the Fabric architecture, a notebook must be bound to a lakehouse, so there is a default lakehouse attached to a notebook. As you correctly noticed, .saveAsTable() does not allow you to specify an abfss path to the table on OneLake, because it works with managed tables that are bound to the default lakehouse. If you have another lakehouse in your workspace, or in a different workspace, you can still save your data in Delta table format, but you'll need to use .save() with an abfss path, which will create an external table in the target lakehouse.

In the code example below you can see that I have two lakehouses in a workspace, with Lakehouselab set as the default for the notebook. However, I can successfully save a dataframe into another, non-default lakehouse with this technique.

[Screenshot: notebook with two lakehouses, saving a dataframe to the non-default lakehouse via an abfss path]
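In rough outline (the workspace and lakehouse names here are placeholders), the technique is:

```python
# abfss path into the Tables area of the non-default lakehouse
target_path = ("abfss://MyWorkspace@onelake.dfs.fabric.microsoft.com/"
               "OtherLakehouse.Lakehouse/Tables/my_table")

# Writing in Delta format directly to the abfss path works even though
# the target is not the notebook's default lakehouse
df.write.format("delta").mode("overwrite").save(target_path)
```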

Unfortunately, you won't be able to use this external table in a lakehouse, because external tables are currently not supported in Fabric (see Solved: Re: External Tables in Fabric - Microsoft Fabric Community), so essentially this entire exercise is useless. In a nutshell, external Delta tables are not needed in Fabric because they can be replaced with a shortcut to an external data source.

However, your original question was about the default lakehouse in a notebook, and the answer is yes, a default lakehouse is absolutely necessary in Fabric. The default lakehouse can be reconfigured in a deployment pipeline when a notebook is moved to another workspace.


5 REPLIES

I'll check out the shortcuts.  Thanks for the pointer.

Hi @dbeavon3,

Thank you for reaching out to the Microsoft Fabric Forum Community.

@apturlov, thanks for the input.

I hope the information provided was helpful. If you still have questions, please don't hesitate to reach out to the community.
Thank you for your understanding.

Hi @dbeavon3,

I wanted to check whether you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions.

dbeavon3
Memorable Member

Here is the type of error we see from a notebook when a default lakehouse isn't specified:


 

AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Spark SQL queries are only possible in the context of a lakehouse. Please attach a lakehouse to proceed.)
