
g3kuser
Helper I

Notebook default lakehouse error

I have a Notebook A to which a default lakehouse gets added dynamically. Notebook A works fine when executed individually and performs the necessary steps. Strangely, when Notebook A is called from Notebook Main using notebook.runMultiple, it returns an error complaining about the default lakehouse.

 

org.apache.hadoop.hive.ql.metadata.HiveException: MetaException message:Spark SQL queries are only possible in the context of a lakehouse. Please attach a lakehouse to proceed.

 

Notebook Main doesn't have any default lakehouse set up, as its only purpose is to create multiple invocations using a runMultiple DAG.

 

Has anyone faced a similar issue, and are there any possible solutions?

1 ACCEPTED SOLUTION
v-ssriganesh
Community Support

Hi @g3kuser,
Thanks for posting your query in the Microsoft Fabric community forum.

The issue you're facing is with the default lakehouse when using notebook.runMultiple(). The error occurs because Notebook Main does not have a lakehouse attached; even though it isn't running Spark queries directly, it still manages the execution environment for the child notebooks.

Try the following steps:

  • Open Notebook Main and attach a lakehouse (either the same one as Notebook A or a temporary one). Save Notebook Main with this attached lakehouse and execute it again.
  • This ensures that Notebook A inherits the necessary lakehouse context, preventing the error.
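To make the orchestration pattern concrete, here is a minimal sketch of what Notebook Main might contain once a lakehouse is attached to it. The notebook names ("Notebook A", "Notebook B") and the helper function are assumptions for illustration; the DAG shape (a dict with an "activities" list) follows the format notebookutils.notebook.runMultiple accepts in Fabric.

```python
# Sketch of Notebook Main: build a DAG and fan out child notebooks.
# Notebook names below are placeholders; adapt to your workspace.

def build_dag(notebook_names, timeout_per_cell=300):
    """Build the structure runMultiple expects: a dict with an
    'activities' list, one entry per child notebook."""
    return {
        "activities": [
            {
                "name": name,          # unique activity name in the DAG
                "path": name,          # notebook artifact name/path
                "timeoutPerCellInSeconds": timeout_per_cell,
                "args": {},            # parameters passed to the child
                "dependencies": [],    # empty -> children run in parallel
            }
            for name in notebook_names
        ]
    }

dag = build_dag(["Notebook A", "Notebook B"])

# Inside Notebook Main (which now has a lakehouse attached, so child
# notebooks inherit a Spark/lakehouse context), you would then run:
# notebookutils.notebook.runMultiple(dag)
```

The key point is that the lakehouse attached to Notebook Main, not the DAG itself, supplies the Spark SQL context the children need.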

If this helps, please accept it as a solution and drop a "Kudos" so other members can find it more easily.
Hope this works for you!
Thank you.


7 REPLIES
v-ssriganesh
Community Support

Hi @g3kuser,
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.
Thank you.

Yes, I got it resolved by adding one more notebook that attaches the default lakehouse for Notebook Main; when execution of Notebook A comes through the queue, the lakehouse to attach is passed in as a parameter. The extra notebook is unnecessary overhead, but that is what solves the issue.

Hi @g3kuser,
We appreciate your efforts and are pleased to hear that your issue was resolved. Please mark the helpful response and accept it as the solution. This will assist other community members in resolving similar issues more efficiently.
Thank you.

v-ssriganesh
Community Support

Hi @g3kuser,

May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will be helpful for other community members who have similar problems to solve it faster.

Thank you.


nilendraFabric
Super User

Hello @g3kuser 

When you run Notebook A individually, it works because it dynamically attaches a default lakehouse. However, when Notebook A is invoked from Notebook Main using notebook.runMultiple, the default lakehouse context is not carried over, causing the error.

Use the %%configure magic command to define the default lakehouse programmatically at the start of Notebook A:

%%configure -f
{
    "defaultLakehouse": {
        "name": "<lakehouse_name>",
        "id": "<lakehouse_id>",
        "workspaceId": "<workspace_id>"
    }
}

If this helps, please accept the answer.

I tried that but still got the same error. On further testing I found that if I attach the same default lakehouse to Notebook Main, it works fine. Though this avoids the error, the approach doesn't work for us: Notebook Main is used only to create the runMultiple DAG and to call other notebooks, which may or may not need the same default lakehouse, so it can cause other issues.

I set the default lakehouse for Notebook A from Notebook Main using sempy to update the notebook definition. This step occurs before the runMultiple executions of Notebook A are invoked from Notebook Main.
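A rough sketch of the definition-update step described above: before calling runMultiple, the notebook's .ipynb definition is rewritten so its metadata carries the default lakehouse. The metadata layout ("dependencies" → "lakehouse") reflects how Fabric stores the default lakehouse in notebook definitions, but the field names and IDs here are assumptions; the actual upload would go through sempy / the Fabric REST notebooks updateDefinition endpoint, which is omitted here.

```python
import json

def set_default_lakehouse(notebook_json, lakehouse_name, lakehouse_id, workspace_id):
    """Return a copy of a notebook definition (as a dict) with the default
    lakehouse set in its metadata. Field names are assumed, not official."""
    nb = json.loads(json.dumps(notebook_json))  # cheap deep copy
    nb.setdefault("metadata", {})["dependencies"] = {
        "lakehouse": {
            "default_lakehouse": lakehouse_id,
            "default_lakehouse_name": lakehouse_name,
            "default_lakehouse_workspace_id": workspace_id,
        }
    }
    return nb

# Placeholder values for illustration only:
original = {"cells": [], "metadata": {}}
updated = set_default_lakehouse(original, "<lakehouse_name>",
                                "<lakehouse_id>", "<workspace_id>")
# 'updated' would then be base64-encoded and pushed back via the Fabric
# REST API before Notebook Main invokes runMultiple.
```

Updating the child's definition keeps Notebook Main itself lakehouse-free, which was the constraint in this thread.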
