akaur
Regular Visitor

Create a new deployment rule to update the default lakehouse of all Notebooks

I see we can use the Fabric REST API to PATCH notebooks in Fabric, but it only updates notebook descriptions and names. I want to update the default lakehouse of each NB when deploying from DEV to UAT, and I want to do it programmatically.

 

https://learn.microsoft.com/en-us/rest/api/fabric/notebook/items/update-notebook?tabs=HTTP

1 ACCEPTED SOLUTION
nilendraFabric
Community Champion

Hello @akaur 

 

There is no direct REST API endpoint to set the default lakehouse as of now,

 

but to do it programmatically, please give this a try:

 

from notebookutils import mssparkutils

# Point the notebook at a new default lakehouse by updating its definition
mssparkutils.notebook.updateDefinition(
    name="<notebook name>",
    workspaceId="<workspace id of the notebook>",
    defaultLakehouse="<name of the new lakehouse>",
    defaultLakehouseWorkspace="<workspace id of the new lakehouse>"
)

 

The `updateDefinition` function is part of the newer `notebookutils` library, which is the updated version of `mssparkutils`.
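On recent Fabric runtimes the same call should also work through `notebookutils` directly, without the compatibility shim (a minimal sketch; the placeholder values are yours to fill in):

import notebookutils

# Same update, using the newer notebookutils entry point
notebookutils.notebook.updateDefinition(
    name="<notebook name>",
    workspaceId="<workspace id of the notebook>",
    defaultLakehouse="<name of the new lakehouse>",
    defaultLakehouseWorkspace="<workspace id of the new lakehouse>"
)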

 

Please accept this solution if it is helpful.

 

thanks


8 REPLIES
wbeke
Frequent Visitor

Can somebody explain to me why the code from @nilendraFabric is giving me this error:

Py4JJavaError                             Traceback (most recent call last)
Cell In[20], line 10
      8 print(f"ws : {ws_id}" )
      9 print(f"nb : {current_notebook_name}" )
---> 10 mssparkutils.notebook.notebook.updateDefinition(
     11 name=current_notebook_name,
     12 workspaceId=ws_id,
     13 defaultLakehouse="lh_d_test_for_rest_api",
     14 defaultLakehouseWorkspace=ws_id
     15 )

File ~/cluster-env/trident_env/lib/python3.11/site-packages/notebookutils/notebook.py:67, in updateDefinition(name, content, defaultLakehouse, defaultLakehouseWorkspace, workspaceId, environmentId, environmentWorkspaceId)
     60 def updateDefinition(name,
     61                      content=None,
     62                      defaultLakehouse="",
   (...)
     65                      environmentId="",
     66                      environmentWorkspaceId=""):
---> 67     return nb.updateDefinition(name, content,
     68                                defaultLakehouse,
     69                                defaultLakehouseWorkspace,
     70                                workspaceId,
     71                                environmentId,
     72                                environmentWorkspaceId)

File ~/cluster-env/trident_env/lib/python3.11/site-packages/notebookutils/mssparkutils/handlers/notebookHandler.py:253, in SynapseNotebookHandler.updateDefinition(self, name, content, defaultLakehouse, defaultLakehouseWorkspace, workspaceId, environmentId, environmentWorkspaceId)
    251 defaultLakehouseWorkspace = self._getWorkspaceId(defaultLakehouseWorkspace)
    252 environmentWorkspaceId = self._getWorkspaceId(environmentWorkspaceId)
--> 253 return self.notebookutils.updateDefinition(name, content, defaultLakehouse,
    254                                            defaultLakehouseWorkspace, workspaceId,
    255                                            environmentId, environmentWorkspaceId)

File ~/cluster-env/trident_env/lib/python3.11/site-packages/py4j/java_gateway.py:1322, in JavaMember.__call__(self, *args)
   1316 command = proto.CALL_COMMAND_NAME +\
   1317     self.command_header +\
   1318     args_command +\
   1319     proto.END_COMMAND_PART
   1321 answer = self.gateway_client.send_command(command)
-> 1322 return_value = get_return_value(
   1323     answer, self.gateway_client, self.target_id, self.name)
   1325 for temp_arg in temp_args:
   1326     if hasattr(temp_arg, "_detach"):

File /opt/spark/python/lib/pyspark.zip/pyspark/errors/exceptions/captured.py:179, in capture_sql_exception.<locals>.deco(*a, **kw)
    177 def deco(*a: Any, **kw: Any) -> Any:
    178     try:
--> 179         return f(*a, **kw)
    180     except Py4JJavaError as e:
    181         converted = convert_exception(e.java_exception)

File ~/cluster-env/trident_env/lib/python3.11/site-packages/py4j/protocol.py:326, in get_return_value(answer, gateway_client, target_id, name)
    324 value = OUTPUT_CONVERTER[type](answer[2:], gateway_client)
    325 if answer[1] == REFERENCE_TYPE:
--> 326     raise Py4JJavaError(
    327         "An error occurred while calling {0}{1}{2}.\n".
    328         format(target_id, ".", name), value)
    329 else:
    330     raise Py4JError(
    331         "An error occurred while calling {0}{1}{2}. Trace:\n{3}\n".
    332         format(target_id, ".", name, value))

Py4JJavaError: An error occurred while calling z:notebookutils.notebook.updateDefinition.
: com.microsoft.spark.notebook.msutils.ForbiddenException: Cannot update current artifact definition
	at com.microsoft.spark.notebook.msutils.impl.fabric.MSArtifactUtilsImpl.updateDefinition(MSArtifactUtilsImpl.scala:151)
	at com.microsoft.spark.notebook.msutils.impl.MSNotebookUtilsImpl.updateDefinition(MSNotebookUtilsImpl.scala:768)
	at notebookutils.notebook$.$anonfun$updateDefinition$1(notebook.scala:50)
	at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
	at com.microsoft.spark.notebook.common.trident.CertifiedTelemetryUtils$.withTelemetry(CertifiedTelemetryUtils.scala:82)
	at notebookutils.notebook$.updateDefinition(notebook.scala:49)
	at notebookutils.notebook.updateDefinition(notebook.scala)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:374)
	at py4j.Gateway.invoke(Gateway.java:282)
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
	at py4j.commands.CallCommand.execute(CallCommand.java:79)
	at py4j.GatewayConnection.run(GatewayConnection.java:238)
	at java.base/java.lang.Thread.run(Thread.java:829)


My code

from notebookutils import mssparkutils
import notebookutils  # notebookutils is built into the Fabric runtime; the import makes that explicit
import sempy.fabric as fabric

# Identify the currently running notebook and its workspace
current_notebook_name = notebookutils.runtime.context['currentNotebookName']
ws_id = fabric.get_notebook_workspace_id()
print(f"ws : {ws_id}")
print(f"nb : {current_notebook_name}")

# This call raises the Py4JJavaError shown above
mssparkutils.notebook.notebook.updateDefinition(
    name=current_notebook_name,
    workspaceId=ws_id,
    defaultLakehouse="lh_d_test_for_rest_api",
    defaultLakehouseWorkspace=ws_id
)

Thanks in advance



Looks like you can't update the definition of the currently running notebook.
I've tried it interactively (as you have above) and from within a pipeline. I've not tested from within a DAG, but I suspect that'll be the same.
Incidentally, you get a different error message when running it from a pipeline:

Py4JJavaError: An error occurred while calling z:notebookutils.notebook.updateDefinition. : com.microsoft.spark.notebook.msutils.ForbiddenException: Cannot update current artifact definition at

I have successfully changed the default lakehouse of another notebook though.
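Something along these lines, for example (a sketch; the notebook, lakehouse, and workspace values are placeholders):

import notebookutils

# Updating a notebook other than the one currently running succeeds
notebookutils.notebook.updateDefinition(
    name="<some other notebook>",
    workspaceId="<workspace id>",
    defaultLakehouse="<new lakehouse name>",
    defaultLakehouseWorkspace="<workspace id of the lakehouse>"
)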

If this helps, please consider Accepting as a Solution so others can find it more easily.

v-shamiliv
Community Support

Hi @akaur ,
Thank you for reaching out to the Microsoft Fabric community forum.

May we know how exactly the solutions provided by @spencer_sa and @nilendraFabric are not meeting your requirement? Based on that info, we will try to look for a better solution.
Thank you.

Both their solutions are good, but the default lakehouses get overridden when I deploy a second time from DEV to UAT. It's hard to create a deployment rule one by one for each NB. So I was wondering if there is an API call I can make to create/update deployment rules for all NBs altogether, in one shot, that updates the default lakehouse.

But I have created a function that loops through all NBs that need an updated default lakehouse, using @nilendraFabric's code.

 

import requests
import json

# Get Azure AD access token
access_token = get_access_token(TENANT_ID, CLIENT_ID, CLIENT_SECRET)
api_url = f'https://api.fabric.microsoft.com/v1/workspaces/{Workspace_ID}/notebooks'

# Set up headers for subsequent calls
headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {access_token}"
}

# Get the list of all NBs in that workspace
res = requests.get(api_url, headers=headers)
note = res.json().get("value", [])

def update_def_lakehouses(notebook_name, workspace_id):
    try:
        nb = json.loads(notebookutils.notebook.getDefinition(notebook_name, workspaceId=workspace_id))
    except Exception:
        print("Error, check notebook & workspace id")
        return

    # Check whether the default LH points to the DEV workspace; if so, update it
    if nb['metadata']['dependencies']['lakehouse']['default_lakehouse_workspace_id'] == DevWorkspaceID:
        notebookutils.notebook.updateDefinition(
            name=notebook_name,
            workspaceId=Workspace_ID,
            defaultLakehouse="Your NewLH Name",
            defaultLakehouseWorkspace=NewLHWorkspaceID
        )
    print(nb['metadata']['dependencies']['lakehouse'])

for i in note:
    if i['type'] == 'Notebook':
        update_def_lakehouses(i['displayName'], Workspace_ID)

BTW, if you use the Sempy Python library and FabricRestClient, you don't need to generate your own authentication token, as it uses the credentials of the person running the notebook.
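Something like this, for example (a minimal sketch; it assumes `sempy` is available in the Fabric runtime and reuses the `Workspace_ID` variable from above):

from sempy.fabric import FabricRestClient

# FabricRestClient authenticates as the notebook user; no manual token handling
client = FabricRestClient()
response = client.get(f"/v1/workspaces/{Workspace_ID}/notebooks")
notebooks = response.json().get("value", [])
print([nb["displayName"] for nb in notebooks])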

akaur
Regular Visitor

Thanks for the reply, but this lakehouse value would be overwritten every time I deploy from DEV to UAT.

spencer_sa
Super User

The default lakehouse is part of a notebook's definition, rather than the top-level metadata, so you would need to use `updateDefinition` to change it.
See this link for more information: https://fabric.guru/programmatically-removing-updating-default-lakehouse-of-a-fabric-notebook
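To see where that sits, you can dump the definition's lakehouse metadata (a quick sketch; the notebook and workspace values are placeholders):

import json
import notebookutils

# Fetch the notebook definition and inspect its default lakehouse settings
definition = json.loads(notebookutils.notebook.getDefinition("<notebook name>", workspaceId="<workspace id>"))
print(definition['metadata']['dependencies']['lakehouse'])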

If this helps, please consider Accepting as a Solution so others can find it more easily.

