I am automating notebook provisioning in Microsoft Fabric and need to programmatically set or update the default Lakehouse for a notebook. I have tried both the REST API and the built-in Fabric Python SDK (notebookutils.notebook.updateDefinition). My findings:
Endpoint:
POST https://api.fabric.microsoft.com/v1/workspaces/{workspaceId}/notebooks/{notebookId}/updateDefinition

Payload Example:
{
  "name": "Notebook 1",
  "definition": {
    "format": "ipynb",
    "parts": [
      {
        "path": "generate_data.ipynb",
        "payload": "<BASE64>",
        "payloadType": "InlineBase64"
      }
    ]
  },
  "defaultLakehouse": "<lakehouse_id>"
}

Code Example (executed inside a Fabric notebook):
notebookutils.notebook.updateDefinition(
    name="Notebook 1",
    defaultLakehouse="<lakehouse_id>"
    # workspaceId and defaultLakehouseWorkspace seem optional in the SDK
)
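For reference, here is a sketch of how the REST request above could be assembled in Python. The endpoint shape and the top-level "defaultLakehouse" field are taken directly from this post; whether the API actually honors that field is exactly what is in question here:

```python
import base64
import json

FABRIC_API = "https://api.fabric.microsoft.com/v1"


def build_update_definition_body(name, path, ipynb, lakehouse_id):
    """Serialize an ipynb dict into the InlineBase64 request body shown above.

    The top-level "defaultLakehouse" field mirrors the payload in this post;
    the API may not act on it, per the discussion in this thread.
    """
    encoded = base64.b64encode(json.dumps(ipynb).encode("utf-8")).decode("ascii")
    return {
        "name": name,
        "definition": {
            "format": "ipynb",
            "parts": [
                {"path": path, "payload": encoded, "payloadType": "InlineBase64"},
            ],
        },
        "defaultLakehouse": lakehouse_id,
    }


def update_definition_url(workspace_id, notebook_id):
    """URL for the updateDefinition endpoint quoted above."""
    return (f"{FABRIC_API}/workspaces/{workspace_id}"
            f"/notebooks/{notebook_id}/updateDefinition")
```

The body would then be sent as JSON in a POST with a bearer token in the Authorization header.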
Hi @stuckerj,
Thank you for reaching out to Microsoft Fabric Community.
Thank you for sharing the detailed analysis. Yes, there is a difference between the REST API and the notebookutils behavior.
At present, updating a notebook’s default Lakehouse is not fully supported through the public REST API. When an .ipynb payload is sent, the API accepts it and returns 202 Accepted, but the asynchronous update never completes. This is a known product limitation.
The supported, working method is notebookutils.notebook.updateDefinition run from another notebook, since it executes inside Fabric with the correct context. If REST must be used, it requires a full .py notebook definition and may still not complete fully.
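One way to observe the behavior described above is to poll the long-running operation behind the 202 response until it either finishes or times out. A minimal sketch, assuming the 202 response carries a Location header pointing at an operation-status endpoint that reports NotStarted/Running/Succeeded/Failed; the status-fetching function is injected so the loop itself can be exercised anywhere:

```python
import time


def poll_operation(get_status, timeout_s=300, interval_s=5):
    """Poll a long-running operation until it leaves the pending states
    or the timeout elapses.

    get_status: callable returning the current status string, e.g. a
    wrapper around a GET on the 202 response's Location header.
    Returns the final observed status ('Succeeded', 'Failed', or, in the
    scenario reported in this thread, it may simply stay 'Running').
    """
    deadline = time.monotonic() + timeout_s
    status = get_status()
    while status in ("NotStarted", "Running"):
        if time.monotonic() >= deadline:
            return status  # never completed: the behavior reported above
        time.sleep(interval_s)
        status = get_status()
    return status
```

For the .ipynb case discussed here, this loop would be expected to hit the timeout still in the 'Running' state.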
Thanks and regards,
Anjan Kumar Chippa
Thank you for confirming the limitation.
As a workaround, I was thinking of programmatically creating another notebook whose purpose is to set the data connections for the other notebooks in the pipeline. It sounds like that's a solid option.
Hi @stuckerj,
Yes, that’s exactly the right approach. Creating a separate notebook to programmatically configure or update other notebooks is a solid and supported workaround. It works because the command runs inside Fabric with the right context and permissions to update Lakehouse settings properly.
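A minimal sketch of that configurator pattern: a small driver that applies one default Lakehouse to a list of notebooks. The update function is injected so the loop can be exercised anywhere; inside a Fabric notebook you would pass notebookutils.notebook.updateDefinition itself. Notebook names and the Lakehouse ID here are placeholders:

```python
def apply_default_lakehouse(notebook_names, lakehouse_id, update_fn):
    """Apply one default Lakehouse to several notebooks.

    update_fn is expected to accept the same keyword arguments as
    notebookutils.notebook.updateDefinition (name, defaultLakehouse);
    inside a Fabric notebook you would pass that function directly.
    Returns per-notebook results so a pipeline step can surface failures.
    """
    results = {}
    for name in notebook_names:
        try:
            update_fn(name=name, defaultLakehouse=lakehouse_id)
            results[name] = "ok"
        except Exception as exc:  # keep going; report all failures at the end
            results[name] = f"failed: {exc}"
    return results


# Inside Fabric, the call would look like (placeholder names/IDs):
# apply_default_lakehouse(
#     ["Notebook 1", "Notebook 2"],
#     "<lakehouse_id>",
#     notebookutils.notebook.updateDefinition,
# )
```

Running this from a dedicated "configurator" notebook keeps the working SDK path in one place and out of the individual pipeline notebooks.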
Thanks and regards,
Anjan Kumar Chippa