Anonymous
Not applicable

Writing to lakehouse via notebook within notebook within pipeline impossible

I have a notebook that writes data into a lakehouse table:

# Create dataframe
dataframe_dimentity = spark.createDataFrame(
    [
        (69420, "random_data"),
    ],
    ["col1", "col2"]
)

# Write to lakehouse
dataframe_dimentity.write.mode("overwrite").format("delta").save("Tables/00Deez_Nuts")

This works completely fine.

Then I have a notebook that runs that notebook:

mssparkutils.notebook.run("Table_Write", 120)

Again, works completely fine.

However, if I run the second notebook via pipeline, it times out. If I run the first notebook via pipeline, it works fine.

If I use the ABFS path, it works fine (the hard-coded shape is sketched below). However, I can't hard-code any IDs, because the notebook takes part in a deployment process with changing default lakehouses. So what do I do?
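A rough sketch of the hard-coded write that works; the GUID segments are placeholders for the workspace and lakehouse IDs, which is exactly what changes between deployments:

# Hard-coded OneLake ABFS path (placeholder GUIDs, not real IDs)
abfs_path = "abfss://<workspace-id>@onelake.dfs.fabric.microsoft.com/<lakehouse-id>/Tables/00Deez_Nuts"
dataframe_dimentity.write.mode("overwrite").format("delta").save(abfs_path)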

 

EDIT: Interestingly, it doesn't work even with a dynamically resolved ABFS path:

# Get lakehouse ABFS path
lakehouse_path = list(filter(lambda x: x["mountPoint"] == "/default", mssparkutils.fs.mounts()))[0]["source"]
# Write to lakehouse
dataframe_dimentity.write.mode("overwrite").option("overwriteSchema", True).format("delta").save(f"{lakehouse_path}/Tables/00Deez_Nuts")

So the problem seems to be that notebook-within-notebook execution inside a pipeline is a context where the mount points become ambiguous...
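For anyone trying to reproduce this, a quick diagnostic (using the same mssparkutils call as above) is to dump the mount table from inside the child notebook in each execution context:

# Print every mount point the current Spark session sees
for mount in mssparkutils.fs.mounts():
    print(mount["mountPoint"], "->", mount["source"])

If "/default" is missing or resolves to an unexpected source in the nested pipeline run, both relative Tables/ paths and the mount-derived ABFS path will fail.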

3 REPLIES
Anonymous
Not applicable

Hi @Anonymous,

I have Notebook 1, Notebook 2 running Notebook 1, and Pipeline 2 running Notebook 2; the last run was successful.

[Screenshots: Notebook 1, Notebook 2 running Notebook 1, Pipeline 2, and the successful pipeline run]

 

Running a notebook in a pipeline may time out if there are not enough capacity resources.

 

Try increasing the timeout to a higher value, such as 300 seconds:

mssparkutils.notebook.run("Table_Write", 300)

 

If you have any other questions, please feel free to contact me.

 

Best Regards,
Yang
Community Support Team

 

If any post helps, please consider accepting it as the solution to help the other members find it more quickly.
If I misunderstand your needs or you still have problems, please feel free to let us know. Thanks a lot!

ACCEPTED SOLUTION
Anonymous
Not applicable

Thank you, the lack of capacity resources is a valuable hint and was, in part, my problem yesterday. However, I have one instance of this phenomenon where there is no shortage of resources, and instead of timing out it gives me "Error 400 Bad Request" (authorization bearer token not present, something something) every time, which I interpret as it still not being able to find the lakehouse.

 

I found a workaround yesterday: passing the ABFS lakehouse path in via a pipeline parameter. This works. Still, I think this should work without any extras, just by accessing the notebook's own default lakehouse.
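A minimal sketch of the workaround, assuming the pipeline's notebook activity passes a base parameter named lakehouse_abfs_path (the name is my choice, not a built-in):

# In the child notebook, declare the value in a cell tagged as a parameter
# cell so the pipeline's base parameter overrides it at run time.
lakehouse_abfs_path = ""  # hypothetical parameter, supplied by the pipeline

# When one notebook runs another, forward it via the arguments dict that
# mssparkutils.notebook.run accepts as its third argument.
mssparkutils.notebook.run("Table_Write", 300, {"lakehouse_abfs_path": lakehouse_abfs_path})

# Inside Table_Write, write with the absolute path instead of the /default mount:
dataframe_dimentity.write.mode("overwrite").option("overwriteSchema", True).format("delta").save(f"{lakehouse_abfs_path}/Tables/00Deez_Nuts")

This sidesteps the /default mount entirely, which is what appears to break in the nested-notebook pipeline context.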

Anonymous
Not applicable

Hi @Anonymous,

Thank you for providing a workaround. Could you please mark this helpful post as "Answered"?

 

If others in the community are experiencing the same problem as you, this will help them find a solution easily.

 

Thank you for your cooperation!

 

Best Regards,
Yang
Community Support Team

 

If any post helps, please consider accepting it as the solution to help the other members find it more quickly.
If I misunderstand your needs or you still have problems, please feel free to let us know. Thanks a lot!
