
aravindhsam8
Frequent Visitor

Notebook execution failed at Notebook service with http status code - '200'

I have a master orchestration pipeline, which consists of 6 invoked pipelines running in parallel.

Within each invoked pipeline, there are 3 notebooks running in series (Ingestion notebook ---> Silver notebook ---> Gold notebook).

I'm facing the issue below often, in a random Silver or Gold notebook's Spark SQL statement.

FYI, the tables referenced in the statements are not used in any other notebooks; each notebook has its own set of tables.

[attached screenshot: the Spark SQL error from the failed notebook]

 

This is not specific to the CREATE TABLE statement; it also occurs for DROP TABLE IF EXISTS statements.

The master orchestration pipeline is scheduled to run daily, and this issue occurs at least three times a week.

 

Thanks in advance.

6 REPLIES
richbenmintz
Solution Sage

Are you able to slightly refactor your code and use try/except blocks in a PySpark notebook, with your SQL code running in spark.sql("")? You can then catch the errors and see exactly what is going on.
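For example, a minimal sketch (not your actual code; the statement and helper name are illustrative) of wrapping each Spark SQL call so a failure surfaces the statement and the real error instead of the generic "http status code - '200'" message:

```python
# Hypothetical helper: `run_statement` is any zero-argument callable,
# e.g. lambda: spark.sql(stmt), so the helper itself needs no Spark session.
def run_with_diagnostics(statement, run_statement):
    try:
        return run_statement()
    except Exception as exc:
        # Print the exact statement and error before re-raising, so the
        # pipeline run history shows the root cause rather than a generic code.
        print(f"Statement failed: {statement}")
        print(f"Error: {exc!r}")
        raise
```

In a notebook cell this would be called as, e.g., run_with_diagnostics("DROP TABLE IF EXISTS silver_orders", lambda: spark.sql("DROP TABLE IF EXISTS silver_orders")).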

 

You might also want to consider using PySpark to load your bronze data into a DataFrame and write it to the destination lakehouse:

 

df = spark.read.format('delta').load('path_to_source_table')

df.write.format('delta').mode('overwrite').saveAsTable('dest_table_name')


I hope this helps,
Richard



v-huijiey-msft
Community Support

Hi @aravindhsam8 ,

 

Thanks for the reply from lbendlin.

 

Can you check the data pipeline and the owner of the notebook? A similar problem can occur when the owner of the notebook has been removed from the workspace. Also check that the owner of the pipeline and notebooks has access to all the items needed for the run.

 

Make sure you are using Fabric Runtime 1.3.

 

Can you try adding a retry to the pipeline run, and see if the problem can still be reproduced?

 

Best Regards,
Yang
Community Support Team

 

If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!

Hi @v-huijiey-msft, just following up again on this.
Kindly help with the above issue.
Thanks in advance.

Hi @v-huijiey-msft,
Thanks for the reply.

I have changed to Runtime 1.3. I have also checked the ownership of all items within the pipeline; everything is under my name.

I have added a retry option for each notebook activity in case of failure, but the issue still persists: it fails on the first attempt and succeeds on the second or third.
So the notebook still fails initially, even though all the above options are checked.

Hi @v-huijiey-msft, just following up on this.
Kindly help with the above issue.
Thanks in advance.

lbendlin
Super User

This is not specific to the CREATE TABLE statement; it also occurs for DROP TABLE IF EXISTS statements.

 

I think these issues are related. It looks like your code sometimes fails to drop the table, and then complains that it cannot create an existing table.
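If that is the cause, one way to avoid the DROP-then-CREATE race is to collapse the two statements into a single idempotent one. A sketch, assuming Delta tables and an illustrative table name and schema:

```python
# CREATE OR REPLACE TABLE (supported for Delta in Spark SQL) removes the
# DROP TABLE IF EXISTS + CREATE TABLE pair, so a partially failed drop
# can no longer make the subsequent create fail.
idempotent_ddl = """
CREATE OR REPLACE TABLE silver_orders (
    order_id BIGINT,
    amount   DOUBLE
) USING DELTA
"""
# In the notebook this would run as: spark.sql(idempotent_ddl)
```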

 
