Dietz
New Member

Fabric Notebook Error, please help!

Hi There

 

Would really appreciate any assistance.

Will try to keep this short... I have a notebook in Fabric, and at the end of the notebook I try to write the dataframe to a Delta table with this code:

 

# Overwrite the gold table, allowing schema changes
matched_customers_df.write \
    .format("delta") \
    .option("overwriteSchema", "true") \
    .mode("overwrite") \
    .saveAsTable("Lakehouse.gold_table_final")
 
This dataframe has 5.1 million rows.
 
But I get the following error.
 
org.apache.spark.SparkException: Job aborted due to stage failure: Serialized task 123:0 was 156335848 bytes, which exceeds max allowed: spark.rpc.message.maxSize (134217728 bytes). Consider increasing spark.rpc.message.maxSize or using broadcast variables for large values.
 
I have searched for an answer with ChatGPT and Copilot, and they both recommend changing the Spark settings with either:

spark.conf.set("spark.rpc.message.maxSize", "200MB")
spark.conf.set("spark.driver.maxResultSize", "2g")
 
But you cannot execute these statements in a Fabric notebook, as these Spark settings cannot be changed directly at runtime in Fabric.
I have also tried to partition the data and the Delta table, but every attempt hits the same error.
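For example, one of the attempts looked roughly like this (the partition count is illustrative):

# Repartition before writing to shrink the per-task payload (count is illustrative)
matched_customers_df.repartition(200) \
    .write.format("delta") \
    .option("overwriteSchema", "true") \
    .mode("overwrite") \
    .saveAsTable("Lakehouse.gold_table_final")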
 
I also cannot change these settings in the Fabric workspace Spark configuration.
 
As a last resort I increased the Fabric capacity from F8 to F16 and then to F32; still the same error.
Please help, ChatGPT and I are out of ideas.
2 ACCEPTED SOLUTIONS
FabianSchut
Solution Sage

Hi, can you create a custom environment in Fabric and change the Spark settings there? You can change the settings in the Spark properties field, as shown in the screenshot. You can then select that custom environment in your notebook.

[Screenshot: FabianSchut_0-1731110141709.png — the Spark properties field of a custom environment]
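For reference, in case the screenshot does not render: the Spark properties field takes key/value pairs. The error above reports a limit of 134217728 bytes, which is the 128 MiB default of spark.rpc.message.maxSize, so entries along these lines should raise it (the values are illustrative; per the Spark docs this property is a plain integer number of MiB, while spark.driver.maxResultSize takes a size string):

spark.rpc.message.maxSize    256
spark.driver.maxResultSize   4g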

 

View solution in original post

torsten
Helper II

The screenshot is part of the "Environment" item, which you can create by clicking "+ New item" in your workspace and searching for "environment". Afterwards you can attach the environment under "Workspace settings" > "Data Engineering/Science" > "Spark settings" > "Environment". More information can be found here: Create, configure, and use an environment in Fabric - Microsoft Fabric | Microsoft Learn

View solution in original post

5 REPLIES
torsten
Helper II

The screenshot is part of the "Environment" item, which you can create by clicking "+ New item" in your workspace and searching for "environment". Afterwards you can attach the environment under "Workspace settings" > "Data Engineering/Science" > "Spark settings" > "Environment". More information can be found here: Create, configure, and use an environment in Fabric - Microsoft Fabric | Microsoft Learn

Dietz
New Member

Awesome, thank you!

Dietz
New Member

For anyone who wants to know the complete solution: I created a custom environment and set the node size to large, and the issue was resolved.
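A quick way to confirm that the settings from an attached environment are actually in effect is to read them back in the notebook (a minimal sketch; assumes the built-in spark session of a Fabric notebook):

# Print the effective values; falls back to a placeholder if a property is unset
print(spark.conf.get("spark.rpc.message.maxSize", "<not set>"))
print(spark.conf.get("spark.driver.maxResultSize", "<not set>"))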

FabianSchut
Solution Sage

Hi, can you create a custom environment in Fabric and change the Spark settings there? You can change the settings in the Spark properties field, as shown in the screenshot. You can then select that custom environment in your notebook.

[Screenshot: FabianSchut_0-1731110141709.png — the Spark properties field of a custom environment]

 

Dietz
New Member

Thanks FabianSchut, really appreciate your help, but I cannot seem to get to the screen in your message. I have created a custom environment and gone to edit it, but I don't see the screen that you posted?
