aajain
New Member

Facing error while running a Spark Job Definition in Microsoft Fabric

Hi,

 

I was trying out a basic Scala program as a Spark Job Definition on Microsoft Fabric. What I am observing is that my main class executes perfectly, but the session still fails.

 

Main method:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.appName("Sample Spark Session").master("local[*]").getOrCreate()
val sqlContext = spark.sqlContext
val df = sqlContext.sql("SELECT * FROM delta.`<abfss path>`")

 

In the post step, I am seeing the following error in stderr:

 

2024-02-08 08:40:09,339 ERROR Logger [spark-listener-group-shared]: Failed to flush.
java.lang.NullPointerException
at com.microsoft.azure.synapse.diagnostic.SparkObservabilityBus.flush(SparkObservabilityBus.java:300)
at com.microsoft.azure.synapse.diagnostic.SparkObservabilityBus.flushSparkListenerEvent(SparkObservabilityBus.java:230)
at org.apache.spark.listeners.SparkObservabilityListener.onApplicationEnd(SparkObservabilityListener.scala:34)
at org.apache.spark.scheduler.SparkListenerBus.doPostEvent(SparkListenerBus.scala:57)
at org.apache.spark.scheduler.SparkListenerBus.doPostEvent$(SparkListenerBus.scala:28)
at org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37)
at org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37)
at org.apache.spark.util.ListenerBus.postToAll(ListenerBus.scala:120)
at org.apache.spark.util.ListenerBus.postToAll$(ListenerBus.scala:104)
at org.apache.spark.scheduler.AsyncEventQueue.super$postToAll(AsyncEventQueue.scala:127)
at org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:127)
at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:121)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$3.$anonfun$run$4(AsyncEventQueue.scala:117)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1471)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$3.run(AsyncEventQueue.scala:117)

 

Has anyone seen this before, or does anyone have an idea of what might be causing it?

 

Thanks 

1 REPLY
Anonymous
Not applicable

Hi @aajain ,

If I understand correctly, the issue is that you encountered this error while running your Spark Job Definition. Please try the following methods and check whether they solve your problem:

1. Verify that the resources specified in the Spark job definition are correctly configured (see the first sketch after the links below).

Create an Apache Spark job definition - Microsoft Fabric | Microsoft Learn

 

2. Ensure that all necessary dependencies are accessible to the job (see the packaging sketch after the links below).

 

3. You can also review the following links for more information.

Run an Apache Spark job definition - Microsoft Fabric | Microsoft Learn

Solved: Spark Job Definition: Spark_Ambiguous_NonJvmUserAp... - Microsoft Fabric Community
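
Regarding point 1, a Spark Job Definition normally obtains its session from the Fabric runtime rather than forcing local mode. The sketch below is only illustrative (the object name, the df.show() call, and dropping the hard-coded .master("local[*]") are suggestions to compare against, not a confirmed fix for the listener error):

import org.apache.spark.sql.SparkSession

object SampleSparkJob {
  def main(args: Array[String]): Unit = {
    // Let the Fabric runtime supply the master and cluster configuration
    // instead of hard-coding .master("local[*]").
    val spark = SparkSession.builder
      .appName("Sample Spark Session")
      .getOrCreate()

    // Query the Delta table directly by its ABFSS path.
    val df = spark.sql("SELECT * FROM delta.`<abfss path>`")
    df.show()

    // Stop the session explicitly so the application shuts down cleanly.
    spark.stop()
  }
}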
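
Regarding point 2, if the job is built as an sbt project, the Spark and Delta libraries are already present on the Fabric runtime, so they are usually marked as "provided" when packaging the jar. The version numbers below are assumptions; match them to the Spark runtime version of your Fabric environment:

// build.sbt (illustrative; versions are assumptions)
name := "sample-spark-job"
scalaVersion := "2.12.17"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql"  % "3.4.1" % "provided",
  "io.delta"         %% "delta-core" % "2.4.0" % "provided"
)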

 

Best Regards,

Wisdom Wu

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
