s_hampe
New Member

Incompatibility between synapseutils_2.12:1.6.2 (Scala) and Fabric notebookutils.lakehouse.list

This post is part issue-report, part workaround suggestion.

Issue

We are running Spark Jobs written in Scala and we use 

com.microsoft.azure.synapse:synapseutils_2.12:1.6.2

to access (among other things) notebookutils. I can find no official documentation that indicates that synapseutils is meant to be used with Fabric as well, but as it contains notebookutils (which is a Fabric utility library), that seems implied.

This library exposes dummy stubs, so calls to notebookutils can compile locally. 

In particular, we want to call the following function on notebookutils.lakehouse:

def list(workspaceId: String = ???): Array[Artifact]

However, when running the job in Fabric, this function call triggers the following error:

java.lang.NoSuchMethodError:
notebookutils.lakehouse$.list(java.lang.String)

The (Python) signature in the official documentation is

# List all Lakehouse artifacts
list(workspaceId: String = "", maxResults: Int = 1000): Array[Artifact]

Apparently, the default arguments are not translated correctly in the Scala stubs. In particular, the stubs do not expose a two-argument version of list at all.

Note: Calling notebookutils.lakehouse.list(wsid) within a Spark Scala Notebook works without issue; the problem lies solely in the (dummy) function stubs provided by the synapseutils library.
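The following sketch illustrates what most likely happens at the bytecode level. This is inferred from how the Scala compiler encodes default arguments in general, not taken from the synapseutils sources:

// Compiled against the stub signature list(workspaceId: String = ???),
// this call resolves to a one-argument method:
notebookutils.lakehouse.list(wsid)
// emitted call: notebookutils.lakehouse$.list(java.lang.String)

// The Fabric runtime object, however, is compiled from something like
//   def list(workspaceId: String = "", maxResults: Int = 1000): Array[Artifact]
// which yields list(java.lang.String, int) plus the synthetic helpers
// list$default$1() and list$default$2() -- but no list(java.lang.String).
// Hence the NoSuchMethodError at runtime.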

Workaround

An admittedly very ugly workaround is to apply reflection to the notebookutils.lakehouse object at runtime to find the two-argument version of list:

object FabricLakehouseCompat {

  /** Calls the runtime's two-argument lakehouse.list via reflection,
    * bypassing the incomplete stub signature in synapseutils. */
  def listLakehouses(workspaceId: String, maxResults: Int = 1000): Array[AnyRef] = {
    val lh  = notebookutils.lakehouse
    val cls = lh.getClass

    // Look up list(String, int) -- the signature that actually exists
    // on the Fabric runtime object (but not in the compile-time stubs).
    val method = cls.getMethod(
      "list",
      classOf[String],
      java.lang.Integer.TYPE
    )

    // Invoke reflectively, so the stub's signature is never consulted.
    method
      .invoke(lh, workspaceId, Int.box(maxResults))
      .asInstanceOf[Array[AnyRef]]
  }

}

 

Further reflection can be applied to convert the returned AnyRefs back into Artifact values.
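For illustration, a sketch of that conversion step. The accessor names displayName and id are assumptions about the runtime Artifact class, not verified against it; adjust them to whatever the runtime objects actually expose:

// Sketch only: reads assumed no-argument accessors off the returned objects.
def describeArtifact(a: AnyRef): String = {
  def read(name: String): String =
    a.getClass.getMethod(name).invoke(a).toString
  s"${read("displayName")} (${read("id")})" // accessor names are assumed
}

FabricLakehouseCompat
  .listLakehouses("<workspace-id>") // placeholder workspace ID
  .foreach(a => println(describeArtifact(a)))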

1 ACCEPTED SOLUTION
Tamanchu
Advocate IV

Hi @s_hampe,

Great investigation! Your reflection-based workaround is solid, and I can confirm this is a known limitation rather than a simple bug.

Root Cause: NotebookUtils is not supported in Spark Job Definitions

The official Microsoft documentation for NotebookUtils explicitly states:

"Notebook utilities aren't applicable for Apache Spark job definitions (SJD)."

This means the notebookutils.lakehouse.list() method (and the other notebookutils APIs) is designed to run inside Fabric Notebooks only, not in compiled Spark Job Definitions. The synapseutils_2.12:1.6.2 library provides compilation stubs, but these stubs are intentionally incomplete; they were never meant to fully replicate the runtime API available inside Notebooks.

That's why the method signature mismatch exists:

  • Runtime (Notebook): list(workspaceId: String = "", maxResults: Int = 1000)
  • Stubs (synapseutils JAR): list(workspaceId: String), missing the maxResults parameter

Your Workaround is the Right Approach

Your Java reflection approach is actually the correct pattern for this scenario. Since the stubs don't match the runtime, reflection bypasses the compilation check and calls the method directly at runtime.

Alternative: Use the Fabric REST API

For Spark Job Definitions, the recommended approach is to use the Fabric REST API instead of notebookutils.

The Lakehouse Management REST API provides full CRUD operations and works reliably in both Notebooks and Spark Job Definitions.
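For reference, a minimal Scala sketch of that route. The endpoint shape follows the public List Lakehouses API; acquiring the Microsoft Entra bearer token (e.g. via MSAL) is deliberately left out here, since it depends on your auth setup:

import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

// Sketch: list the lakehouses in a workspace via the Fabric REST API.
// `token` must be a valid bearer token for the Fabric API scope.
def listLakehousesRest(workspaceId: String, token: String): String = {
  val request = HttpRequest.newBuilder()
    .uri(URI.create(s"https://api.fabric.microsoft.com/v1/workspaces/$workspaceId/lakehouses"))
    .header("Authorization", s"Bearer $token")
    .GET()
    .build()

  HttpClient.newHttpClient()
    .send(request, HttpResponse.BodyHandlers.ofString())
    .body() // JSON body: { "value": [ { "id": ..., "displayName": ... }, ... ] }
}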

Suggestion to Microsoft

It would be helpful if the synapseutils library either:

  1. Matched the full runtime signatures (even for SJD stubs)
  2. Or threw a clear UnsupportedOperationException with a message like "notebookutils.lakehouse is not supported in Spark Job Definitions" (a sketch of such a stub follows below)

This would save developers significant debugging time.
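To make option 2 concrete, here is a sketch of what such a fail-fast stub could look like. This is purely illustrative, not the actual synapseutils source:

// Illustrative stub only -- not the actual synapseutils source.
object lakehouse {
  // Full runtime signature, failing fast outside the Fabric runtime:
  def list(workspaceId: String = "", maxResults: Int = 1000): Array[Artifact] =
    throw new UnsupportedOperationException(
      "notebookutils.lakehouse is not supported in Spark Job Definitions")
}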

Hope this helps! Feel free to ask if you have follow-up questions.

4 REPLIES
v-kpoloju-msft
Community Support

Hi @s_hampe

Thank you for reaching out to the Microsoft Fabric Community Forum. Also, thanks to @Tamanchu for the input on this thread.

Has your issue been resolved? If the response provided by @Tamanchu addressed your query, could you please confirm? It helps us ensure that the solutions provided are effective and beneficial for everyone.

Hope this helps clarify things. Let me know what you find after giving these steps a try; happy to help you investigate further.

Thank you for using the Microsoft Community Forum.

s_hampe
New Member

Hi @Tamanchu!

Thank you for your feedback.

As for the support for Spark Jobs: the notice you mention ("Notebook utilities aren't applicable for Apache Spark job definitions (SJD).") is found in this section:

https://learn.microsoft.com/en-us/fabric/data-engineering/notebook-utilities#notebook-utilities

which refers to a subset of NotebookUtils functions intended specifically for Notebooks (notebookutils.notebook). It makes complete sense that this particular subset is not applicable to Spark Jobs.

For the other functions, no such notice is present, so I would still consider it a bug in the synapseutils library.

I agree with your suggestion, though: synapseutils should match the correct signatures.

Tamanchu
Advocate IV

Hello @s_hampe,

You're right that the documentation explicitly mentions only the notebook-specific namespace. My interpretation was based on the observed behavior in SJD, where the lakehouse APIs are partially stubbed.

If the behavior isn't officially documented as unsupported, then this likely deserves clarification or a fix in the synapseutils library.
