When using Fabric Runtime 1.3, calls to spark.catalog.functionExists(...) and spark.catalog.listFunctions() (or their Java/Scala equivalents via sparkSession.catalog().functionExists(...)) throw scala.NotImplementedError: an implementation is missing. In earlier runtimes (as recently as one month back) these calls worked, or at least did not crash.
Our code includes logic like:

if (!sparkSession.catalog().functionExists(functionName)) {
    sparkSession.udf().register(functionName, udfImpl);
}

That logic now fails at the functionExists() call (i.e. the NotImplementedError is thrown before we ever get to register).
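The PySpark equivalent of that guard (a minimal sketch; function_name and udf_impl stand in for our real names) fails the same way:

# Same guard in PySpark: functionExists raises before register is reached
if not spark.catalog.functionExists(function_name):
    spark.udf.register(function_name, udf_impl)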
We tried to verify by running spark.catalog.listFunctions() from PySpark, which produced:
Py4JJavaError: An error occurred while calling oYYYY.listFunctions.
: scala.NotImplementedError: an implementation is missing
at scala.Predef$.$qmark$qmark$qmark(Predef.scala:288)
at com.microsoft.fabric.spark.catalog.OnelakeExternalCatalog.listFunctions(OnelakeExternalCatalog.scala:404)
at com.microsoft.fabric.spark.catalog.InstrumentedExternalCatalog.$anonfun$listFunctions$1(OnelakeExternalCatalog.scala:627)
…
Expected Behavior
spark.catalog.functionExists("someName") should return true or false depending on whether a function with that name is registered, without throwing an error.
Environment / Configuration Details
Fabric runtime version: 1.3
Hi @Padam_Prakash,
This error means you’re calling a “functions API” on the external catalog (OneLake) that doesn’t implement function metadata in Fabric Runtime 1.3. In 1.3, OnelakeExternalCatalog.listFunctions and functionExists throw scala.NotImplementedError, so checks like spark.catalog.functionExists(...) now fail even before you try to (re)register your UDF. It’s expected in this runtime, though the change feels like a regression from earlier behavior.
The telling frame in your stack trace: com.microsoft.fabric.spark.catalog.OnelakeExternalCatalog.listFunctions → NotImplementedError
In both Scala and PySpark, registering the same name overwrites the previously registered temporary function, so you can safely register on every run.
Scala
// Replace "functionExists + conditional" with an unconditional register
spark.udf.register(functionName, udfImpl) // acts like "create or replace" for temp UDFs
Python
spark.udf.register(function_name, udf_impl) # overwrites same-named temp UDF
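For example, a quick end-to-end sketch (shout is a made-up UDF name, not from your code) showing that re-registration is a harmless replace:

from pyspark.sql.types import StringType

# Hypothetical example UDF; registering the same name twice simply replaces it
def shout(s):
    return s.upper() if s is not None else None

spark.udf.register("shout", shout, StringType())  # first registration
spark.udf.register("shout", shout, StringType())  # safe: overwrites, no error
spark.sql("SELECT shout('hello') AS v").show()    # prints HELLO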
Alternatively, keep the existence check but go through SHOW USER FUNCTIONS LIKE, which resolves via the session's function registry instead of the external catalog (note that Spark SQL's SHOW FUNCTIONS only accepts the USER, SYSTEM, or ALL qualifiers; there is no TEMPORARY variant):
Scala
val exists =
  spark.sql(s"SHOW USER FUNCTIONS LIKE '${functionName}'").count() > 0
if (!exists) {
  spark.udf.register(functionName, udfImpl)
}

Python
exists = spark.sql(f"SHOW USER FUNCTIONS LIKE '{function_name}'").count() > 0
if not exists:
    spark.udf.register(function_name, udf_impl)

If your UDF is SQL-expressible, you can lean on SQL's replace semantics instead:
CREATE OR REPLACE TEMPORARY FUNCTION my_func AS 'com.example.udfs.MyFunc';
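And if you'd rather keep an explicit existence check yet survive runtimes where the catalog API is unimplemented, here is a defensive sketch (the helper name register_if_missing is mine, not a library API):

Python

from py4j.protocol import Py4JJavaError

def register_if_missing(spark, function_name, udf_impl):
    try:
        exists = spark.catalog.functionExists(function_name)
    except Py4JJavaError:
        # Runtime 1.3 surfaces scala.NotImplementedError through Py4J here;
        # falling back to unconditional registration is overwrite-safe
        exists = False
    if not exists:
        spark.udf.register(function_name, udf_impl)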
If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.
Hi @tayloramy ,
Thanks for the response. It aligns perfectly with what I was seeing.
What ultimately worked for me:
I created a new Lakehouse, but this time left the “Lakehouse Schemas (Public Preview)” box unchecked.
This forced Fabric to use the classic OnelakeExternalCatalog path, where the function APIs are still implemented.
After that, my spark.catalog.functionExists() and UDF registration logic worked exactly as before.
Here is the limitation documented in the official docs:
https://learn.microsoft.com/en-us/fabric/data-engineering/lakehouse-schemas?#public-preview-limitati...
Hi @Padam_Prakash ,
Thank you for the update. As your issue has been resolved, please mark it as the solution so that other community members with the same problem can easily find the answer.
Best Regards,
Tejaswi.
Community Support
Hi @Padam_Prakash,
Glad to hear the problem is resolved.
If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.