I'm attempting to use code that works in SSMS to summarize a table by date, formatting the date as 'yyyy-MMM-dd'.
The cell throws an error: “an implementation is missing scala.Predef$.$qmark$qmark$qmark(Predef.scala:288) com.microsoft.fabric.spark.catalog.OnelakeExternalCatalog.functionExists(OnelakeExternalCatalog.scala:386) com.microsoft.fabric.spark.catalog.InstrumentedExternalCatalog.$anonfun$functionExists$1(OnelakeExternalCatalog.scala:607…)
I have tried the CONVERT equivalent and also casting the field to a date before formatting, but I still receive an error. I've exhausted my googling capabilities today – everything I read says it should work.
Solved! Go to Solution.
Hi @davemc69,
Thank you for sharing this problem.
You're encountering this issue because the FORMAT() function you're using is valid in T-SQL (SSMS) but not supported in Spark SQL, which is what Fabric notebooks are based on.
Spark SQL does not support the FORMAT function; it uses a different function, date_format, for formatting datetime values. That's why you're seeing an error like
scala.Predef$.$qmark$qmark$qmark ... functionExists ...
which is Spark's way of saying "I don't recognize this function."
Replace FORMAT with date_format in your SQL cell.
Applied to your existing query:
SELECT
COUNT(1) AS RecordCount,
date_format(DateTime2_Column, 'yyyy-MMM-dd') AS FormattedDateField
FROM
lakehouse.Schema.TableName
WHERE
DateTime2_Column >= TIMESTAMP('2024-01-01 00:00:00')
GROUP BY
date_format(DateTime2_Column, 'yyyy-MMM-dd')
ORDER BY
FormattedDateField DESC
This version is compatible with Fabric notebooks and will produce the same result you're expecting.
For your reference, you can also follow this link
https://spark.apache.org/docs/latest/sql-ref-functions.html#date_format
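As a quick sanity check of what that pattern produces (this example is mine, not from the thread): Spark's date_format uses Java-style datetime patterns, where 'MMM' is the abbreviated month name, so 'yyyy-MMM-dd' yields values like 2024-Jan-15. The closest plain-Python equivalent is strftime's '%Y-%b-%d':

```python
from datetime import datetime

# 'yyyy-MMM-dd' in Spark (Java-style pattern) roughly corresponds to
# '%Y-%b-%d' in Python's strftime: 4-digit year, abbreviated month, 2-digit day.
ts = datetime(2024, 1, 15, 13, 45)
formatted = ts.strftime("%Y-%b-%d")
print(formatted)  # 2024-Jan-15
```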
Let me know if this resolves your issue — and if helpful, please mark this as the solution to assist others facing the same challenge.
Thanks!
– Shashi Paul | Fabric Community Member
Hello @davemc69,
Could you please confirm if your query has been resolved by the provided solutions? This would be helpful for other members who may encounter similar issues.
Thank you for being part of the Microsoft Fabric Community.
Hello @davemc69,
We hope you're doing well. Could you please confirm whether your issue has been resolved or if you're still facing challenges? Your update will be valuable to the community and may assist others with similar concerns.
Thank you.
Hello @davemc69,
Hope everything’s going great with you. Just checking in: has the issue been resolved, or are you still running into problems? Sharing an update can really help others facing the same thing.
Thank you.
Hello @davemc69,
Thank you for reaching out to the Microsoft Fabric Community Forum and thanks @Nasif_Azam & @shashiPaul1570_ for sharing your valuable insights.
I have reproduced your scenario using Microsoft Fabric Notebook by creating a sample table with datetime values. As you correctly observed, the FORMAT() function works in T-SQL (SQL Server Management Studio), but it is not supported in Spark SQL, which is why you're encountering the "missing implementation" error.
To achieve the same formatted output in Spark SQL, the equivalent function is date_format().
Working Spark SQL Query:
SELECT
COUNT(1) AS RecordCount,
date_format(DateTime2_Column, 'yyyy-MMM-dd') AS FormattedDateField
FROM SampleTable
WHERE DateTime2_Column >= TIMESTAMP('2024-01-01 00:00:00')
GROUP BY date_format(DateTime2_Column, 'yyyy-MMM-dd')
ORDER BY FormattedDateField DESC;
This query successfully groups and formats the datetime column as required.
Best regards,
Ganesh Singamshetty.
Hey @davemc69 ,
You are running into this issue because Spark SQL does not support the FORMAT() function. It is valid in T-SQL (as in SSMS) but not in Spark's SQL dialect, which is what Fabric Spark notebooks and Lakehouse queries use. Instead, you should use date_format(), Spark SQL's equivalent for formatting datetime values.
Try the query in Spark SQL:
SELECT
COUNT(1) AS RecordCount,
date_format(DateTime2_Column, 'yyyy-MMM-dd') AS FormattedDateField
FROM lakehouse.Schema.TableName
WHERE DateTime2_Column >= TIMESTAMP('2024-01-01 00:00:00')
GROUP BY date_format(DateTime2_Column, 'yyyy-MMM-dd')
ORDER BY FormattedDateField DESC;
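To illustrate what the query does outside Spark, here is a plain-Python sketch of the same filter/group/count logic (the sample datetimes are hypothetical, standing in for DateTime2_Column values):

```python
from collections import Counter
from datetime import datetime

# Hypothetical rows standing in for DateTime2_Column values.
rows = [
    datetime(2024, 1, 15, 9, 0),
    datetime(2024, 1, 15, 17, 30),
    datetime(2024, 2, 3, 12, 0),
    datetime(2023, 12, 31, 23, 59),  # excluded by the WHERE clause
]

cutoff = datetime(2024, 1, 1)

# WHERE DateTime2_Column >= '2024-01-01', then GROUP BY the formatted date.
# '%Y-%b-%d' mirrors Spark's 'yyyy-MMM-dd' pattern.
counts = Counter(ts.strftime("%Y-%b-%d") for ts in rows if ts >= cutoff)

# ORDER BY FormattedDateField DESC (sorting the formatted strings)
for day, n in sorted(counts.items(), reverse=True):
    print(day, n)
```

One caveat worth noting: because FormattedDateField is a string, ORDER BY sorts the month abbreviations alphabetically ('Feb' before 'Jan'), not chronologically. If chronological order matters, order by the underlying datetime column instead.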
If you found this solution helpful, please consider accepting it and giving it kudos (Like); it’s greatly appreciated and helps others find the solution more easily.
Best Regards,
Nasif Azam