davemc69
Frequent Visitor

FORMAT DATETIME Field spark sql notebook cell

 

I'm attempting to use code that works in SSMS to summarize a table by date, formatting the date as 'yyyy-MMM-dd'.

SELECT COUNT(1) AS RecordCount, FORMAT(DateTime2_Column, 'yyyy-MM-dd') AS FormattedDateField
FROM lakehouse.Schema.TableName
WHERE DateTime2_Column>= CAST('2024-01-01 00:00:00' AS TIMESTAMP)
GROUP BY FORMAT(DateTime2_Column, 'yyyy-MM-dd')
ORDER BY FORMAT(DateTime2_Column, 'yyyy-MM-dd') DESC;
 
I have verified that the column is a datetime2 data type and even checked for nulls. Everything that I have found via searching states that this should work.

 

The cell code throws an error: “an implementation is missing scala.Predef$.$qmark$qmark$qmark(Predef.scala:288) com.microsoft.fabric.spark.catalog.OnelakeExternalCatalog.functionExists(OnelakeExternalCatalog.scala:386) com.microsoft.fabric.spark.catalog.InstrumentedExternalCatalog.$anonfun$functionExists$1(OnelakeExternalCatalog.scala:607…)

 

I have tried the CONVERT equivalent and also casting the field to a date prior to formatting, but I still receive an error. I've exhausted my googling capabilities today; everything that I read says that it should work.

1 ACCEPTED SOLUTION
shashiPaul1570_
Resolver IV

Hi @davemc69
Thank you for sharing this problem. 
You're encountering this issue because the FORMAT() function you're using is valid in T-SQL (SSMS) but not supported in Spark SQL, which is what Fabric notebooks are based on.

 

Why the error occurs

Spark SQL does not support the FORMAT function. Instead, it uses a different function called date_format for formatting datetime values. That’s why you're seeing an error like:

scala.Predef$.$qmark$qmark$qmark ... functionExists ...

In other words, Spark is saying "I don't recognize this function."
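If you want to confirm this yourself, here is a minimal PySpark sketch you could run in a notebook cell (no table needed; spark is the session the notebook provides):

# Minimal check from a PySpark cell: FORMAT is not a registered Spark SQL
# function, while date_format is. Expect the first call to fail with an
# error similar to the one in your post.
try:
    spark.sql("SELECT FORMAT(current_timestamp(), 'yyyy-MMM-dd')").show()
except Exception as err:
    print("FORMAT failed:", type(err).__name__)

spark.sql("SELECT date_format(current_timestamp(), 'yyyy-MMM-dd') AS formatted").show()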

How to fix it

Replace FORMAT with date_format in your SQL cell.

Applied to your existing query:

SELECT 
  COUNT(1) AS RecordCount, 
  date_format(DateTime2_Column, 'yyyy-MMM-dd') AS FormattedDateField
FROM 
  lakehouse.Schema.TableName
WHERE 
  DateTime2_Column >= TIMESTAMP('2024-01-01 00:00:00')
GROUP BY 
  date_format(DateTime2_Column, 'yyyy-MMM-dd')
ORDER BY 
  FormattedDateField DESC

This version is compatible with Fabric notebooks and will produce the same result you're expecting.
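If you'd rather stay in the DataFrame API in the same notebook, here is a minimal PySpark sketch of the equivalent query, assuming the same table and column names from your post:

# PySpark equivalent of the Spark SQL fix above; spark is the session
# provided by the Fabric notebook.
from pyspark.sql import functions as F

df = spark.read.table("lakehouse.Schema.TableName")

result = (
    df.filter(F.col("DateTime2_Column") >= F.lit("2024-01-01 00:00:00").cast("timestamp"))
      .withColumn("FormattedDateField", F.date_format("DateTime2_Column", "yyyy-MMM-dd"))
      .groupBy("FormattedDateField")
      .agg(F.count(F.lit(1)).alias("RecordCount"))
      .orderBy(F.col("FormattedDateField").desc())
)
result.show()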

 

For your reference, you can also follow this link

https://spark.apache.org/docs/latest/sql-ref-functions.html#date_format

 

Let me know if this resolves your issue — and if helpful, please mark this as the solution to assist others facing the same challenge.
Thanks!
– Shashi Paul | Fabric Community Member


6 REPLIES
v-ssriganesh
Community Support

Hello @davemc69

Could you please confirm if your query has been resolved by the provided solutions? This would be helpful for other members who may encounter similar issues.

 

Thank you for being part of the Microsoft Fabric Community.

v-ssriganesh
Community Support

Hello @davemc69,

We hope you're doing well. Could you please confirm whether your issue has been resolved or if you're still facing challenges? Your update will be valuable to the community and may assist others with similar concerns.

Thank you.

v-ssriganesh
Community Support

Hello @davemc69,

Hope everything’s going great with you. Just checking in: has the issue been resolved, or are you still running into problems? Sharing an update can really help others facing the same thing.

Thank you.

 

v-ssriganesh
Community Support

Hello @davemc69,
Thank you for reaching out to the Microsoft Fabric Community Forum and thanks @Nasif_Azam & @shashiPaul1570_ for sharing your valuable insights.

I have reproduced your scenario using Microsoft Fabric Notebook by creating a sample table with datetime values. As you correctly observed, the FORMAT() function works in T-SQL (SQL Server Management Studio), but it is not supported in Spark SQL, which is why you're encountering the "missing implementation" error.

To achieve the same formatted output in Spark SQL, the equivalent function is date_format().

Working Spark SQL Query:

SELECT
  COUNT(1) AS RecordCount,
  date_format(DateTime2_Column, 'yyyy-MMM-dd') AS FormattedDateField
FROM SampleTable
WHERE DateTime2_Column >= TIMESTAMP('2024-01-01 00:00:00')
GROUP BY date_format(DateTime2_Column, 'yyyy-MMM-dd')
ORDER BY FormattedDateField DESC;

 

Output (sample): screenshot of the query result (image omitted).

 

This query successfully groups and formats the datetime column as required.
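For anyone who wants to reproduce this, here is a small, self-contained sketch of how such a sample table can be set up in a Fabric notebook (the rows below are made up, and SampleTable is just a temp view name used for illustration):

# Build a tiny temp view with timestamp values, then run the Spark SQL fix
# against it. spark is the session provided by the notebook.
from datetime import datetime

sample = spark.createDataFrame(
    [(datetime(2024, 1, 15, 8, 30),),
     (datetime(2024, 1, 15, 17, 5),),
     (datetime(2024, 2, 3, 12, 0),)],
    ["DateTime2_Column"],
)
sample.createOrReplaceTempView("SampleTable")

spark.sql("""
    SELECT COUNT(1) AS RecordCount,
           date_format(DateTime2_Column, 'yyyy-MMM-dd') AS FormattedDateField
    FROM SampleTable
    WHERE DateTime2_Column >= TIMESTAMP('2024-01-01 00:00:00')
    GROUP BY date_format(DateTime2_Column, 'yyyy-MMM-dd')
    ORDER BY FormattedDateField DESC
""").show()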

Best regards,
Ganesh Singamshetty.

Nasif_Azam
Super User

Hey @davemc69,

You are running into this issue because Spark SQL does not support the FORMAT() function, which is valid in T-SQL (like in SSMS) but not in Spark's SQL dialect, especially in environments like Fabric Spark notebooks or Lakehouse queries. Instead, you should use date_format(), which is Spark SQL’s equivalent for formatting datetime values.

 

Try the query in Spark SQL:

SELECT 
  COUNT(1) AS RecordCount,
  date_format(DateTime2_Column, 'yyyy-MMM-dd') AS FormattedDateField
FROM lakehouse.Schema.TableName
WHERE DateTime2_Column >= TIMESTAMP('2024-01-01 00:00:00')
GROUP BY date_format(DateTime2_Column, 'yyyy-MMM-dd')
ORDER BY FormattedDateField DESC;
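One detail worth calling out: Spark's date_format uses datetime pattern letters (similar to Java's), so 'MM' gives the month number while 'MMM' gives the abbreviated month name. A quick way to compare the two from a notebook cell, no table needed:

# Compare the 'MM' and 'MMM' month patterns in Spark's date_format.
spark.sql(
    "SELECT date_format(TIMESTAMP('2024-01-01 00:00:00'), 'yyyy-MM-dd')  AS numeric_month, "
    "       date_format(TIMESTAMP('2024-01-01 00:00:00'), 'yyyy-MMM-dd') AS named_month"
).show()
# With an English locale this prints 2024-01-01 and 2024-Jan-01.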

 

If you found this solution helpful, please consider accepting it and giving it a kudos (Like); it’s greatly appreciated and helps others find the solution more easily.


Best Regards,
Nasif Azam



Did I answer your question?
If so, mark my post as a solution!
Also consider helping someone else in the forums!

Proud to be a Super User!

