<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Notebook Spark Custom JDBC error converting timestamp in Data Engineering</title>
    <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Notebook-Spark-Custom-JDBC-error-converting-timestamp/m-p/4833371#M12486</link>
    <description>&lt;P&gt;Hi Thomas,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;It's a custom JDBC driver that has been downloaded, so it's the same driver version, just a different platform. My hypothesis is that some package differences between Fabric and Databricks are causing this.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have kept this workload on Databricks for now. Mapping to a string and converting later also works, but doing it for every table would require too much work in the metadata-driven for-each loop.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks anyway!&lt;/P&gt;</description>
    <pubDate>Tue, 23 Sep 2025 11:50:58 GMT</pubDate>
    <dc:creator>stan01</dc:creator>
    <dc:date>2025-09-23T11:50:58Z</dc:date>
    <item>
      <title>Notebook Spark Custom JDBC error converting timestamp</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Notebook-Spark-Custom-JDBC-error-converting-timestamp/m-p/4778513#M11303</link>
      <description>&lt;P&gt;Hi,&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am trying to import data into a lakehouse via a custom JDBC driver (from the Infor M3 ERP).&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am able to load the table with the same SQL and Python code on Azure Databricks, but running the same code on MS Fabric returns the following error: '&lt;SPAN&gt;Unrecognized SQL type - name: TIMESTAMP WITH TIME ZONE'. Both are using Spark 3.5.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Is it possible that a Spark setting enabled on Databricks is not enabled in Fabric?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have also been able to query data by converting timestamps, for example: '&lt;SPAN&gt;SELECT CAST(timestamp AS VARCHAR)'. However, I would then need to declare the schema on import (100+ columns), and I don't know which columns are datetimes.&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;This is the script I have:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;df = (
    spark.read
         .format("jdbc")
         .option("url", api_key)
         .option("driver", "com.infor.idl.jdbc.Driver")
         .option("preferTimestampNTZ", True)
         .option("query", "SELECT * FROM FGLEDG LIMIT 10")
         .load()
)
display(df)&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;And I get the following error:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;Py4JJavaError                             Traceback (most recent call last)
Cell In[10], line 8
      1 df = (
      2     spark.read
      3          .format("jdbc")
      4          .option("url", api_key)
      5          .option("driver", "com.infor.idl.jdbc.Driver")
      7          .option("query", "SELECT * FROM FGLEDG LIMIT 10")
----&amp;gt; 8          .load()
     9 )
     10 display(df)

File /opt/spark/python/lib/pyspark.zip/pyspark/sql/readwriter.py:314, in DataFrameReader.load(self, path, format, schema, **options)
    312     return self._df(self._jreader.load(self._spark._sc._jvm.PythonUtils.toSeq(path)))
    313 else:
--&amp;gt; 314     return self._df(self._jreader.load())

File ~/cluster-env/trident_env/lib/python3.11/site-packages/py4j/java_gateway.py:1322, in JavaMember.__call__(self, *args)
   1316 command = proto.CALL_COMMAND_NAME +\
   1317     self.command_header +\
   1318     args_command +\
   1319     proto.END_COMMAND_PART
   1321 answer = self.gateway_client.send_command(command)
-&amp;gt; 1322 return_value = get_return_value(
   1323     answer, self.gateway_client, self.target_id, self.name)
   1325 for temp_arg in temp_args:
   1326     if hasattr(temp_arg, "_detach"):

File /opt/spark/python/lib/pyspark.zip/pyspark/errors/exceptions/captured.py:179, in capture_sql_exception.&amp;lt;locals&amp;gt;.deco(*a, **kw)
    177 def deco(*a: Any, **kw: Any) -&amp;gt; Any:
    178     try:
--&amp;gt; 179         return f(*a, **kw)
    180     except Py4JJavaError as e:
    181         converted = convert_exception(e.java_exception)

File ~/cluster-env/trident_env/lib/python3.11/site-packages/py4j/protocol.py:326, in get_return_value(answer, gateway_client, target_id, name)
    324 value = OUTPUT_CONVERTER[type](answer[2:], gateway_client)
    325 if answer[1] == REFERENCE_TYPE:
--&amp;gt; 326     raise Py4JJavaError(
    327         "An error occurred while calling {0}{1}{2}.\n".
    328         format(target_id, ".", name), value)
    329 else:
    330     raise Py4JError(
    331         "An error occurred while calling {0}{1}{2}. Trace:\n{3}\n".
    332         format(target_id, ".", name, value))

Py4JJavaError: An error occurred while calling o7255.load.
: org.apache.spark.SparkSQLException: [UNRECOGNIZED_SQL_TYPE] Unrecognized SQL type - name: TIMESTAMP WITH TIME ZONE, id: TIMESTAMP_WITH_TIMEZONE.
	at org.apache.spark.sql.errors.QueryExecutionErrors$.unrecognizedSqlTypeError(QueryExecutionErrors.scala:996)
	at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.getCatalystType(JdbcUtils.scala:228)
	at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.$anonfun$getSchema$1(JdbcUtils.scala:308)
	at scala.Option.getOrElse(Option.scala:189)
	at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.getSchema(JdbcUtils.scala:308)
	at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.getQueryOutputSchema(JDBCRDD.scala:71)
	at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:58)
	at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.getSchema(JDBCRelation.scala:241)
	at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:37)
	at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:346)
	at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:236)
	at org.apache.spark.sql.DataFrameReader.$anonfun$load$2(DataFrameReader.scala:219)
	at scala.Option.getOrElse(Option.scala:189)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:219)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:174)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:374)
	at py4j.Gateway.invoke(Gateway.java:282)
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
	at py4j.commands.CallCommand.execute(CallCommand.java:79)
	at py4j.GatewayConnection.run(GatewayConnection.java:238)
	at java.base/java.lang.Thread.run(Thread.java:829)&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 28 Jul 2025 11:34:40 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Notebook-Spark-Custom-JDBC-error-converting-timestamp/m-p/4778513#M11303</guid>
      <dc:creator>stan01</dc:creator>
      <dc:date>2025-07-28T11:34:40Z</dc:date>
    </item>
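The CAST workaround described in the question can be automated instead of hand-declaring 100+ columns: read the column type names once (for example from the driver's java.sql DatabaseMetaData) and generate the SELECT so that only timestamp-typed columns are wrapped in CAST. This is a sketch, not part of the original post; the column names below are hypothetical examples, not real FGLEDG columns.

```python
def build_select(table, columns):
    """Build a SELECT that casts timestamp-typed columns to VARCHAR.

    columns is a list of (column_name, jdbc_type_name) pairs, e.g. as
    reported by the JDBC driver's metadata. Only columns whose type name
    contains TIMESTAMP are wrapped in CAST; the rest pass through as-is.
    """
    parts = []
    for name, type_name in columns:
        if "TIMESTAMP" in type_name.upper():
            # Cast so Spark's JDBC reader never sees the unsupported type.
            parts.append(f"CAST({name} AS VARCHAR) AS {name}")
        else:
            parts.append(name)
    return f"SELECT {', '.join(parts)} FROM {table}"


# Hypothetical column metadata for illustration:
cols = [("EGANO", "NUMERIC"), ("EGRGDT", "TIMESTAMP WITH TIME ZONE")]
query = build_select("FGLEDG", cols)
# query can then be passed to .option("query", query) in the JDBC read.
```

The generated query would replace the literal "SELECT * FROM FGLEDG LIMIT 10" in the post's script, which is what makes it usable inside a metadata-driven loop.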
    <item>
      <title>Re: Notebook Spark Custom JDBC error converting timestamp</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Notebook-Spark-Custom-JDBC-error-converting-timestamp/m-p/4820281#M12204</link>
      <description>&lt;P&gt;It could be that the JDBC driver in Fabric is different from the one in Databricks. Check the versions. You can load a different driver in Fabric if needed, but it would have to be in a custom Spark environment.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;You could also map the column to a string and convert it later.&lt;/P&gt;</description>
      <pubDate>Mon, 08 Sep 2025 15:12:20 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Notebook-Spark-Custom-JDBC-error-converting-timestamp/m-p/4820281#M12204</guid>
      <dc:creator>Thomaslleblanc</dc:creator>
      <dc:date>2025-09-08T15:12:20Z</dc:date>
    </item>
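The "convert it later" half of the suggestion above can also be generated rather than hand-written per table: once the timestamp columns have arrived as strings, build the selectExpr arguments that cast them back. A minimal sketch, assuming you already know (e.g. from driver metadata) which columns were string-ified; the column names are hypothetical.

```python
def recast_exprs(all_columns, ts_columns):
    """Build selectExpr arguments that restore string-ified timestamps.

    all_columns is the ordered list of column names in the loaded frame;
    ts_columns is the set of names that were read as strings and should be
    converted back with Spark SQL's to_timestamp. Other columns pass through.
    """
    exprs = []
    for name in all_columns:
        if name in ts_columns:
            exprs.append(f"to_timestamp({name}) AS {name}")
        else:
            exprs.append(name)
    return exprs


# Hypothetical usage after the JDBC load:
#   df = df.selectExpr(*recast_exprs(df.columns, ts_cols))
exprs = recast_exprs(["EGANO", "EGRGDT"], {"EGRGDT"})
```

Driving both the CAST-on-read and the recast-on-load from the same metadata is what would let a single for-each loop handle all tables.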
    <item>
      <title>Re: Notebook Spark Custom JDBC error converting timestamp</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Notebook-Spark-Custom-JDBC-error-converting-timestamp/m-p/4821974#M12259</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/1291138"&gt;@stan01&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thanks for reaching out to the Microsoft Fabric community forum.&lt;/P&gt;
&lt;P&gt;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/498348"&gt;@Thomaslleblanc&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;
&lt;P&gt;Thanks for your prompt response.&lt;/P&gt;
&lt;P&gt;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/1291138"&gt;@stan01&lt;/a&gt;&amp;nbsp;,&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I wanted to follow up and confirm whether you’ve had the opportunity to review the information provided by &lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/498348"&gt;@Thomaslleblanc&lt;/a&gt;&amp;nbsp;. Should you have any questions or require further clarification, please don't hesitate to reach out.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;We appreciate your engagement and thank you for being an active part of the community.&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Best regards,&lt;BR /&gt;Lakshmi&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 10 Sep 2025 05:58:03 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Notebook-Spark-Custom-JDBC-error-converting-timestamp/m-p/4821974#M12259</guid>
      <dc:creator>v-lgarikapat</dc:creator>
      <dc:date>2025-09-10T05:58:03Z</dc:date>
    </item>
    <item>
      <title>Re: Notebook Spark Custom JDBC error converting timestamp</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Notebook-Spark-Custom-JDBC-error-converting-timestamp/m-p/4826032#M12334</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/1291138"&gt;@stan01&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;
&lt;P&gt;&lt;FONT&gt;&lt;SPAN&gt;We’d like to confirm whether your issue has been successfully resolved. If you still have any questions or need further assistance, please don’t hesitate to reach out. We’re more than happy to continue supporting you.&lt;/SPAN&gt;&lt;/FONT&gt;&lt;/P&gt;
&lt;P&gt;We appreciate your engagement and thank you for being an active part of the community.&lt;/P&gt;
&lt;P&gt;&lt;FONT&gt;&lt;BR /&gt;&lt;STRONG&gt;Best Regards,&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;Lakshmi.&lt;/STRONG&gt;&lt;/FONT&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 15 Sep 2025 08:16:12 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Notebook-Spark-Custom-JDBC-error-converting-timestamp/m-p/4826032#M12334</guid>
      <dc:creator>v-lgarikapat</dc:creator>
      <dc:date>2025-09-15T08:16:12Z</dc:date>
    </item>
    <item>
      <title>Re: Notebook Spark Custom JDBC error converting timestamp</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Notebook-Spark-Custom-JDBC-error-converting-timestamp/m-p/4832070#M12457</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;A href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/1291138" target="_blank"&gt;@stan01&lt;/A&gt;&amp;nbsp;,&lt;/P&gt;
&lt;P&gt;&lt;FONT&gt;&lt;SPAN&gt;We’d like to confirm whether your issue has been successfully resolved. If you still have any questions or need further assistance, please don’t hesitate to reach out. We’re more than happy to continue supporting you.&lt;/SPAN&gt;&lt;/FONT&gt;&lt;/P&gt;
&lt;P&gt;We appreciate your engagement and thank you for being an active part of the community.&lt;/P&gt;
&lt;P&gt;&lt;FONT&gt;&lt;BR /&gt;&lt;STRONG&gt;Best Regards,&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;Lakshmi.&lt;/STRONG&gt;&lt;/FONT&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 22 Sep 2025 09:11:33 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Notebook-Spark-Custom-JDBC-error-converting-timestamp/m-p/4832070#M12457</guid>
      <dc:creator>v-lgarikapat</dc:creator>
      <dc:date>2025-09-22T09:11:33Z</dc:date>
    </item>
    <item>
      <title>Re: Notebook Spark Custom JDBC error converting timestamp</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Notebook-Spark-Custom-JDBC-error-converting-timestamp/m-p/4833371#M12486</link>
      <description>&lt;P&gt;Hi Thomas,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;It's a custom JDBC driver that has been downloaded, so it's the same driver version, just a different platform. My hypothesis is that some package differences between Fabric and Databricks are causing this.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have kept this workload on Databricks for now. Mapping to a string and converting later also works, but doing it for every table would require too much work in the metadata-driven for-each loop.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks anyway!&lt;/P&gt;</description>
      <pubDate>Tue, 23 Sep 2025 11:50:58 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Notebook-Spark-Custom-JDBC-error-converting-timestamp/m-p/4833371#M12486</guid>
      <dc:creator>stan01</dc:creator>
      <dc:date>2025-09-23T11:50:58Z</dc:date>
    </item>
  </channel>
</rss>

