<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Delete error for a lakehouse table as an eventstream destination in Data Engineering</title>
    <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Delete-error-for-a-lakehouse-table-as-an-evenstream-destination/m-p/4263947#M4895</link>
    <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;I have a lakehouse table as the destination for an eventstream.&lt;/P&gt;&lt;P&gt;At regular intervals a notebook must delete a portion of this table's data. The table has a field defined as VARCHAR(100), like this:&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pmscorca_0-1730295121452.png" style="width: 400px;"&gt;&lt;img src="https://community.fabric.microsoft.com/t5/image/serverpage/image-id/1191810iE7D941C7542C1674/image-size/medium?v=v2&amp;amp;px=400" role="button" title="pmscorca_0-1730295121452.png" alt="pmscorca_0-1730295121452.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;For this specific field, the eventstream writes some values longer than 100 characters to the lakehouse table.&lt;/P&gt;&lt;P&gt;When the row deletion is executed, this error arises:&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;Notebook execution failed at Notebook service with http status code - '200', please check the Run logs on Notebook, additional details - 'Error name - Py4JJavaError, Error value - An error occurred while calling o322.sql.&lt;BR /&gt;: org.apache.spark.sql.delta.schema.DeltaInvariantViolationException: [DELTA_EXCEED_CHAR_VARCHAR_LIMIT] Exceeds char/varchar type length limitation. 
Failed check: (isnull('mycolumn) OR (length('mycolumn) &amp;lt;= 100)).&lt;BR /&gt;at org.apache.spark.sql.delta.schema.DeltaInvariantViolationException$.getCharVarcharLengthInvariantViolationException(InvariantViolationException.scala:64)&lt;BR /&gt;at org.apache.spark.sql.delta.schema.DeltaInvariantViolationException$.apply(InvariantViolationException.scala:89)&lt;BR /&gt;at org.apache.spark.sql.delta.schema.DeltaInvariantViolationException$.apply(InvariantViolationException.scala:112)&lt;BR /&gt;at org.apache.spark.sql.delta.schema.DeltaInvariantViolationException.apply(InvariantViolationException.scala)&lt;BR /&gt;at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.CheckDeltaInvariant_12$(Unknown Source)&lt;BR /&gt;at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.writeFields_0_1$(Unknown Source)&lt;BR /&gt;at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.apply(Unknown Source)&lt;BR /&gt;at org.apache.spark.sql.delta.constraints.DeltaInvariantCheckerExec.$anonfun$doExecute$3(DeltaInvariantCheckerExec.scala:89)&lt;BR /&gt;at scala.collection.Iterator$$anon$10.next(Iterator.scala:461)&lt;BR /&gt;at org.apache.spark.sql.execution.datasources.FileFormatDataWriter.writeWithIterator(FileFormatDataWriter.scala:117)&lt;BR /&gt;at org.apache.spark.sql.delta.files.DeltaFileFormatWriter$.$anonfun$executeTask$1(DeltaFileFormatWriter.scala:440)&lt;BR /&gt;at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1397)&lt;BR /&gt;at org.apache.spark.sql.delta.files.DeltaFileFormatWriter$.executeTask(DeltaFileFormatWriter.scala:447)&lt;BR /&gt;at org.apache.spark.sql.delta.files.DeltaFileFormatWriter$.$anonfun$executeWrite$2(DeltaFileFormatWriter.scala:284)&lt;BR /&gt;at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:93)&lt;BR /&gt;at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166)&lt;BR /&gt;at 
org.apache.spark.scheduler.Task.run(Task.scala:141)&lt;BR /&gt;at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620)&lt;BR /&gt;at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)&lt;BR /&gt;at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)&lt;BR /&gt;at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94)&lt;BR /&gt;at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623)&lt;BR /&gt;at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)&lt;BR /&gt;at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)&lt;BR /&gt;at java.base/java.lang.Thread.run(Thread.java:829)&lt;BR /&gt;&lt;BR /&gt;I would expect a data schema issue to arise for both the insert (by the eventstream) and the delete operations, not only for the delete statement.&lt;/P&gt;&lt;P&gt;Any suggestions, please? Thanks&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
    <pubDate>Wed, 30 Oct 2024 13:36:43 GMT</pubDate>
    <dc:creator>pmscorca</dc:creator>
    <dc:date>2024-10-30T13:36:43Z</dc:date>
    <item>
      <title>Delete error for a lakehouse table as an eventstream destination</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Delete-error-for-a-lakehouse-table-as-an-evenstream-destination/m-p/4263947#M4895</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;I have a lakehouse table as the destination for an eventstream.&lt;/P&gt;&lt;P&gt;At regular intervals a notebook must delete a portion of this table's data. The table has a field defined as VARCHAR(100), like this:&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pmscorca_0-1730295121452.png" style="width: 400px;"&gt;&lt;img src="https://community.fabric.microsoft.com/t5/image/serverpage/image-id/1191810iE7D941C7542C1674/image-size/medium?v=v2&amp;amp;px=400" role="button" title="pmscorca_0-1730295121452.png" alt="pmscorca_0-1730295121452.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;For this specific field, the eventstream writes some values longer than 100 characters to the lakehouse table.&lt;/P&gt;&lt;P&gt;When the row deletion is executed, this error arises:&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;Notebook execution failed at Notebook service with http status code - '200', please check the Run logs on Notebook, additional details - 'Error name - Py4JJavaError, Error value - An error occurred while calling o322.sql.&lt;BR /&gt;: org.apache.spark.sql.delta.schema.DeltaInvariantViolationException: [DELTA_EXCEED_CHAR_VARCHAR_LIMIT] Exceeds char/varchar type length limitation. 
Failed check: (isnull('mycolumn) OR (length('mycolumn) &amp;lt;= 100)).&lt;BR /&gt;at org.apache.spark.sql.delta.schema.DeltaInvariantViolationException$.getCharVarcharLengthInvariantViolationException(InvariantViolationException.scala:64)&lt;BR /&gt;at org.apache.spark.sql.delta.schema.DeltaInvariantViolationException$.apply(InvariantViolationException.scala:89)&lt;BR /&gt;at org.apache.spark.sql.delta.schema.DeltaInvariantViolationException$.apply(InvariantViolationException.scala:112)&lt;BR /&gt;at org.apache.spark.sql.delta.schema.DeltaInvariantViolationException.apply(InvariantViolationException.scala)&lt;BR /&gt;at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.CheckDeltaInvariant_12$(Unknown Source)&lt;BR /&gt;at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.writeFields_0_1$(Unknown Source)&lt;BR /&gt;at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.apply(Unknown Source)&lt;BR /&gt;at org.apache.spark.sql.delta.constraints.DeltaInvariantCheckerExec.$anonfun$doExecute$3(DeltaInvariantCheckerExec.scala:89)&lt;BR /&gt;at scala.collection.Iterator$$anon$10.next(Iterator.scala:461)&lt;BR /&gt;at org.apache.spark.sql.execution.datasources.FileFormatDataWriter.writeWithIterator(FileFormatDataWriter.scala:117)&lt;BR /&gt;at org.apache.spark.sql.delta.files.DeltaFileFormatWriter$.$anonfun$executeTask$1(DeltaFileFormatWriter.scala:440)&lt;BR /&gt;at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1397)&lt;BR /&gt;at org.apache.spark.sql.delta.files.DeltaFileFormatWriter$.executeTask(DeltaFileFormatWriter.scala:447)&lt;BR /&gt;at org.apache.spark.sql.delta.files.DeltaFileFormatWriter$.$anonfun$executeWrite$2(DeltaFileFormatWriter.scala:284)&lt;BR /&gt;at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:93)&lt;BR /&gt;at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166)&lt;BR /&gt;at 
org.apache.spark.scheduler.Task.run(Task.scala:141)&lt;BR /&gt;at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620)&lt;BR /&gt;at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)&lt;BR /&gt;at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)&lt;BR /&gt;at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94)&lt;BR /&gt;at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623)&lt;BR /&gt;at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)&lt;BR /&gt;at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)&lt;BR /&gt;at java.base/java.lang.Thread.run(Thread.java:829)&lt;BR /&gt;&lt;BR /&gt;I would expect a data schema issue to arise for both the insert (by the eventstream) and the delete operations, not only for the delete statement.&lt;/P&gt;&lt;P&gt;Any suggestions, please? Thanks&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 30 Oct 2024 13:36:43 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Delete-error-for-a-lakehouse-table-as-an-evenstream-destination/m-p/4263947#M4895</guid>
      <dc:creator>pmscorca</dc:creator>
      <dc:date>2024-10-30T13:36:43Z</dc:date>
    </item>
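The check reported in the stack trace passes when the column is NULL or its value is at most 100 characters long. It can be sketched in plain Python; this is a simulation of the invariant only, not Delta's implementation, and the column name `mycolumn` and the sample rows are illustrative.

```python
# Plain-Python sketch (not Spark) of the Delta length invariant from the
# stack trace: a row passes when the column is NULL or the value is at
# most 100 characters. Delta enforces this check on every write; a DELETE
# is implemented as a rewrite of the affected files, so surviving rows go
# through the check again.

MAX_LEN = 100  # the VARCHAR(100) limit from the table schema

def violates_invariant(value, max_len=MAX_LEN):
    """True when a non-null value is longer than the declared limit."""
    return value is not None and len(value) > max_len

def violating_rows(rows, column="mycolumn"):
    """Rows whose column value would fail the Delta length invariant."""
    return [r for r in rows if violates_invariant(r[column])]

rows = [
    {"id": 1, "mycolumn": "short value"},
    {"id": 2, "mycolumn": None},        # NULL always passes the check
    {"id": 3, "mycolumn": "x" * 150},   # 150 chars: exceeds VARCHAR(100)
]
bad = violating_rows(rows)  # only the 150-char row fails
```

Rows like the third one are exactly the ones the notebook's DELETE rewrite would trip over, even if the DELETE predicate itself doesn't target them.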
    <item>
      <title>Re: Delete error for a lakehouse table as an eventstream destination</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Delete-error-for-a-lakehouse-table-as-an-evenstream-destination/m-p/4264911#M4932</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/719515"&gt;@pmscorca&lt;/a&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I found a thread related to a similar error message; its solutions might be helpful:&lt;/P&gt;
&lt;P&gt;&lt;A href="https://community.databricks.com/t5/data-engineering/delta-exceed-char-varchar-limit/td-p/61283" target="_blank" rel="noopener"&gt;DELTA_EXCEED_CHAR_VARCHAR_LIMIT - Databricks Community&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Best Regards,&lt;BR /&gt;Jing&lt;BR /&gt;&lt;EM&gt;If this post helps, please Accept it as Solution to help other members find it. Appreciate your Kudos! &lt;/EM&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 31 Oct 2024 05:16:07 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Delete-error-for-a-lakehouse-table-as-an-evenstream-destination/m-p/4264911#M4932</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2024-10-31T05:16:07Z</dc:date>
    </item>
    <item>
      <title>Re: Delete error for a lakehouse table as an eventstream destination</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Delete-error-for-a-lakehouse-table-as-an-evenstream-destination/m-p/4265235#M4934</link>
      <description>&lt;P&gt;Hi, thanks for your reply.&lt;/P&gt;&lt;P&gt;The&amp;nbsp;&lt;SPAN&gt;DELTA_EXCEED_CHAR_VARCHAR_LIMIT error refers to a specific field defined as VARCHAR(100).&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Some of the corresponding values in the incoming events can be longer than 100 characters.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;When the eventstream writes these values into the lakehouse table, no errors occur; but when a delete notebook runs at regular intervals, this error occurs.&lt;BR /&gt;&lt;BR /&gt;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pmscorca_0-1730363739076.png" style="width: 999px;"&gt;&lt;img src="https://community.fabric.microsoft.com/t5/image/serverpage/image-id/1192329i25E3F9723C9B05A0/image-size/large?v=v2&amp;amp;px=999" role="button" title="pmscorca_0-1730363739076.png" alt="pmscorca_0-1730363739076.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;This behaviour is inconsistent: I would expect the DELTA_EXCEED_CHAR_VARCHAR_LIMIT error either always, for both the insert and the delete operations, or never; I don't expect it for only one of the possible data manipulations on the lakehouse. Such a situation isn't coherent.&lt;BR /&gt;&lt;BR /&gt;&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 31 Oct 2024 08:40:01 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Delete-error-for-a-lakehouse-table-as-an-evenstream-destination/m-p/4265235#M4934</guid>
      <dc:creator>pmscorca</dc:creator>
      <dc:date>2024-10-31T08:40:01Z</dc:date>
    </item>
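One plausible explanation for the insert/delete asymmetry described above (an assumption, not confirmed anywhere in this thread): the eventstream's ingestion path appends files without evaluating Spark's char/varchar invariant, while a notebook DELETE rewrites the affected files through Spark's write path, which re-validates every surviving row it writes. A plain-Python simulation of that model follows; all function and column names are illustrative.

```python
# Sketch of why an append (eventstream) can succeed while a later Spark
# DELETE fails. Assumed model: the eventstream writer does not evaluate
# the char/varchar invariant, but Spark's write path does, and a DELETE
# rewrites surviving rows through that enforcing path.

MAX_LEN = 100

def spark_write(rows):
    # Spark's write path enforces the length invariant on every row.
    for r in rows:
        v = r["mycolumn"]
        if v is not None and len(v) > MAX_LEN:
            raise ValueError("DELTA_EXCEED_CHAR_VARCHAR_LIMIT")
    return list(rows)

def eventstream_append(table, rows):
    # Modeled as a raw append: no invariant evaluation on this path.
    table.extend(rows)

def spark_delete(table, predicate):
    # DELETE = rewrite the table without the matching rows; surviving
    # rows pass through the enforcing write path again.
    survivors = [r for r in table if not predicate(r)]
    return spark_write(survivors)

table = []
eventstream_append(table, [
    {"id": 1, "mycolumn": "ok"},
    {"id": 2, "mycolumn": "y" * 150},   # oversize row accepted on append
])

# Deleting id == 1 forces a rewrite that re-checks the oversize row id == 2:
try:
    spark_delete(table, lambda r: r["id"] == 1)
    failed = False
except ValueError:
    failed = True
```

In this model the error is raised by the row that survives the delete, not by the row being deleted, which matches the behaviour reported in the thread.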
    <item>
      <title>Re: Delete error for a lakehouse table as an eventstream destination</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Delete-error-for-a-lakehouse-table-as-an-evenstream-destination/m-p/4266367#M4953</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/719515"&gt;@pmscorca&lt;/a&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;If you pause the Eventstream, will this error&amp;nbsp;&lt;SPAN&gt;still arise when deleting the data with the notebook? In addition, for a lakehouse that is not the destination of an Eventstream, does this error arise?&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Best Regards,&lt;BR /&gt;Jing&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 01 Nov 2024 03:11:08 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Delete-error-for-a-lakehouse-table-as-an-evenstream-destination/m-p/4266367#M4953</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2024-11-01T03:11:08Z</dc:date>
    </item>
    <item>
      <title>Re: Delete error for a lakehouse table as an eventstream destination</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Delete-error-for-a-lakehouse-table-as-an-evenstream-destination/m-p/4267310#M4975</link>
      <description>&lt;P&gt;Hi, thanks for your reply.&lt;/P&gt;&lt;P&gt;I don't think this issue concerns the eventstream; it relates to the lakehouse, and in a production scenario I cannot pause the eventstream.&lt;/P&gt;&lt;P&gt;So, I don't understand why it is possible to write a value of more than 100 characters into a field defined as VARCHAR(100), and possible to update event rows in the lakehouse, but not possible to delete event rows.&lt;/P&gt;&lt;P&gt;The&amp;nbsp;&lt;SPAN&gt;DELTA_EXCEED_CHAR_VARCHAR_LIMIT error doesn't refer to impossible concurrent operations.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;</description>
      <pubDate>Fri, 01 Nov 2024 18:25:05 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Delete-error-for-a-lakehouse-table-as-an-evenstream-destination/m-p/4267310#M4975</guid>
      <dc:creator>pmscorca</dc:creator>
      <dc:date>2024-11-01T18:25:05Z</dc:date>
    </item>
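Two mitigation sketches that follow from the discussion above, to be verified in your own environment; the table and column names are placeholders, and whether the type change is metadata-only for a given Delta/Fabric runtime version is an assumption, not something this thread confirms.

```sql
-- 1) Widen the column so no length invariant applies. Spark/Delta store
--    VARCHAR(n) physically as a string with a length check, so widening
--    to STRING may be accepted as a metadata-only change:
ALTER TABLE mylakehouse.mytable ALTER COLUMN mycolumn TYPE STRING;

-- 2) Or, before the DELETE runs, trim the oversize values the
--    eventstream wrote, so every surviving row passes re-validation:
UPDATE mylakehouse.mytable
SET mycolumn = substring(mycolumn, 1, 100)
WHERE length(mycolumn) > 100;
```

Option 1 removes the invariant permanently; option 2 keeps the VARCHAR(100) contract but silently truncates data, so choose based on whether the over-length values must be preserved.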
  </channel>
</rss>

