pmscorca
Post Prodigy

Delete error for a lakehouse table as an eventstream destination

Hi,

I have a lakehouse table as the destination for an eventstream.

At regular intervals a notebook must delete a portion of this table's data. The table has a field defined as VARCHAR(100), as shown here:

[Screenshot pmscorca_0-1730295121452.png: the VARCHAR(100) column definition]
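For reference, a minimal sketch of the kind of cleanup the notebook runs. The table name events and the retention predicate are hypothetical placeholders, not taken from my actual setup:

from pyspark.sql import SparkSession

# In a Fabric notebook the Spark session already exists; getOrCreate() just reuses it.
spark = SparkSession.builder.getOrCreate()

# Delete a time-based portion of the Delta table; "events" and the
# 7-day window are placeholders for illustration only.
spark.sql("""
    DELETE FROM events
    WHERE event_time < date_sub(current_date(), 7)
""")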

For this specific field, the eventstream writes some values to the lakehouse table that are longer than 100 characters.

When the row deletion is executed, this error arises:


Notebook execution failed at Notebook service with http status code - '200', please check the Run logs on Notebook, additional details - 'Error name - Py4JJavaError, Error value - An error occurred while calling o322.sql.
: org.apache.spark.sql.delta.schema.DeltaInvariantViolationException: [DELTA_EXCEED_CHAR_VARCHAR_LIMIT] Exceeds char/varchar type length limitation. Failed check: (isnull('mycolumn) OR (length('mycolumn) <= 100)).
at org.apache.spark.sql.delta.schema.DeltaInvariantViolationException$.getCharVarcharLengthInvariantViolationException(InvariantViolationException.scala:64)
at org.apache.spark.sql.delta.schema.DeltaInvariantViolationException$.apply(InvariantViolationException.scala:89)
at org.apache.spark.sql.delta.schema.DeltaInvariantViolationException$.apply(InvariantViolationException.scala:112)
at org.apache.spark.sql.delta.schema.DeltaInvariantViolationException.apply(InvariantViolationException.scala)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.CheckDeltaInvariant_12$(Unknown Source)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.writeFields_0_1$(Unknown Source)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.apply(Unknown Source)
at org.apache.spark.sql.delta.constraints.DeltaInvariantCheckerExec.$anonfun$doExecute$3(DeltaInvariantCheckerExec.scala:89)
at scala.collection.Iterator$$anon$10.next(Iterator.scala:461)
at org.apache.spark.sql.execution.datasources.FileFormatDataWriter.writeWithIterator(FileFormatDataWriter.scala:117)
at org.apache.spark.sql.delta.files.DeltaFileFormatWriter$.$anonfun$executeTask$1(DeltaFileFormatWriter.scala:440)
at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1397)
at org.apache.spark.sql.delta.files.DeltaFileFormatWriter$.executeTask(DeltaFileFormatWriter.scala:447)
at org.apache.spark.sql.delta.files.DeltaFileFormatWriter$.$anonfun$executeWrite$2(DeltaFileFormatWriter.scala:284)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:93)
at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166)
at org.apache.spark.scheduler.Task.run(Task.scala:141)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620)
at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)
at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:829)

I would expect a data-schema issue to arise for both the insert (by the eventstream) and the delete operations, not only for the delete statement.

Any suggestions, please? Thanks

 

4 REPLIES
Anonymous
Not applicable

Hi @pmscorca 

 

I found a thread about a similar error message; its solutions might be helpful:

DELTA_EXCEED_CHAR_VARCHAR_LIMIT - Databricks Community
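For example, one mitigation sometimes suggested for this error is to widen the column so that the stored oversize values no longer violate the length check. A minimal sketch, assuming your Delta runtime permits this metadata-only type change (events is a placeholder table name; mycolumn is the column named in your error):

# Widen the VARCHAR(100) column to STRING so the length invariant no
# longer applies on rewrite. Whether this ALTER is permitted depends on
# the Delta/runtime version, so treat it as something to verify first.
spark.sql("ALTER TABLE events ALTER COLUMN mycolumn TYPE STRING")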

 

Best Regards,
Jing
If this post helps, please Accept it as Solution to help other members find it. Appreciate your Kudos!

Hi, thanks for your reply.

The DELTA_EXCEED_CHAR_VARCHAR_LIMIT error refers to a specific field defined as VARCHAR(100).

Some values of the incoming events for this field can be longer than 100 characters.

When the eventstream writes these values to the lakehouse table, no errors occur; but when the delete notebook runs at regular intervals, this error occurs.

[Screenshot pmscorca_0-1730363739076.png]

The behaviour is inconsistent: I would expect the DELTA_EXCEED_CHAR_VARCHAR_LIMIT error either always, for both insert and delete operations, or never. Raising it for only one of the possible data manipulations on the lakehouse is not coherent.
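For reference, a quick check like this confirms that the table already holds values past the limit (events is a placeholder table name; mycolumn is the column from the error message; spark is the notebook's built-in session):

# Count rows whose value already exceeds the declared VARCHAR(100) limit.
spark.sql("""
    SELECT count(*) AS oversize_rows
    FROM events
    WHERE length(mycolumn) > 100
""").show()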

Anonymous
Not applicable

Hi @pmscorca 

 

If you pause the Eventstream, does this error still arise when deleting the data with the notebook? In addition, does the error arise for a lakehouse table that is not the destination of an Eventstream?

 

Best Regards,
Jing

Hi, thanks for your reply.

I don't think this issue concerns the eventstream; it concerns the lakehouse. Also, in a production scenario I cannot pause the eventstream.

So, I don't understand why it is possible to write a value longer than 100 characters into a field defined as VARCHAR(100), and possible to update event rows in the lakehouse, but not possible to delete event rows.

The DELTA_EXCEED_CHAR_VARCHAR_LIMIT error does not seem related to conflicting concurrent operations either.
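If the cause is that a Delta DELETE rewrites the surviving rows of every touched file back through the write path, where the char/varchar invariant is enforced, then trimming the oversize values first might let the delete succeed. A sketch, with the same placeholder names as above:

# Trim stored values back under the declared limit before deleting.
# This assumes the rewrite triggered by UPDATE passes the invariant
# check, since every rewritten row ends up with length(mycolumn) <= 100.
spark.sql("""
    UPDATE events
    SET mycolumn = substring(mycolumn, 1, 100)
    WHERE length(mycolumn) > 100
""")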

Thanks
