I have a MySQL CDC stream from a flexible server in Azure.
The write messages aren't coming through to Eventstream and we are getting the following message:
Reason: org.apache.kafka.connect.errors.ConnectException: Unrecoverable exception from producer send callback Caused by: org.apache.kafka.common.errors.RecordTooLargeException: The message is 1407136 bytes when serialized which is larger than 1048576, which is the value of the max.request.size configuration.
Is there any way to change the Kafka max.request.size config?
Any help appreciated.
Hi @bw_chec,
max.request.size
This setting controls the maximum size of a request the producer client can send; the default is 1048576 bytes (1 MB). It is not recommended to raise it blindly, because it interacts with related settings such as message.max.bytes on the broker side: for example, if the broker's message.max.bytes is 10 bytes while the producer's max.request.size is 20 bytes, a 15-byte message passes the producer's check but is still rejected by the broker.
Therefore, these parameters need to be changed together for the new limit to take effect.
Modify the following:
message.max.bytes = 2 MB (on the broker side)
max.request.size = 2 MB (on the client side; see the sketch below)
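If you control the producer directly (Eventstream's managed CDC connector may not expose this setting), the client-side change is just a producer property. The sketch below is a minimal, hypothetical example; the broker address and the topic name cdc-topic are assumptions, not the Eventstream connector's actual configuration.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.ByteArraySerializer;

public class LargeMessageProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092"); // hypothetical address
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class.getName());
        // Raise the producer's per-request limit from the 1 MB default to 2 MB.
        props.put(ProducerConfig.MAX_REQUEST_SIZE_CONFIG, 2 * 1024 * 1024);

        byte[] largePayload = new byte[1_407_136]; // roughly the size reported in the error

        try (KafkaProducer<byte[], byte[]> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("cdc-topic", largePayload)); // hypothetical topic
            producer.flush();
        }
    }
}
```

Note that raising the producer limit alone is not enough; the broker/topic side still has to accept the larger message, otherwise the request is rejected there instead.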
After restarting, data is extracted to Kafka normally.
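On the broker side, message.max.bytes is a broker-wide setting, and it can also be overridden per topic via max.message.bytes. Where you have direct admin access to the Kafka cluster (again, something the managed Eventstream endpoint may not expose), a hypothetical AdminClient sketch for the topic-level override could look like the following; the broker address and topic name are assumptions.

```java
import java.util.Collections;
import java.util.Map;
import java.util.Properties;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AlterConfigOp;
import org.apache.kafka.clients.admin.ConfigEntry;
import org.apache.kafka.common.config.ConfigResource;

public class RaiseTopicMessageLimit {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker:9092"); // hypothetical address

        try (AdminClient admin = AdminClient.create(props)) {
            ConfigResource topic = new ConfigResource(ConfigResource.Type.TOPIC, "cdc-topic"); // hypothetical topic
            // Allow messages up to 2 MB on this topic (overrides the broker-wide message.max.bytes default).
            AlterConfigOp op = new AlterConfigOp(
                    new ConfigEntry("max.message.bytes", Integer.toString(2 * 1024 * 1024)),
                    AlterConfigOp.OpType.SET);
            admin.incrementalAlterConfigs(Map.of(topic, Collections.singletonList(op))).all().get();
        }
    }
}
```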
For more details, please refer to:
Send Large Messages With Kafka | Baeldung
How to send Large Messages in Apache Kafka? (conduktor.io)
Best Regards,
Neeko Tang
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.