I have a MySQL CDC stream from a flexible server in Azure.
The write messages aren't coming through to Eventstream and we are getting the following message:
Reason: org.apache.kafka.connect.errors.ConnectException: Unrecoverable exception from producer send callback Caused by: org.apache.kafka.common.errors.RecordTooLargeException: The message is 1407136 bytes when serialized which is larger than 1048576, which is the value of the max.request.size configuration.
Is there any way to change the kafka max.request.size config?
Any help appreciated.
Hi @bw_chec ,
max.request.size
This is the maximum size of a request the producer client can send; the default is 1048576 bytes (1 MB). It is not recommended to raise this parameter blindly, because it interacts with other settings, most importantly the broker-side message.max.bytes parameter. For example, if the broker's message.max.bytes is set to 10 and the producer's max.request.size is set to 20, the producer will accept a 15-byte message but the broker will reject it, and the send will fail with an error.
Therefore, both parameters need to be changed together for the new limit to take effect.
Modify the following (for a 2 MB limit):
message.max.bytes = 2 MB (modified on the broker side)
max.request.size = 2 MB (modified on the client side)
After restarting, the data is extracted to Kafka normally.
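As a rough sketch of the two settings above (assuming a Kafka deployment where you can edit the broker and producer configuration files yourself; a managed source like Eventstream may not expose these settings), a 2 MB limit would look like:

```properties
# Broker side (server.properties): largest record batch the broker will accept
message.max.bytes=2097152

# If the topic is replicated, the replica fetch size should be at least as large,
# otherwise followers cannot fetch the oversized records
replica.fetch.max.bytes=2097152

# Producer side (client configuration): largest request the producer may send
max.request.size=2097152
```

Note that the per-topic equivalent of the broker setting is max.message.bytes (the words are reversed), which can be applied to a single topic instead of the whole broker.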
For more details, please refer:
Send Large Messages With Kafka | Baeldung
How to send Large Messages in Apache Kafka? (conduktor.io)
Best Regards,
Neeko Tang
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.