Discussion

Areteans
AU
Last activity: 5 Jul 2024 22:56 EDT
Solution: Kafka queue processor size issue (max.request.size)
Scenario: When we tried to queue a page containing an attachment stream to the queue processor, the size of the page grew to more than 5 MB, so the Kafka producer was not able to queue the page to the Kafka server/broker. We were getting the error below:
Caused by: java.util.concurrent.ExecutionException: org.apache.kafka.common.errors.RecordTooLargeException: The message is 10812412 bytes when serialized which is larger than the maximum request size you have configured with the max.request.size configuration.
Note:
The configuration values are specified in bytes, so convert MB to bytes before entering the size in the setup below.
[1 MB = 1,000,000 bytes]
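For example, a 5 MB limit works out to 5 × 1,000,000 = 5,000,000 bytes, which is the value used in the settings below.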
Approach 1: Update the prconfig file
This approach is better if you have a single node. When the application is deployed to a higher environment, you need to make sure the prconfig in that environment is updated as well.
<env name="dnode/kafka/producer/max.message.bytes" value="5000000"/>
<env name="dnode/kafka/producer/max.request.size" value="5000000"/>
<env name="dsm/services/stream/server_properties/max.request.size" value="5000000"/>
<env name="dsm/services/stream/server_properties/message.max.bytes" value="5000200"/>
<env name="dsm/services/stream/server_properties/replica.fetch.max.bytes" value="5000200"/>
<env name="dsm/services/stream/server_properties/replica.fetch.response.max.bytes" value="5000200"/>
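For reference, here is a minimal sketch of how these entries might sit inside prconfig.xml, assuming the standard layout with a pegarules root element; the file's other existing entries are omitted:

    <?xml version="1.0" ?>
    <pegarules>
        <!-- Kafka producer limits, in bytes -->
        <env name="dnode/kafka/producer/max.message.bytes" value="5000000"/>
        <env name="dnode/kafka/producer/max.request.size" value="5000000"/>
        <!-- Stream service (Kafka server/broker) limits, in bytes -->
        <env name="dsm/services/stream/server_properties/max.request.size" value="5000000"/>
        <env name="dsm/services/stream/server_properties/message.max.bytes" value="5000200"/>
        <env name="dsm/services/stream/server_properties/replica.fetch.max.bytes" value="5000200"/>
        <env name="dsm/services/stream/server_properties/replica.fetch.response.max.bytes" value="5000200"/>
    </pegarules>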
Approach 2: Update DSS
This approach is best suited if you have a multiple-node environment, because then you don't need to maintain the configuration file on each node; the DSS will take care of it across all nodes. The required settings are listed below, followed by an example record.
1. Kafka producer DSS (Owning Ruleset: Pega-Engine)
• prconfig/dnode/kafka/producer/max.request.size/default = 5000000
• prconfig/dnode/kafka/producer/max.message.bytes/default = 5000000
2. Kafka server DSS (Owning Ruleset: Pega-Engine)
• prconfig/dsm/services/stream/server_properties/message.max.bytes/default = 5000000
• prconfig/dsm/services/stream/server_properties/max.request.size/default = 5000000
• prconfig/dsm/services/stream/server_properties/replica.fetch.response.max.bytes/default = 5000200
• prconfig/dsm/services/stream/server_properties/replica.fetch.max.bytes/default = 5000200
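As an illustration, one of these records would look roughly like this on the Dynamic System Settings form (the exact field labels may vary slightly by Pega version):

    Owning Ruleset:  Pega-Engine
    Setting Purpose: prconfig/dnode/kafka/producer/max.request.size/default
    Value:           5000000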
Imp note:
• message.max.bytes should be the same in the Kafka producer and the Kafka server settings.
• replica.fetch.max.bytes must be greater than message.max.bytes of the Kafka server.
• replica.fetch.response.max.bytes must be greater than message.max.bytes of the Kafka server.
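For example, with the DSS values above, the producer max.message.bytes and the server message.max.bytes are both 5,000,000, while replica.fetch.max.bytes and replica.fetch.response.max.bytes are 5,000,200, which is greater than 5,000,000, so all three conditions hold.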