[rabbitmq-discuss] RabbitMQ QueueingConsumer possible memory leak

Simon MacMullen simon at rabbitmq.com
Tue Oct 2 11:16:42 BST 2012


Bear in mind that the QueueingConsumer is there to buffer messages in 
the client before you call nextDelivery().

You haven't told the broker to limit the number of unacked messages it 
sends you, so it hasn't. I suspect, therefore, that the broker is 
sending you messages faster than you are calling nextDelivery(), and 
they are accumulating in your LinkedBlockingQueue.

A solution is to invoke Channel.basicQos(number) and use explicit acks.
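
A minimal sketch of that approach, assuming the same queue setup as in 
your message below (the queue name "my-queue", the prefetch count of 
100, and the process() helper are illustrative, not taken from your 
code):

import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;
import com.rabbitmq.client.QueueingConsumer;

public class BoundedConsumer {
    public static void main(String[] args) throws Exception {
        Connection connection = new ConnectionFactory().newConnection();
        Channel channel = connection.createChannel();
        channel.queueDeclare("my-queue", false, false, false, null);

        // The broker will now keep at most 100 messages unacked on this
        // channel, so the client-side LinkedBlockingQueue inside
        // QueueingConsumer stays bounded as well.
        channel.basicQos(100);

        QueueingConsumer consumer = new QueueingConsumer(channel);
        // autoAck = false: we acknowledge explicitly after processing.
        channel.basicConsume("my-queue", false, consumer);

        while (true) {
            QueueingConsumer.Delivery delivery = consumer.nextDelivery();
            process(delivery.getBody());
            // Ack only after processing; the broker then releases the
            // message and is free to send another.
            channel.basicAck(delivery.getEnvelope().getDeliveryTag(), false);
        }
    }

    private static void process(byte[] body) {
        // stand-in for your deserialise()/process() logic
    }
}

With autoAck = true, as in your code, the broker treats every message as 
acked the moment it is sent, so it never throttles itself, and the 
QueueingConsumer buffers whatever you haven't yet consumed.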

Cheers, Simon

On 02/10/12 10:03, Arthur Embleton wrote:
> I have the following code to declare a queue:
>
> Connection connection = RabbitConnection.getConnection();
> Channel channel = connection.createChannel();
> channel.queueDeclare(getQueueName(), false, false, false, null);
> consumer = new QueueingConsumer(channel);
> channel.basicConsume(getQueueName(), true, consumer);
>
> and the following to get the next Delivery object and process it:
>
> Delivery delivery = null;
> T queue = null;
>
> //loop over, continuously retrieving messages
> while (true) {
>
>     try {
>         delivery = consumer.nextDelivery();
>         queue = deserialise(delivery.getBody());
>
>         process(queue);
>
>     } catch (ShutdownSignalException e) {
>         logger.warn("Shutdown signal received.");
>         break;
>     } catch (ConsumerCancelledException e) {
>         logger.warn("Consumer cancelled exception: {}", e.getMessage());
>         break;
>     } catch (InterruptedException e) {
>         logger.warn("Interruption exception", e);
>         break;
>     }
> }
>
> If I run this with a queue containing a large number of objects, then
> after approximately 2.7 million objects I get an out-of-memory
> exception. I found this originally by running it overnight with data
> going in from JMeter at a rate of ~90 messages/s, which at first it
> consumed without any trouble, but in the morning I noticed a large
> backlog of messages in RabbitMQ and an out-of-memory exception on the
> consumer. I ran it again and used the Eclipse Memory Analyzer to
> determine where this memory was being used. From this I can see that
> the java.util.concurrent.LinkedBlockingQueue referenced by
> com.rabbitmq.client.QueueingConsumer grows and grows until it runs out
> of memory.
>
> Do I need to do anything to tell Rabbit to release resources?
>
> I could increase the heap size but I'm concerned that this is just a
> short term fix and there might be something in my code that could bite
> me with a memory leak a few months into production deployment.
>
> I have also posted this to StackOverflow:
> http://stackoverflow.com/questions/12687368/rabbitmq-queueingconsumer-possible-memory-leak


-- 
Simon MacMullen
RabbitMQ, VMware

