[rabbitmq-discuss] RabbitMQ QueueingConsumer possible memory leak

Arthur Embleton aembleton at gmail.com
Tue Oct 2 10:03:12 BST 2012


I have the following code to declare a queue:

Connection connection = RabbitConnection.getConnection();
Channel channel = connection.createChannel();
channel.queueDeclare(getQueueName(), false, false, false, null);
consumer = new QueueingConsumer(channel);
channel.basicConsume(getQueueName(), true, consumer);

and the following to get the next Delivery object and process it:

        Delivery delivery = null;
        T queue = null;
        
        //loop over, continuously retrieving messages
        while(true) {
            
            try {
                delivery = consumer.nextDelivery();
                queue = deserialise(delivery.getBody());
                
                process(queue);
                
            } catch (ShutdownSignalException e) {
                logger.warn("Shutdown signal received.");
                break;
            } catch (ConsumerCancelledException e) {
                logger.warn("Consumer cancelled exception: {}", e.getMessage());
                break;
            } catch (InterruptedException e) {
                logger.warn("Interruption exception: {}", e);
                break;
            }
        }

If I run this with a queue containing a large number of objects, after 
approximately 2.7 million objects I get an out-of-memory exception.  I 
originally found this by running it overnight with data going in from 
JMeter at a rate of ~90 messages/s, which it consumed without any trouble 
at first, but in the morning I noticed a large backlog in RabbitMQ and an 
out-of-memory exception on the consumer.  I ran it up again and used the 
Eclipse Memory Analyzer to determine where the memory was being used.  
From this I can see that the java.util.concurrent.LinkedBlockingQueue 
referenced by com.rabbitmq.client.QueueingConsumer grows and grows until 
it runs out of memory.
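
For reference, here is a rough sketch of the mitigation I'm considering: 
turning off auto-ack and setting a basicQos prefetch limit, so the broker 
only pushes a bounded number of unacknowledged messages and the consumer's 
internal queue shouldn't grow without bound.  The prefetch value of 100 is 
just an assumption, and getQueueName(), deserialise() and process() are the 
same helpers as in the code above.

// Sketch only, not tested. Requires java.io.IOException in addition to the
// com.rabbitmq.client imports already used above.
Connection connection = RabbitConnection.getConnection();
Channel channel = connection.createChannel();
channel.queueDeclare(getQueueName(), false, false, false, null);

channel.basicQos(100);  // at most 100 unacked messages in flight (assumed value)
consumer = new QueueingConsumer(channel);
channel.basicConsume(getQueueName(), false, consumer);  // autoAck = false

while (true) {
    try {
        Delivery delivery = consumer.nextDelivery();
        T queue = deserialise(delivery.getBody());
        process(queue);
        // ack only after processing, so the broker releases the next message
        channel.basicAck(delivery.getEnvelope().getDeliveryTag(), false);
    } catch (ShutdownSignalException e) {
        break;
    } catch (ConsumerCancelledException e) {
        break;
    } catch (InterruptedException e) {
        break;
    } catch (IOException e) {
        break;  // basicAck can fail if the channel has closed
    }
}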

Do I need to do anything to tell Rabbit to release resources?

I could increase the heap size, but I'm concerned that this is just a 
short-term fix and that there might be something in my code that could bite 
me with a memory leak a few months into production deployment.

I have also posted this to Stack Overflow:  
http://stackoverflow.com/questions/12687368/rabbitmq-queueingconsumer-possible-memory-leak