[rabbitmq-discuss] Message is delivered to all queue consumers when rabbitmq crashes

rasadoll rasadoll at gmail.com
Wed Oct 19 21:02:11 BST 2011


I have multiple consumers running at the same time, all listening to a
single queue. Messages are sent to the default exchange with the queue
name as the routing key. When one of the consumers is busy processing a
message, if the RabbitMQ service crashes before the message is acked
(I simulate this by simply stopping the service), the message is
immediately delivered to the other consumers, which results in the same
message being processed multiple times concurrently. Is this the right
behavior for RabbitMQ?

Here is my queue design:

Map<String, Object> args = new HashMap<String, Object>();
args.put("x-ha-policy", "all");
channel.queueDeclare(JOB_EXEC_QUEUE, true, false, false, args);

I use a durable HA queue, but I am not running in a clustered
environment.

Message publishing:

channel.basicPublish("", JOB_EXEC_QUEUE,
MessageProperties.PERSISTENT_BASIC, message.getBytes());
Consumer Code:

QueueingConsumer consumer = new QueueingConsumer(channel);
channel.basicConsume(JOB_EXEC_QUEUE, consumer);

QueueingConsumer.Delivery delivery;
while (true) {
        try {
          delivery = consumer.nextDelivery();
        } catch (InterruptedException ie) {
          continue;
        }

        // process the message

        // ack after processing is done
        channel.basicAck(delivery.getEnvelope().getDeliveryTag(), false);
}

I stop the service when one consumer is in the process message phase.

Am I doing something wrong? Is this an expected behavior for RabbitMQ?
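For what it's worth, since an unacked message is requeued and redelivered once the consumer's connection to the broker is lost, the usual defence is to make processing idempotent. Below is a minimal sketch of such a guard; it assumes each job message carries a unique job ID (the class and method names here are hypothetical, not part of my actual code):

```java
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of an idempotency guard: each job ID is claimed at most once,
// even if the broker redelivers the message after a crash or restart.
public class IdempotentJobGuard {
    private final Set<String> processed = ConcurrentHashMap.newKeySet();

    // Returns true if this call claimed the job (first delivery),
    // false if it was already claimed (likely a redelivery).
    public boolean claim(String jobId) {
        return processed.add(jobId);
    }
}
```

In the consumer loop, claim(jobId) would be called before processing and the message acked either way; a real system would persist the claimed IDs (e.g. in a database) so the guard survives consumer restarts.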

