[rabbitmq-discuss] problem with new persister with 1000 topics/queues

alex chen chen650 at yahoo.com
Thu Oct 28 06:03:53 BST 2010


> I was able to increase memory_high_watermark to 0.8 (6.4 GB) and got a 50 MB/sec publish rate all the way up to 200 GB of messages stored.  The broker used 5.9 GB and there was no throttling of publishers.  However, when I started 1000 consumers, the same old errors happened again, as described in my first email: many consumers got errors on login or basic_consume.
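For reference, the watermark mentioned above is set in the broker's rabbitmq.config (Erlang terms format); this is a minimal sketch, where 0.8 means 80% of installed RAM:

```erlang
%% rabbitmq.config -- raise the memory high watermark from the default
%% so the broker can use up to 80% of installed RAM before throttling
[
  {rabbit, [
    {vm_memory_high_watermark, 0.8}
  ]}
].
```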

I increased the login timeout from 10 seconds to 5 minutes, and the consumers could then consume the 200 GB of messages without any problems.  The 2.1.1 broker's memory stayed below 6.2 GB.  (Memory usage for release 2.0.0 was 10 GB in this test case.)
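The timeout change above is a client-side setting. As a hedged sketch (not the poster's actual code; the broker host name is hypothetical), most AMQP clients ultimately apply the login timeout as a socket timeout on the underlying TCP connection, which in Python looks like:

```python
import socket

# Generous login timeout (5 minutes instead of a 10-second default) so that
# connection setup does not time out while the broker is busy paging messages.
LOGIN_TIMEOUT = 300  # seconds

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.settimeout(LOGIN_TIMEOUT)  # applies to connect() and subsequent reads
# sock.connect(("broker-host", 5672))  # 5672 is the default AMQP port
```

In a real client library the equivalent knob is usually a connection parameter (e.g. a socket or connect timeout option) rather than a raw socket call.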


