[rabbitmq-discuss] erlang-client issue, discarding content?
digitalwarfare at gmail.com
Sat Aug 1 03:41:57 BST 2009
So I stumbled across this:
I basically keep the channel object and the connection open the entire time so I
do not repeat the startup steps over and over -- is this a bad idea? Once the
channel has been retrieved, my consumer doesn't deal with it again; it just
keeps accepting requests from it. I don't see any of my global objects
growing in size, so I can only think I am doing something wrong somewhere:
consumer = EventQueueConsumer()
channel = consumer.get_channel()
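As a broker-free sketch of the pattern above: one long-lived channel, created once and handed out by the consumer. `EventQueueConsumer` is reconstructed from the two-line snippet; the `Channel` stub is an assumption standing in for a real AMQP channel so the example runs without a broker. Reusing one connection and channel for the life of the process is normal AMQP practice; the usual client-side leak is deliveries piling up unprocessed or unacknowledged, not the reuse itself.

```python
# Hypothetical reconstruction -- Channel is a stand-in, not a real AMQP class.
import queue

class Channel:
    """Stand-in for an AMQP channel: deliveries arrive on an internal queue."""
    def __init__(self):
        self._deliveries = queue.Queue()

    def deliver(self, body):
        self._deliveries.put(body)

    def get(self, timeout=1.0):
        return self._deliveries.get(timeout=timeout)

class EventQueueConsumer:
    def __init__(self):
        # Created once; reused for the life of the process.
        self._channel = Channel()

    def get_channel(self):
        # Always returns the same object -- no per-request setup cost.
        return self._channel

consumer = EventQueueConsumer()
channel = consumer.get_channel()
channel.deliver(b"event-1")
assert channel.get() == b"event-1"
assert consumer.get_channel() is channel
```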
On Fri, Jul 31, 2009 at 12:33 PM, Suhail Doshi <digitalwarfare at gmail.com> wrote:
> The memory leak is in the consumers, thanks Matthias.
> On Thu, Jul 30, 2009 at 4:37 PM, Matthias Radestock <matthias at lshift.net> wrote:
>> Suhail Doshi wrote:
>>> Yeah, it definitely has to be; the moment I kill it, the memory drops back
>>> down, and it is gradually growing. You can even see the growth in the
>>> images.
>> Do you definitely see the rabbit process consuming the memory? How big
>> does it get? And what about CPU usage?
>> The reason I am asking is that killing the server also affects the
>> clients, i.e. it is possible that a *client* is consuming all the memory,
>> and releases it as soon as the server connection is severed.
>> So please check the *per-process* memory and CPU stats.
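For instance, the per-process check could look something like this (a hedged illustration; `<pid>` is a placeholder, and on most Unix systems the RabbitMQ server shows up as an Erlang VM named `beam` or `beam.smp`):

```shell
# Locate the Erlang VM running the rabbit server (and, separately, your client)
pgrep -fl beam

# Watch resident memory (RSS, in KB) and CPU for one specific process
ps -o pid,rss,vsz,%cpu -p <pid>
```

Sampling these a few times over several minutes shows which process is actually growing, rather than attributing system-wide memory use to the server.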
>> Now, if it really is the server that is eating the memory, please run all
>> the various list_* commands in rabbitmqctl to see whether any of them are
>> showing growth.
>> Finally, try publishing messages without marking them as persistent, and
>> see whether that changes the behaviour.
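A sketch of the `rabbitmqctl` checks Matthias suggests (the exact column names available to `list_queues` vary by RabbitMQ version, so treat these as illustrative):

```shell
# Repeat these periodically and look for numbers that only ever go up
rabbitmqctl list_queues name messages memory
rabbitmqctl list_connections
rabbitmqctl list_channels
```

If the queue message counts or memory figures grow without bound, the backlog is on the server; if they stay flat while a client process balloons, the leak is client-side.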
> Blog: http://blog.mixpanel.com