[rabbitmq-discuss] Large Message Support / Stability

Alexander Schatten alexanderschatten at gmx.at
Tue Jul 30 21:25:38 BST 2013


Thanks for that comment. But in your case that is just one 25 MiB message every 5 minutes. I am rather worried about what happens if a larger number of such messages hits the broker in a shorter period of time; could that lead to out-of-memory issues on the broker side?
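[On the broker-side OOM concern: RabbitMQ has a configurable memory high watermark. When total memory use crosses it, the broker raises a memory alarm and blocks publishing connections until memory is reclaimed, so a burst of large messages throttles producers rather than killing the broker. A minimal sketch in the classic rabbitmq.config format; 0.4 (40% of installed RAM) is the default, shown only for illustration:

```erlang
%% rabbitmq.config -- illustrative sketch; 0.4 is the default value
[
  {rabbit, [
    {vm_memory_high_watermark, 0.4}
  ]}
].
```

It can also be adjusted at runtime with `rabbitmqctl set_vm_memory_high_watermark 0.4`. Note that the alarm throttles publishers; whether that behaviour is acceptable depends on whether your producers can tolerate being blocked.]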

Also: do you have persistent queues? 


On 30.07.2013, at 22:20, François Beausoleil <francois at teksol.info> wrote:

> Hi,
> 
> I have never had issues exchanging 25 MiB messages. That 25 MiB message is a heartbeat that contains the state of the world. All my consumers receive and process that message every 5 minutes. The only issue I've had is consumers running out of memory and not recovering gracefully, but that's outside the scope of RabbitMQ.
> 
> Hope that helps!
> François
> 
> Le 2013-07-30 à 16:06, Alexander Schatten a écrit :
> 
>> I am considering using RabbitMQ as the messaging platform for a data warehouse (today we would probably call it "Big Data") type of application.
>> 
>> The principal point is that we receive data from a large variety of sources: partly batch updates, partly events from sensors and the like. The idea is to expose service interfaces for the different data sources to the "outside" that accept the data packages and events, enrich them with metadata etc., and then dump them onto RabbitMQ queues and topics.
>> 
>> Consumers are essentially processing units that do statistical analysis and trigger errors/warnings, plus storage modules that write raw and aggregated data into databases (PostgreSQL and most likely MongoDB).
>> 
>> Now, certain messages, e.g. sensor events, will be rather small. Others, like batch updates from ERP and CRM systems or messages containing documents, might be larger; I suppose several MB up to 100 MB. I have not used messaging in such a context, but what I have heard is that other message brokers tend to have problems with large messages or start to behave erratically.
>> 
>> So my concrete question: does anyone have experience with such a use case?
>> 
>> Alternatively, it would be possible to write the (large) payload immediately (at the service interface) into e.g. MongoDB and put only a reference/ID in the message. However, this would break the decoupling to a certain extent, as all consumers would need access to MongoDB or to a REST interface that serves the payload. Also, message filtering and content-based routing are limited in that case.
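[The alternative described above is the classic "claim check" pattern: persist the large payload out of band and publish only a small reference message. A hedged sketch, with a plain dict standing in for the real store (MongoDB/GridFS) and all names purely illustrative:

```python
import json
import uuid

payload_store = {}  # stand-in for MongoDB / GridFS

def check_in(payload: bytes, content_type: str) -> str:
    """Persist the large payload and return a small JSON reference
    message suitable for publishing on a RabbitMQ queue."""
    ref_id = str(uuid.uuid4())
    payload_store[ref_id] = payload
    return json.dumps({
        "ref": ref_id,
        "content_type": content_type,
        "size": len(payload),
    })

def check_out(message: str) -> bytes:
    """Consumer side: resolve the reference back to the payload."""
    ref = json.loads(message)
    return payload_store[ref["ref"]]

# A 25 MiB batch update travels through the broker as a tiny message.
msg = check_in(b"x" * (25 * 1024 * 1024), "application/octet-stream")
assert len(msg) < 200
assert len(check_out(msg)) == 25 * 1024 * 1024
```

Note that metadata kept in the reference message (content type, size, source system, ...) is exactly what remains available for content-based routing, which is why routing is limited but not lost with this approach.]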
>> 
>> Would appreciate any comments on this issue.
>> 
>> Kind regards,
>> 
>> 
>> Alexander Schatten
>> 
>> 
>> --
>> ==========================================
>> Dr. Alexander Schatten
>> ==========================================
>> http://www.schatten.info
>> http://sichten.blogspot.com
>> Follow me at 
>> https://alpha.app.net/alex_buzz
>> http://twitter.com/alex_buzz
>> ==========================================
>> "The limits of my language
>> mean the limits of my world.", Wittgenstein
>> 
>> _______________________________________________
>> rabbitmq-discuss mailing list
>> rabbitmq-discuss at lists.rabbitmq.com
>> https://lists.rabbitmq.com/cgi-bin/mailman/listinfo/rabbitmq-discuss
> 

