The problem is that it only happens every 5+ hours and I'm not sure how to reproduce it, but here is a crash dump:

=erl_crash_dump:0.1
Sat Oct 3 20:18:09 2009
Slogan: eheap_alloc: Cannot allocate 6960012640 bytes of memory (of type "heap").
System version: Erlang R13B01 (erts-5.7.2) [source] [64-bit] [smp:4:4] [rq:4] [async-threads:0] [hipe] [kernel-poll:false]
Compiled: Tue Jun 23 19:56:26 2009
Atoms: 7552
=memory
total: 11040213416
processes: 10989180488
processes_used: 10989135536
system: 51032928
atom: 507961
atom_used: 494181
binary: 43474800
code: 4668039
ets: 311600
=hash_table:atom_tab
size: 4813
used: 3776
objs: 7552
depth: 8
=index_table:atom_tab
size: 8192
limit: 1048576
entries: 7552
=hash_table:module_code
size: 97
used: 71
objs: 119
depth: 4
=index_table:module_code
size: 1024
limit: 65536
entries: 119
=hash_table:export_list
size: 2411
used: 1770
objs: 3269
depth: 8
=index_table:export_list
size: 4096
limit: 65536
entries: 3269
=hash_table:secondary_export_table
size: 97
used: 0
objs: 0
depth: 0
=hash_table:process_reg
size: 47
used: 31
objs: 43
depth: 4
=hash_table:fun_table
size: 397
used: 301
objs: 566
depth: 6
=hash_table:node_table
size: 11
used: 1
objs: 1
depth: 1
=hash_table:dist_table
size: 11
used: 1
objs: 1
depth: 1
=allocated_areas
sys_misc: 80890
static: 991232
atom_space: 98328 84868
atom_table: 104153
module_table: 9084
export_table: 52172
register_table: 468
fun_table: 3266
module_refs: 2048
loaded_code: 4228461
dist_table: 507
node_table: 227
bits_bufs_size: 0
bif_timer: 80192
link_lh: 0
proc: 75296 39952
atom_entry: 305480 305160
export_entry: 316248 315192
module_entry: 7784 7720
reg_proc: 2480 1800
monitor_sh: 4400 512
nlink_sh: 11368 6328
fun_entry: 51328 50096
db_tab: 6624 5640
driver_event_data_state: 56 56
driver_select_data_state: 1352 200
=allocator:sys_alloc
option e: true
option m: libc
option tt: 131072
option tp: 0
=allocator:temp_alloc[0]
versions: 2.1 2.2
option e: true
option t: false
option ramv: false
option sbct: 524288
option asbcst: 4145152
option rsbcst: 90
option rsbcmt: 80
option rmbcmt: 100
option mmbcs: 131072
option mmsbc: 256
option mmmbc: 10
option lmbcs: 10485760
option smbcs: 1048576
option mbcgs: 10
option mbsd: 3
option as: gf
mbcs blocks: 0 136 136
mbcs blocks size: 0 46568 46568
mbcs carriers: 1 1 1
mbcs mseg carriers: 0
mbcs sys_alloc carriers: 1
mbcs carriers size: 131112 131112 131112
mbcs mseg carriers size: 0
mbcs sys_alloc carriers size: 131112
sbcs blocks: 0 0 0
sbcs blocks size: 0 0 0
sbcs carriers: 0 0 0
sbcs mseg carriers: 0
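For what it's worth, =memory says processes is holding about 10.2 GB of the 10.3 GB total (binaries and ETS are tiny), and the emulator died trying to grab another ~6.5 GB of heap, so it looks like one process (or a handful) growing rather than the system as a whole. Next time memory starts climbing I'm planning to run something like this from a shell attached to the node to see which processes are holding the heap. It's just a rough sketch: nothing in it is RabbitMQ-specific, and it assumes I can still attach before the node falls over.

%% Snapshot per-process memory, then list the five largest processes
%% along with their registered name, message queue length and current
%% function.
Snapshot = [{Bytes, P} || P <- erlang:processes(),
                          {memory, Bytes} <- [erlang:process_info(P, memory)]].
Top5 = lists:sublist(lists:reverse(lists:sort(Snapshot)), 5).
[{P, Bytes,
  erlang:process_info(P, registered_name),
  erlang:process_info(P, message_queue_len),
  erlang:process_info(P, current_function)} || {Bytes, P} <- Top5].

If the top entry also shows a huge message_queue_len, that would point at a mailbox backing up rather than a straight leak.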

On Sun, Oct 4, 2009 at 2:17 PM, Ben Hood <0x6e6562@gmail.com> wrote:
> Suhail,
>
> On Sun, Oct 4, 2009 at 8:43 PM, Suhail Doshi <suhail@mixpanel.com> wrote:
>> Any ideas why my producer would suddenly jump and consume all the RAM
>> available? I am getting a serious volume of items hitting the
>> queue.
>
> In general, a producer is a client process running outside of
> RabbitMQ, so it is difficult to see how RabbitMQ is affecting the
> memory consumption of this process. Maybe you can post a cut down
> version of your producer application that reproduces the symptoms.
>
> Ben

--
http://mixpanel.com
Blog: http://blog.mixpanel.com