bleak_house 3 tells you your leaks

BleakHouse 3:

  58%: core rails (783 births, 1241 deaths, ratio -0.23, impact 1.66)
  66%: recipe/new/GET (15182 births, 14593 deaths, ratio 0.02, impact 1.77)
  75%: core rails (766 births, 1168 deaths, ratio -0.21, impact 1.60)
  83%: recipe/list/GET (16423 births, 15991 deaths, ratio 0.01, impact 1.64)

65992 births, 66458 deaths.

Tags sorted by immortal leaks:
  recipe/show/GET leaks, averaged over 4 requests:
    5599 String
    80 Array
    2 Regexp
    2 MatchData
    2 Hash
    1 Symbol
  core rails leaks, averaged over 4 requests:
    238 String
    10 Array

Tags sorted by impact * ratio:
   0.0739: recipe/show/GET
   0.0350: recipe/new/GET
   0.0218: recipe/list/GET
  -0.6686: core rails
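
Working backwards from the numbers above, the ratio column matches (births − deaths) / (births + deaths); the impact formula isn't shown, so this toy reconstruction just treats impact as an input. A hedged sketch of the ranking, using the figures from the output (tag names and counts taken from above, everything else assumed):

```ruby
# Reconstructed scoring sketch. "ratio" is inferred from the output above
# as (births - deaths) / (births + deaths); "impact" is computed inside
# BleakHouse, so it is passed through as a plain number here.
Tag = Struct.new(:name, :births, :deaths, :impact) do
  def ratio
    (births - deaths).to_f / (births + deaths)
  end

  def score
    impact * ratio
  end
end

tags = [
  Tag.new("recipe/new/GET",  15_182, 14_593, 1.77),
  Tag.new("recipe/list/GET", 16_423, 15_991, 1.64),
]

# Rank tags the way the report does: highest impact * ratio first.
tags.sort_by { |t| -t.score }.each do |t|
  printf("%10.4f: %s\n", t.score, t.name)
end
```

A ratio near zero means births and deaths are balanced; a persistently positive ratio on a high-volume tag is what floats to the top of the list.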

That’s a Symbol up there; the new BleakHouse walks Ruby’s internal symbol table (sym_tbl) as well as the regular heap. We now track the history of every individual object instead of just class counts, which means we can accurately (fingers crossed) identify where lingering objects were spawned.
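
To illustrate the idea of per-object births and deaths (this is not how BleakHouse itself works — the real tool walks the heap in C; this just diffs ObjectSpace snapshots from plain Ruby):

```ruby
require 'set'

# Minimal sketch of birth/death tracking: snapshot the ids of all live
# objects, do some work, snapshot again, and diff the two sets.
def heap_snapshot
  ids = Set.new
  ObjectSpace.each_object { |obj| ids << obj.object_id }
  ids
end

before = heap_snapshot
lingering = Array.new(3) { |i| "leaked string #{i}" } # simulate leaked objects
after = heap_snapshot

births = after - before # ids that appeared between snapshots
deaths = before - after # ids that were garbage collected
puts "#{births.size} births, #{deaths.size} deaths"
```

An id in `births` that never shows up in a later `deaths` set is an immortal object — the kind of thing the report above is counting per tag.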

On the flip side, analyzing the log file is slow (a decent-sized logfile will have hundreds of millions of rows). I wrote a pure-C CSV parser, which helps, and there’s always the “better hardware” answer. I’ve mainly been running it on my Mac Mini; on the Opteron 2210 it goes much faster, since the analyzer is CPU-bound.
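
The aggregation itself is simple — the cost is purely in parsing volume. A toy Ruby version of the counting pass, assuming a made-up “tag,event” row format (the real BleakHouse log format differs):

```ruby
require 'csv'

# Hypothetical "tag,event" rows; the real log format is different.
log = <<~ROWS
  recipe/new/GET,birth
  recipe/new/GET,death
  core rails,birth
ROWS

# Tally births and deaths per tag while streaming through the rows.
counts = Hash.new { |h, tag| h[tag] = Hash.new(0) }
CSV.parse(log) do |tag, event|
  counts[tag][event] += 1
end
```

On a real log you’d stream with CSV.foreach(path) rather than load the file into memory — but the per-row overhead of doing this in Ruby at hundreds of millions of rows is exactly why a C parser wins.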

It doesn’t make pretty graphs anymore, but I’m not sure exactly how they would help. It would be easy enough to add them back.

go go go

A gem, not a plugin, because it needs to compile a C extension. First, uninstall the old versions to prevent version problems:

sudo gem uninstall bleak_house -a -i -x

Then install the new version:

sudo gem install bleak_house

Also, you need to rebuild your ruby-bleak-house binary, even if you already have one. Just run:

bleak --build

The RDoc has updated usage instructions.