memory usage in mod_perl
mkanat at bugzilla.org
Tue Jan 16 23:11:38 UTC 2007
On Wed, 17 Jan 2007 07:20:52 +1100 "Bradley Baetz" <bbaetz at acm.org> wrote:
> It shouldn't be higher. It should be less because the various perl
> modules are preloaded and shared.
No, it's higher. Without mod_perl, all memory usage is
transient. With mod_perl, once something is loaded into one httpd
child, it stays there forever.
Also, because of copy-on-write, modules aren't really shared in
practice: as soon as a child writes to a page, that page is copied and
becomes private to the child.
Also, I think some shared memory stuff that mod_perl needs may be
broken on the RHEL4 kernel.
I asked about this on the mod_perl list, and they said, "No,
your memory usage looks totally normal."
> > So b.m.o ran fine for a week or two after enabling mod_perl, but
> > it's had to be rebooted twice in the last 3 days because it plain
> > ran out of memory. It's got 4GB on it with 2GB swap.
The usual solution for this is to set MaxRequestsPerChild.
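For example, a minimal httpd.conf fragment (the ceiling of 1000 is an illustrative value, not a recommendation; tune it for your workload):

```apache
# Recycle each httpd child after it has served 1000 requests, so any
# memory the embedded perl interpreter has grown into is returned to
# the OS when the child exits.
MaxRequestsPerChild 1000
```

Another approach along the same lines is Apache2::SizeLimit, which kills a child once its memory footprint crosses a threshold rather than after a fixed request count.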
> > If anyone has any ideas where to look or how to debug it, that'd be
> > great, but it kinda smells like something's leaking.
Nothing is leaking. Well, except the primary design of
perl. :-( Here's the problem:
When perl allocates memory for a variable, it doesn't release
it. Ever. Until it dies.
But in mod_perl, it never dies! So, say somebody loads a list
of 10,000 bugs. The memory for that 10,000 bugs is never released by
the perl interpreter, even though it destroys the variable when the
request ends.
If you know what specific variable is causing the problem, you
can "undef $var" before your subroutine exits, and that can handle it.
But you have to specifically undef it for perl to release it.
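A minimal sketch of that idiom (the subroutine and variable names here are made up for illustration):

```perl
use strict;
use warnings;

sub show_bug_list {
    # Imagine this came from the database: a big list of bugs.
    my @bugs = (1 .. 10_000);

    # ... format and print the list ...

    # Explicitly free the storage before returning. Without this,
    # perl keeps the allocation around for reuse even after @bugs
    # goes out of scope, and under mod_perl that memory stays in
    # the httpd child forever.
    undef @bugs;
}
```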
Ideally what we need is something that can walk the whole perl
memory structure and manually release the memory of unused variables.
But I haven't found such a thing, and I don't know enough about perl
internals to write such a thing.
Something like Devel::Gladiator might be a place to start.
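As a rough sketch of that idea, assuming Devel::Gladiator's arena_ref_counts() (which counts every live SV in the interpreter by type), you could snapshot before and after a request and diff the counts to see what grew:

```perl
use strict;
use warnings;
use Devel::Gladiator qw(arena_ref_counts);

# Count every type of SV currently live in this interpreter.
# Taking one snapshot before a request and one after, then
# comparing them, shows which kinds of values are accumulating.
my $counts = arena_ref_counts();
for my $type (sort { $counts->{$b} <=> $counts->{$a} } keys %$counts) {
    print "$type: $counts->{$type}\n";
}
```

This only tells you *what* is piling up, not where it was allocated, but it narrows the search considerably.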