Calling long_list.cgi from within cgi hangs server

Shane H. W. Travis travis at
Wed Oct 27 23:50:50 UTC 2004

I've got a local ISO requirement that we be able to archive bugs once a
project is over. They don't have to be removed from the database, but they
do have to be stored somewhere else in such a way that they can be read by
someone without special software. (Having to re-install a specific version
of Bugzilla qualifies as 'special software', so writing a mysqldump of the
database, a tarball of the current Bugzilla, and the binaries for MySQL on a
DVD just ain't gonna cut it. Sheesh - the unreasonableness of some people.)

For the bugs themselves, it's 'good enough' if they can be read by a
standard browser... so it seemed to me that the output of long_list.cgi
(including the local mods we've made to interleave comments with the output
of show_activity based on timestamps) would be perfect, and I got
them to agree on that too. So far, so good.

I can use 'wget' to call long_list.cgi from the command-line with no
problems at all. It gives me this sort of output:

/usr/bin/wget --output-document=/tmp/bug925.html
           => `/tmp/bug925.html'
Resolving done.
Connecting to[]:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 9,815 [text/html]

100%[====================================>] 9,815          9.36M/s    ETA

17:25:01 (9.36 MB/s) - `/tmp/bug925.html' saved [9815/9815]

The output document looks good -- exactly like it would if I pulled up
the page within Bugzilla. That's exactly what I'm after. HOWEVER... if I try
the exact same command from within a Perl script, the whole database hangs.
Locked. Dead.

Relevant code is:

my $command = "/usr/bin/wget --output-document=$outfile$id
1>/tmp/program.stdout 2>/tmp/program.stderr";

print (STDOUT "Command = $command<br><br>");
my $exit_status = system( $command );
print (STDOUT "Exit status 1 = $exit_status<br><br>");

The output from the first print command looks identical to the command
above. The command goes off and tries to execute... but then just never
comes back. Furthermore, the database is hooped completely; I can't log into
it, bugzilla stops working, etc. Apache is still running, and I can get into
another database just fine, but this one is locked down tight.
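If nothing else, I figure I can keep the archive script itself from wedging
by putting a hard deadline on the call -- a rough sketch, assuming GNU
coreutils `timeout` is available, with `sleep 60` standing in for the real
wget command line (it doesn't fix the hang, it just makes the script fail
fast instead of sitting there forever):

```shell
#!/bin/sh
# Rough sketch: bound the blocking call with a hard deadline so the
# archive script fails fast instead of hanging indefinitely.
# `sleep 60` stands in here for the real wget command line;
# GNU coreutils `timeout` exits with status 124 on a timeout.
timeout 1 sleep 60
status=$?
if [ "$status" -eq 124 ]; then
    echo "command timed out"
else
    echo "command finished with status $status"
fi
```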

Output of the stderr file shows the following:

[root at fwk-svr]# cat /tmp/program.stderr
[]:80... connected.
HTTP request sent, awaiting response...
Read error (Connection timed out) in headers.

  (try: 2) => `/tmp/bug925.html'
Connecting to[]:80... connected.
HTTP request sent, awaiting response... 500 Internal Server Error
17:20:34 ERROR 500: Internal Server Error.

I've fiddled with everything I can think of to get this to work, and it
won't. Once I run the archive Perl code, I'm screwed until I shut down mysqld
and restart it again. (Screwed for this database anyway; other databases
will still work.)

I'm wondering if there's some Apache configuration I'm not aware of
that I need, or if I'm going about this in entirely the wrong way, or if
there's a known MySQL bug... or something. Mostly, I'm hoping someone may
have run across this or heard of it before, or can offer some suggestions...
because I'm out of ideas.

MySQL = 3.32.54
Bugzilla = 2.16.7


Shane Travis            | An efficient and a successful administration
travis at    |   manifests itself equally in small as in
Saskatoon, Saskatchewan |   great matters.     -- Winston Churchill
