open()s, etc. that spawn other processes 

All-

I am trying to script the upgrade of our Sybase ASE servers (since we have
a boatload of them), and am running into some problems.  I am running perl
5.005_02 on HP-UX 10.20.  The problem is that when I open() processes that
spawn other processes, the script hangs, and the children appear as
<defunct> in a ps listing.  I have tried the following, with the
following results (much of the interesting stuff is going to STDERR in the
child process, so I have to capture that as well):
[All calls are followed by die() calls in the event of failure in the
actual scripts.]

1.      $ret = qx [ /path/to/a/script 2>&1 ]; print $ret;
        - The parent hangs, no output is given, and when I ctrl-C the parent, the
child dies.
2.      $ret = qx [ /path/to/a/script 2>&1 & ]; print $ret;
        - The parent hangs, no output is given, and when I ctrl-C the parent, the
child lives.
3.      $prog = qq [ /path/to/a/script ];
        open(FH, "$prog 2>&1 |");
        while (<FH>) {
                print;
        }
        close(FH);
        - The parent hangs, output is given until the end, and when I ctrl-C the
parent, the child dies.
4.      $prog = qq [ /path/to/a/script ];
        open(FH, "$prog 2>&1 & |");
        while (<FH>) {
                print;
        }
        close(FH);
        - The parent hangs, output is given until the end, and when I ctrl-C the
parent, the child lives.

I also tried fork + waitpid, but that hung as well, even though the child
finished.  I tried fork + exec for a child whose output I didn't need to
see, but I got an exec error saying that it couldn't replace itself with
the new script.  I finally fixed that by doing a system() call that closed
STDIN, STDOUT, and STDERR and backgrounded itself (roughly the call
sketched below).  That works well for the scripts whose output I don't
need to read, but it does not suit my purposes for the others.
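
Something along those lines -- a sketch only, with /dev/null redirection
standing in for closing the three streams and a placeholder path:

        # Fire-and-forget: the command gets /dev/null for stdin, stdout,
        # and stderr and is backgrounded, so nothing it spawns holds a
        # descriptor back to this perl process.
        system("/path/to/other/script </dev/null >/dev/null 2>&1 &") == 0
            or die "couldn't launch script: $?";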

I can't change the scripts that I'm calling, since they are standard Sybase
server upgrade scripts, and it would be nice if the script did not hang.
Does anyone have a better way of doing this?  Hitting my head against the
wall has not helped.  I looked in APP, the Camel, the Blue Llama, the Ram,
and the FAQs, to no avail.  Maybe the information is in there and I am too
inexperienced to see it, but they didn't help resolve this.  Any help
appreciated.  Thanks.

-JEV
--

"Go not unto the Usenet for advice, for you will be told both yea and nay (and
quite a few things that just have nothing at all to do with the question)."
(Seen in a .sig somewhere.)



Sat, 04 Aug 2001 03:00:00 GMT  
 open()s, etc. that spawn other processes

[Mailed and posted.]

Quote:
> 1. $ret = qx [ /path/to/a/script 2>&1 ]; print $ret;
>    - The parent hangs, no output is given,

This I can't explain without knowing what's going on in your script.
For example, your description is consistent with the script looking
like this:

        sleep 100000;

So maybe hanging is the correct behavior.  Or perhaps it is waiting
for input from somewhere, or waiting to obtain a lock or a semaphore;
who can tell?  But:

Quote:
> and when I ctrl-C the parent, the child dies.

That's what's supposed to happen.  They're in the same process group.

Quote:
> 2. $ret = qx [ /path/to/a/script 2>&1 & ]; print $ret;
>    - The parent hangs,

Maybe you thought that the & would prevent this, but it doesn't,
because qx[...] tells Perl to wait for the output, and that is what it
is going to do, & or no &.

Quote:
> when I ctrl-C the parent, the child lives.

Background processes are put into a different process group.  If you
were using the shell, and you ran two commands:

        sleep 100&
        sleep 200

and then typed control-C, you would expect it to kill the foreground
one, but not the background one.  Wouldn't you?  The same thing is
happening in your program.

Quote:
> 3. open(FH, "$prog 2>&1 |");
>    while (<FH>) {
>    - The parent hangs, output is given until the end, and when I
>         ctrl-C the parent, the child dies.

Now it sounds as though the child is not closing STDOUT or is not
closing STDERR.  (It isn't exiting, either, as this would close both
streams.)  But again, the problem is most likely in the auxiliary
script, and without knowing what it is doing it is impossible to
diagnose.
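
You can see the same effect with nothing but the shell and sleep standing
in for the upgrade script (a sketch, not your actual command):

        # Illustration only: sh exits immediately, but the backgrounded
        # sleep inherits the write end of the pipe, so the reader prints
        # "started" and then blocks until sleep 300 exits.
        open(FH, "sh -c 'echo started; sleep 300 &' |") or die "open: $!";
        print while <FH>;
        close(FH);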

Quote:
> I also tried fork + waitpid, but that hung as well, even though the child
> finished,

See, that's the part that puzzles me.  waitpid hung, but you know that
the child finished.  How do you know that the child finished?
Normally, blocking in waitpid is very strong evidence that the child
did *not* finish, so unless you present the missing evidence, it is
hard to avoid the conclusion that you are mistaken.

One thing you might try:

        system("/path/to/script 2>&1 >OUTPUT_FILE &");

Then at least your main program can continue on with the other things
it needs to do while it is waiting for the scripts to complete.  It
can read output (or partial output) from the OUTPUT_FILE whenever it
wants.  
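
A rough sketch of that pattern -- /path/to/script, OUTPUT_FILE, and the
upgrade_is_done() test are placeholders, and the seek is the usual trick
for re-reading a file that is still being written:

        system("/path/to/script >OUTPUT_FILE 2>&1 &");
        sleep 1 until -e "OUTPUT_FILE";      # wait for the shell to create the file
        open(LOG, "<OUTPUT_FILE") or die "can't read OUTPUT_FILE: $!";
        while (1) {
            while (<LOG>) { print }          # print whatever has arrived so far
            last if upgrade_is_done();       # hypothetical "are we finished?" test
            seek(LOG, 0, 1);                 # clear EOF so later reads see new data
            sleep 5;
        }
        close(LOG);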

But actually, I would say you are trying to solve the wrong problem.
You are asking what you can do to make Perl finish collecting the
output and continue.  But really the problem is that Perl cannot
finish collecting the output until the child script finishes writing
it, and it appears that the child never finishes writing.  If the
child is not exiting, you need to investigate why that is, and fix the
problem at the root.  Otherwise it is like the fellow with the kittens
hitched to his carriage, who thinks that his problem is that his whip
is too small.  You can whip Perl all you want, but it can't drag an
end-of-file condition out of a script that refuses to deliver one.

The other possibility is that there is a real bug in Perl.  I would be
reluctant to suppose this from your description of the problem so far.



Sat, 04 Aug 2001 03:00:00 GMT  
 open()s, etc. that spawn other processes

Quote:
> I am trying to script the upgrade of our Sybase ASE servers (since we have
> a boatload of them), and am running into some problems.  I am running perl
> 5.005_02 on HP-UX 10.20.  The problem is that when I open() processes that
> spawn other processes, the script hangs, and the children appear as
> <defunct> in a ps listing.  I have tried the following, with the
> following results (much of the interesting stuff is going to STDERR in the
> child process, so I have to capture that as well):
> [All calls are followed by die() calls in the event of failure in the
> actual scripts.]

I'm not entirely sure what you mean by "hang"; when using qx//, for the
parent to "hang" is not necessarily a bug in the parent -- more
accurately, the parent waits for the child to finish in order to harvest
its output.  Presumably you are interested in waiting for the child to
finish and harvesting its output, otherwise you wouldn't be using qx//.

Quote:
> 1. $ret = qx [ /path/to/a/script 2>&1 ]; print $ret;
>    - The parent hangs, no output is given, and when I ctrl-C the parent,
>         the child dies.

This seems quite straightforward; the only slight oddity is the
redirection of the child's stderr.  Does it work if you take out the
"2>&1"?  How about if you just run /path/to/a/script from a shell prompt
-- does it output some stuff and then terminate?  If so, then that's the
right kind of program to run with qx//; if it has some other execution
model (run for a long time with occasional output, detach to background,
wait for user input, etc.) then qx// is probably not the right thing to
use.

Quote:
> 2. $ret = qx [ /path/to/a/script 2>&1 & ]; print $ret;
>    - The parent hangs, no output is given, and when I ctrl-C the parent,
> the child lives.

It really doesn't make sense to try to background something that you're
running with qx//.  However, it seems to work for me exactly the same as
if the & weren't there... perhaps Perl catches this and suppresses the
'&'?  Weird -- it should return an empty string (I think).

Quote:
> 3. $prog = qq [ /path/to/a/script ];
>    open(FH, "$prog 2>&1 |");
>    while (<FH>) {
>            print;
>    }
>    close(FH);
>    - The parent hangs, output is given until the end, and when I ctrl-C the
> parent, the child dies.

Now I'm really not sure what you mean by "hang": you say the parent
hangs, then you imply that it outputs lines read from <FH>.  Which is
it?  Again, this looks sensible: you're just doing explicitly what qx//
makes implicit in your first try.
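
That is, modulo details, these two spellings do the same thing (the path
is a placeholder):

        # qx// is just a pipe-open plus a read-to-EOF:
        my $out = qx[/path/to/a/script 2>&1];           # implicit form

        open(FH, "/path/to/a/script 2>&1 |") or die "open: $!";
        my $out2 = join '', <FH>;                       # explicit form of the same thing
        close(FH);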

Quote:
> 4. $prog = qq [ /path/to/a/script ];
>    open(FH, "$prog 2>&1 & |");
>    while (<FH>) {
>            print;
>    }
>    close(FH);
>    - The parent hangs, output is given until the end, and when I ctrl-C the
> parent, the child lives.

And this seems to duplicate your #2, again making explicit what qx//
makes implicit.

I think you'll have to do a better job of explaining what you mean by
"hang" -- what do you *expect* to happen, and what *does* happen?  If
feasible, walk through one of these cases and describe the actual
behavior versus the expected behavior (kind of hard to illustrate with
multi-process programs, but give it a try).

Also, you might want to make things concrete by showing code that runs
an actual program that generates actual output, and is widely available.
'ls' is a good choice -- it's pretty much guaranteed to run for a short
time and generate some output, hence is a good candidate for qx//.  ;-)
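
For example, something as small as this should print a listing and come
straight back:

        # ls exits quickly on its own, so qx// gets its EOF and returns.
        my $listing = qx[ls -l 2>&1];
        print $listing;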

        Greg
--

Corporation for National Research Initiatives    
1895 Preston White Drive                      voice: +1-703-620-8990 x287
Reston, Virginia, USA  20191-5434               fax: +1-703-620-0913



Sat, 04 Aug 2001 03:00:00 GMT  
 open()s, etc. that spawn other processes
On Tue, 16 Feb 1999 15:27:16 -0600 (CST),

Quote:
>#######################
>  sybase 10532 10530  0 14:50:45 ttyp1     0:00 <defunct>
>  sybase 10530  9853  0 14:50:44 ttyp1     0:00 central_1192_upgrade.spl.open -w ./central_1192_upgrade.spl.
>  sybase 10544 10543  0 14:53:18 ttyp1     0:49 /var/opt/backup/sybdumps/sybase11/bin/dataserver -sDS_CENTRA
>  sybase 10543     1  0 14:53:18 ttyp1     0:00 sh -c /var/opt/backup/sybdumps/sybase11/bin/dataserver -sDS_
>#######################

>PID 10530 is my script.  It calls PID 10532, which calls PID 10543.  After
>all the output is supposed to be done, the ps output above is how the
>machine looks.  It looked like closing STDIN might have helped, but that
>turned out not to be the case.
...
>I'm not sure what I need to do to make sure that it comes back.  As
>far as I can tell, it has done everything that it is going to do,
>and it should be exiting nicely now.
...
>Now, I am probably wrong, but I would expect that if the child is in a
><defunct> state, it has completed, and is waiting for the parent to tell
>it to go away.  I hope I gave a more reasonable amount of information
>this time. *g*

Process 10532 itself exited nicely, but before it did it spawned
processes 10543 and 10544, which shared its stdout.  The qx// in
your perl program is waiting to read EOF from the stdout of the
pipe that it created for its child, but since there are still two
processes which _might_ (in principle) still have some output to
write down that pipe, the OS will not give your perl process an
EOF indication yet.

So, since you don't have enough control over the child to be able
to have it close/redirect its stdout before it calls system(),
I think you will need to work around this by doing qx//'s
work by hand, with some extra magic: do the pipe/fork/exec stuff
as normal, but do a select call before each read.  Have the select
call timeout after some suitable interval, and attempt a waitpid
on your child-of-immediate-interest.  If the child is dead, then
close the input pipe, even though you have not seen EOF.
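
A minimal sketch of that approach (the path is a placeholder; WNOHANG
comes from the POSIX module, and the four-argument select provides the
timeout):

        use POSIX ":sys_wait_h";

        # Hand-rolled pipe read with a select timeout and a non-blocking
        # waitpid, so the loop can stop once the immediate child has
        # exited even if grandchildren still hold the pipe open.
        my $pid = open(CHILD, "/path/to/a/script 2>&1 |")
            or die "can't start child: $!";

        my $rin = '';
        vec($rin, fileno(CHILD), 1) = 1;

        while (1) {
            my $rout;
            my $nfound = select($rout = $rin, undef, undef, 5);  # 5-second timeout
            if ($nfound) {
                my $buf;
                last unless sysread(CHILD, $buf, 4096);          # genuine EOF
                print $buf;
            }
            last if waitpid($pid, WNOHANG) > 0;  # child is gone: give up on EOF
        }
        close(CHILD);

Any output the grandchildren write after that point is simply abandoned,
which is the point of the workaround.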

                --Ken Pizzini



Sat, 11 Aug 2001 03:00:00 GMT  
 