close error when open is a pipe with no output (and a -w question) 

Damn, coming up with good subject lines is tough.

Here is the problem: if I open a filehandle that could potentially
have no output, the close returns false and my 'or die' gets incorrectly
(or at least undesirably) called.

Here are the details:

node113% perl -v

This is perl, version 5.003 with EMBED
        built under sunos
        + suidperl security patch

(I get the same result for the same version under Solaris 2.5.1)

Here is code that exhibits the problem:

#!/usr/local/bin/perl

open(TEMP,"grep asdf data |") or die "Open(asdf) error:$!\n";
while(<TEMP>) {
        chomp;  #some meaningless activity
}

close(TEMP) or die "Close(asdf) error:$!\n";

open(TEMP,"grep bob data |") or die "Open(bob) error:$!\n";
while(<TEMP>) {
        chomp;  #some meaningless activity
}

close(TEMP) or die "Close(bob) error:$!\n";

And the "data" file is:

node113% cat data
asdf

And finally the error message:

node113% perl -w seek.pl
Close(bob) error:No such file or directory

It is not intuitively obvious to me why this should error.  Additionally,
I have other programs that do the same sort of thing and fail with
different error codes.  The programs work fine if I remove the 'or die',
but that seems like a poor solution.

(The other program does an:

 open(TEMP,"ypcat something | grep something |") or die "$!\n";
 close(TEMP) or die "$!\n";

 And the error is:  Illegal seek.
)

Oh, and I have another question (sorry didn't want to make the subject
line too long).

I hate the "Use of uninitialized value" messages I get with -w.  I've read the
thread that came through here a few weeks ago, and the gist seemed to be
to predeclare the variable.  The problem I have is with an associative array.

If I want to build this array and I don't know the elements ahead of time, how
do I get around this?

i.e.  Let's build the array from the following data:

asdf
bob
fred
jane
asdf

#!/usr/local/bin/perl

%array = '';

open(TEMP,'data') or die "$!\n";

while(<TEMP>) {
  chomp;

  $array{$_} +=1;
}

close(TEMP) or die "$!\n";

When I do a perl -w I get:

node113% perl -w test.pl
Use of uninitialized value at test.pl line 10, <TEMP> chunk 1.
Use of uninitialized value at test.pl line 10, <TEMP> chunk 2.
Use of uninitialized value at test.pl line 10, <TEMP> chunk 3.
Use of uninitialized value at test.pl line 10, <TEMP> chunk 4.

What I have been doing is running my program with -w until I get rid of
everything except these warnings, and then dropping -w entirely, but
that seems like a poor workaround too.

I read the newsgroup, so feel free to simply post the reply.

(If you have other ideas and don't want to post, you can email them
and I'll summarize if people want).
--

Unix System Administrator, Medtronic Micro-Rel
"Subtlety is the art of saying what you think and getting out of the way
before it is understood."



Mon, 03 May 1999 03:00:00 GMT  
 close error when open is a pipe with no output (and a -w question)

: Oh, and I have another question (sorry didn't want to make the subject
                 ^^^^^^^^^^^^^^^^
: line too long).

A different subject should really get a different post with a different
Subject: line.

Otherwise, other folks interested in the -w "Uninitialized variable"
thing will not know that this article discussed it...

: I hate the "Use of uninitialized value" messages I get with -w.  I've read the
: thread that came through here a few weeks ago, and the gist seemed to be
: to predeclare the variable.  The problem I have is with an associative array.
: If I want to build this array and I don't know the elements ahead of time, how
: do I get around this?
: i.e.  Let's build the array from the following data:
: asdf
: bob
: fred
: jane
: asdf

: #!/usr/local/bin/perl

: %array = '';
: open(TEMP,'data') or die "$!\n";
: while(<TEMP>) {
:   chomp;

if (defined $array{$_}) {
:   $array{$_} +=1;
}
else {
   $array{$_} =1;
}

: }
: close(TEMP) or die "$!\n";
: When I do a perl -w I get:
: node113% perl -w test.pl
: Use of uninitialized value at test.pl line 10, <TEMP> chunk 1.
: Use of uninitialized value at test.pl line 10, <TEMP> chunk 2.
: Use of uninitialized value at test.pl line 10, <TEMP> chunk 3.
: Use of uninitialized value at test.pl line 10, <TEMP> chunk 4.

: I read the newsgroup, so feel free to simply post the reply.
  ^^^^^^^^^^^^^^^^^^^^

Cool!

: (If you have other ideas and don't want to post, you can email them
: and I'll summarize if people want).

Cool again!

--
  Tad McClellan,      Logistics Specialist (IETMs and SGML guy)



Tue, 04 May 1999 03:00:00 GMT  
 close error when open is a pipe with no output (and a -w question)


Quote:

>> Here is the problem, if I open a filehandle that could potentially have
>> no output, the close returns false and my or die gets incorrectly (or at
>> least undesirably) called.

>Hmm.  I don't, as a rule, check the return value of close for input files,
>only for output.

For pipes, close returns the command exit status --- true if the command was
healthy, false if the command bombed (something like !$?). So I could
reproduce the behavior Jot was describing if I opened "false|" but not if I
opened "true|".

So I think the answer is, if you want the program to die on non-zero (failed)
exit status from the program you're piping, then check the close(). If you
don't care about the exit status of the child program, then just call close()
and don't test whether it succeeds.
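
For example, a minimal sketch of that difference (untested, and assuming the
usual true and false utilities are on your PATH):

    # close() on a pipe reflects the child's exit status, not an I/O error
    open(OK,  "true |")  or die "open(true) failed: $!\n";
    close(OK)  or warn "true reported failure? \$? = $?\n";    # not reached

    open(BAD, "false |") or die "open(false) failed: $!\n";
    close(BAD) or warn "false reported failure, \$? = $?\n";   # typically $? = 256

Note that $! is meaningless in both cases; $? is the thing to look at.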

-Bennett



Tue, 04 May 1999 03:00:00 GMT  
 close error when open is a pipe with no output (and a -w question)

Quote:


>> Hmm.  I don't, as a rule, check the return value of close for input
>> files, only for output.
> For pipes, close returns the command exit status --- true if the command
> was healthy, false if the command bombed (something like !$?). So I
> could reproduce the behavior Jot was describing if I opened "false|" but
> not if I opened "true|".

Umm...not according to man perlfunc.

     close FILEHANDLE
             Closes the file or pipe associated with the file
             handle, returning TRUE only if stdio successfully
             flushes buffers and closes the system file
             descriptor.  You don't have to close FILEHANDLE if
             you are immediately going to do another open() on
             it, since open() will close it for you.  (See
             open().)  However, an explicit close on an input
             file resets the line counter ($.), while the
             implicit close done by open() does not.  Also,
             closing a pipe will wait for the process executing
             on the pipe to complete, in case you want to look at
             the output of the pipe afterwards.  Closing a pipe
             explicitly also puts the status value of the command
             into $?.

Don't check the return of close.  Check $?.
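
Something along these lines, say (an untested sketch, reusing the grep pipe
from the original post):

    open(TEMP, "grep asdf data |") or die "Open error:$!\n";
    while (<TEMP>) { chomp; }
    close(TEMP);
    warn "child exited with status ", $? >> 8, "\n" if $?;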

--



Tue, 04 May 1999 03:00:00 GMT  
 close error when open is a pipe with no output (and a -w question)

Quote:

> Damn, coming up with good subject lines is tough.

That was a fairly good one, though.  Thanks!

Quote:
> Here is the problem, if I open a filehandle that could potentially have
> no output, the close returns false and my or die gets incorrectly (or at
> least undesirably) called.

Hmm.  I don't, as a rule, check the return value of close for input files,
only for output.  close returns true if it successfully flushes the I/O
buffers associated with the file handle, and I don't care about that when
I'm reading, only when I'm writing.
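
(That's also why checking close matters when writing: with stdio buffering,
something like a full disk may only show up when the buffers get flushed at
close time.)  A quick sketch, using a made-up file name:

    open(OUT, "> /tmp/results") or die "open: $!\n";
    print OUT "lots of data\n";
    close(OUT) or die "close failed (disk full?): $!\n";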

That doesn't explain why you're getting the error you're getting, really,
but it may mean you don't have to worry about it.

Quote:
> open(TEMP,"grep bob data |") or die "Open(bob) error:$!\n";
> while(<TEMP>) {
>         chomp;  #some meaningless activity
> }
> close(TEMP) or die "Close(bob) error:$!\n";
> And the "data" file is:
> node113% cat data
> asdf
> And finally the error message:
> node113% perl -w seek.pl
> Close(bob) error:No such file or directory
> It is not intuitively obvious to me why this should error.
> Additionally, I have other programs that have the same sort of thing
> that fail with different error codes.  The programs work fine if I
> remove the 'or die' but that seems to be a poor solution.

As near as I can tell, you aren't actually getting $! set to anything
meaningful, since its contents appear to be fairly random.  On a Solaris 2.5
machine and on a Linux machine, I'm getting "Illegal seek" as the error.
Under SunOS and HP-UX, I get your "No such file or directory."  On AIX,
there's no error message at all.

Quote:
> I hate the Uninitialized variables message I get with -w.  I've read the
> thread that came through here a few weeks ago, and the gist seemed to be
> to predeclare the variable.  The problem I have is with an associative
> array.
> If I want to build this array and I don't know the elements ahead of
> time, how do I get around this?
> while(<TEMP>) {
>   chomp;
>   $array{$_} +=1;
> }

Just use:

        $array{$_}++;

instead and it will do the right thing.  The problem is that += 1 expands
to $array{$_} = $array{$_} + 1, which reads the uninitialized value on the
right-hand side; the ++ operator is specifically exempted from that warning.
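
For instance, the whole counting loop runs cleanly under -w like this (a
sketch, not actually tested against 5.003):

    #!/usr/local/bin/perl -w

    open(TEMP, 'data') or die "$!\n";
    while (<TEMP>) {
        chomp;
        $array{$_}++;        # no "uninitialized value" warning here
    }
    close(TEMP) or die "$!\n";
    foreach (sort keys %array) {
        print "$_: $array{$_}\n";
    }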

--



Tue, 04 May 1999 03:00:00 GMT  
 close error when open is a pipe with no output (and a -w question)

Quote:

>Here is the problem, if I open a filehandle that could potentially
>have no output, the close returns false and my or die gets incorrectly
>(or at least undesirably) called.

>open(TEMP,"grep asdf data |") or die "Open(asdf) error:$!\n";
>close(TEMP) or die "Close(asdf) error:$!\n";

close on a piped filehandle returns true if the command succeeded
(exited with status 0) and false otherwise.  In either case, $! is
meaningless -- there was no error in this process -- but the exit
status of the command is available in $?, just as if you'd used
system or backquotes.  (This means that when you test the return
value of close on a piped filehandle, you should do something like

    close FILEHANDLE or die "command failed: exit status $?\n";

rather than using $! in your die message.)

Now for your problem: it turns out that grep exits with status 0
(success) if there were any matching lines, and status 1 if there
were no matching lines.  This is to make shell scripting easier:

    if grep string some-file >/dev/null; then
        echo string was found in some-file!
    fi

But that return value isn't very useful to you in this context, so
you probably want to ignore it.  If you want to be really careful,
you could make sure grep didn't return 2 (which would signal a real
error):

    close FILEHANDLE;
    ($? >> 8) <= 1 or die "grep failed: exit status ", $? >> 8, "\n";

Note that grep's use of exit codes is documented in the manpage; if
you do check the return value from close on a piped filehandle, you
should probably check the manpage for whatever program was piped, to
make sure it doesn't use exit codes for things other than failure.
(There are a few other programs that do this -- for example, popclient
exits with status 1 if you have no new mail.)
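
Putting the pieces together, the original "grep bob" open might end up looking
something like this (just a sketch, with the file names from the original post):

    open(TEMP, "grep bob data |") or die "Open(bob) error:$!\n";
    while (<TEMP>) {
        chomp;   # some meaningless activity
    }
    close(TEMP);               # ignore close's own return value
    my $status = $? >> 8;      # grep: 0 = matches, 1 = no matches, >1 = real error
    die "Close(bob) error: grep exit status $status\n" if $status > 1;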
--



Wed, 05 May 1999 03:00:00 GMT  
 