Incomplete file I/O with large executables/large files
 Incomplete file I/O with large executables/large files

In writing large Perl programs (6000+ lines) for database file
manipulation (not using DBMs), I encountered a problem when
outputting the manipulated files.  Depending on the size of
the executable, the end of large files was often truncated,
leaving a file with an incomplete last line.  There did not
appear to be a lack of available RAM (via pstat) or a lack of
disk space.  The routines that manipulate and print these files
were included in the main program as (external) subroutines, which
WORKED with smaller input files, but not with larger ones.
When I converted the subroutines to standalone programs called
via "system", everything worked fine?!?!?  Is there an
upper limit on the size of the compiled executable / input file?

To clarify the problem: the files were often truncated by hundreds of
lines, sometimes stopping in the middle of a line, and always in the
same place relative to the size of the file.  If two lines were removed
at the beginning of the file, two more lines would be printed at the end.

If this is a known bug, is there any way around it, or is it just a
limitation (and if so, what is the limit, and what other factors are
involved)?

very odd and frustrating phenom...
Tom Mays, Lockheed Martin Flight Systems

P.S. I'm curious: how many people write complex programs in Perl?
I work with text file manipulation, so Perl is much easier than C, but
as the programs get more complex (client/server, subroutines, multi-dim
arrays, etc.) I'm running into more and more snags.  My problems may
stem from the fact that I'm still running Perl 4.0.  Sorry for being
long-winded, but I have limited access to the Internet!  Later



Sun, 28 Dec 1997 03:00:00 GMT  
 Incomplete file I/O with large executables/large files

Quote:

> In writing large Perl programs (6000+ lines) for database file
> manipulation (not using DBMs), I encountered a problem when
> outputting the manipulated files.  Depending on the size of
> the executable, the end of large files was often truncated,
> leaving a file with an incomplete last line.
> There did not appear to be a lack of available RAM (via pstat) or a
> lack of disk space.

Perl dies with a message if it's not able to allocate memory, so you
don't get weird problems from a memory shortage.

Quote:
> The routines that manipulate and print these files
> were included in the main program as (external) subroutines, which
> WORKED with smaller input files, but not with larger ones.
> When I converted the subroutines to standalone programs called
> via "system", everything worked fine?!?!?  Is there an
> upper limit on the size of the compiled executable / input file?

Nope. At least not within perl.

Quote:
> To clarify the problem: the files were often truncated by hundreds of
> lines, sometimes stopping in the middle of a line, and always in the
> same place relative to the size of the file.  If two lines were removed
> at the beginning of the file, two more lines would be printed at the end.

Is the file size a multiple of the block size? Perhaps your OS is
imposing a quota. Your shell may have a ulimit command that can be
used to adjust resource limits. On my system, ulimit -a says:

        core file size (blocks)  unlimited
        data seg size (kbytes)   unlimited
        file size (blocks)       unlimited
        max memory size (kbytes) unlimited
        stack size (kbytes)      8192
        cpu time (seconds)       unlimited
        pipe size (512 bytes)    8
        open files               64
        virtual memory (kbytes)  2105343

Other kinds of limits may apply on other types of systems.
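
If you want to test the quota theory from within Perl, here is a
minimal sketch (the filename and the candidate block sizes are just
assumptions for illustration):

        # Report whether the truncated file's size is an exact multiple
        # of a common block size; a clean multiple would hint at a quota
        # or resource limit cutting the write short.
        $size = (stat("output.dat"))[7];   # "output.dat" is a placeholder
        foreach $bs (512, 1024, 4096, 8192) {
            print "$size is a multiple of $bs\n" if $size % $bs == 0;
        }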

Quote:
> If this is a known bug, is there any way around it, or is it just a
> limitation (and if so, what is the limit, and what other factors are
> involved)?

I don't think it's either.

Quote:
> very odd and frustrating phenom...
> Tom Mays, Lockheed Martin Flight Systems

> P.S. I'm curious: how many people write complex programs in Perl?
> I work with text file manipulation, so Perl is much easier than C, but
> as the programs get more complex (client/server, subroutines, multi-dim
> arrays, etc.) I'm running into more and more snags.  My problems may
> stem from the fact that I'm still running Perl 4.0.  Sorry for being
> long-winded, but I have limited access to the Internet!  Later

I have a ~6500-line perl4 application which routinely does very complex
manipulations on megabytes of structured ASCII text.

I suggest that you temporarily change the way the file is written to
use syswrite and check the return status. If you still have problems,
and the code that does the writing is small, please post it here
(to comp.lang.perl.misc).
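
For example, a minimal sketch of that diagnostic (the filename and
@lines are placeholders, and the error handling is deliberately noisy):

        # Write each record with syswrite and verify the byte count it
        # returns, so a short or failed write is caught immediately
        # instead of surfacing later as a truncated file.
        open(OUT, ">output.dat") || die "open: $!\n";
        foreach $line (@lines) {
            $len = length($line);
            $n = syswrite(OUT, $line, $len);
            if (!defined($n) || $n != $len) {
                die "syswrite failed or was short: $!\n";
            }
        }
        close(OUT) || die "close: $!\n";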

Tim.



Mon, 29 Dec 1997 03:00:00 GMT  
 


 