Value too large for defined data type 
 Value too large for defined data type

I tried to open a file (plain text, 2.8 GB), but got the following error
message:

Value too large for defined data type

What does this mean? What is the solution and/or workaround?

I was using gcc 2.8.1 under Solaris 7.

Thanks.

Y.C. Tao

Sent via Deja.com http://www.deja.com/
Before you buy.



Sun, 22 Sep 2002 03:00:00 GMT  
 Value too large for defined data type

Quote:

> I tried to open a file (plain text, 2.8 GB), but got the following error
> message:

> Value too large for defined data type

> What does this mean?

Probably the size_t returned by your file reading function for the number of
characters read cannot hold 2.8 GB.
e.g. I have this in limits.h in my (Windows) compiler:

#define _I32_MAX      2147483647i32 /* maximum signed 32 bit value */

(my size_t is a 32-bit int, i.e. a 2 GB limit)

Quote:
> What is the solution and/or workaround?

You will have to read in the file in smaller blocks.
OTOH if it is as you say the _opening_ function and not the _reading_
function that is doing this (as in something in the FILE struct), you will
have to break up the file beforehand into smaller files using OS calls.
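For the record, the splitting can be done with the standard split(1) utility; the filename and the piece size below are just illustrative:

```shell
# Break the big file into pieces small enough for 32-bit offsets,
# then process the pieces one at a time. Name and size are examples.
split -b 1000m inputfile part_
ls part_*
```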

\/\/\/*= Martin



Sun, 22 Sep 2002 03:00:00 GMT  
 Value too large for defined data type


Quote:
> You will have to read in the file in smaller blocks.

Clearly this won't work, as the internal representation of the file by C will
not be able to hold the correct offset and EOF location _if_ it is using
32-bit unsigned ints for these purposes. Your only hope seems to be to use
shell commands to fragment the file into several smaller files. Another way
would be to compress the file before reading it in and then decompress it in
memory, but that's just putting off the evil day when you want to read in a
400 GB file all at once. If your system let you create a file that big in
the first place, it ought to let you load it too, but not necessarily using
standard C.
\/\/\/*= Martin


Sun, 22 Sep 2002 03:00:00 GMT  
 Value too large for defined data type
There are some UNIX commands that actually work well with huge files,
such as more, head, etc. I am wondering how these commands are
implemented in the first place; don't these commands have to "open" the
file too?



Quote:



> > You will have to read in the file in smaller blocks.

> Clearly this won't work, as the internal representation of the file by C will
> not be able to hold the correct offset and EOF location _if_ it is using
> 32-bit unsigned ints for these purposes. Your only hope seems to be to use
> shell commands to fragment the file into several smaller files. Another way
> would be to compress the file before reading it in and then decompress it in
> memory, but that's just putting off the evil day when you want to read in a
> 400 GB file all at once. If your system let you create a file that big in
> the first place, it ought to let you load it too, but not necessarily using
> standard C.
> \/\/\/*= Martin



Mon, 23 Sep 2002 03:00:00 GMT  
 Value too large for defined data type

comp.lang.c:

Quote:
> I tried to open a file (plain text, 2.8 GB), but got the following error
> message:

> Value too large for defined data type

> What does this mean? What is the solution and/or workaround?

> I was using gcc 2.8.1 under Solaris 7.

> Thanks.

> Y.C. Tao

When did you get this message?  When compiling?  When linking?  At run
time?

Here is the answer:  there is something wrong in your code.  Without
seeing your code nobody can guess what it is.

Jack Klein
--
Home: http://jackklein.home.att.net



Mon, 23 Sep 2002 03:00:00 GMT  
 Value too large for defined data type

Quote:

> When did you get this message?  When compiling?  When linking?  At run
> time?

> Here is the answer:  there is something wrong in your code.  Without
> seeing your code nobody can guess what it is.

He probably got it when compiling.  I've seen these errors when I've tried
to put an unsigned value in a signed one (i.e., there's one less bit to play
with).  For example, 145 is too big for a signed char, although it fits into
an unsigned one.

--
Rich Teer

NT tries to do almost everything UNIX does, but fails - miserably.

The use of Windoze cripples the mind; its use should, therefore, be
regarded as a criminal offence.  (With apologies to Edsger W. Dijkstra)

If it ain't analogue, it ain't music.

Voice: +1 (250) 763-6205
WWW: www.rite-group.com



Mon, 23 Sep 2002 03:00:00 GMT  
 Value too large for defined data type

wrote in comp.lang.c:

Quote:

> > When did you get this message?  When compiling?  When linking?  At run
> > time?

> > Here is the answer:  there is something wrong in your code.  Without
> > seeing your code nobody can guess what it is.

> He probably got it when compiling.  I've seen these errors when I've tried
> to put an unsigned value in a signed one (i.e., there's one less bit to play
> with).  For example, 145 is too big for a signed char, although it fits into
> an unsigned one.

I know he probably got it when compiling, but his post implies he got
it while running the program and attempting to read a text file.  In
any case, nobody can help him without seeing the code which produced
the error.

Jack Klein
--
Home: http://jackklein.home.att.net



Mon, 23 Sep 2002 03:00:00 GMT  
 Value too large for defined data type
Actually, this is a run-time problem. The input file contains genomic
sequences (that's why it is so huge). When I first tried to take that
file as the input for a standard genomic sequence analysis program
(BLAST), it failed to run. I checked the error log and found that the
program was not able to open the input file. At first I thought that
there might be some error in the format of the input file which BLAST
could not recognize, so I wrote a small Perl script to make sure the
format was correct. The script runs well on smaller files (even files as
big as 1.7 GB), but it failed on the larger input file and gave me the
above error message. Then I wrote a simple C program, basically
just open("inputfile", "r"); it compiled fine, but when I ran it, again
the same problem. It seems that this is a standard Solaris error
message; I have been trying to find a way to get around it. YC Tao




Mon, 23 Sep 2002 03:00:00 GMT  
 Value too large for defined data type
Since you also sent this to a Solaris group, I'm assuming that you
are running on Solaris.  You might try the limit command.
Here's what mine is

dragon 30 % limit
cputime         unlimited
filesize        unlimited
datasize        2097148 kbytes
stacksize       8192 kbytes
coredumpsize    unlimited
vmemoryuse      unlimited
descriptors     64

You might need to set the filesize to unlimited, or try all at
unlimited.

Quote:

> I tried to open a file (plain text, 2.8 GB), but got the following error
> message:

> Value too large for defined data type

> What does this mean? What is the solution and/or workaround?

> I was using gcc 2.8.1 under Solaris 7.

> Thanks.

> Y.C. Tao

> Sent via Deja.com http://www.deja.com/
> Before you buy.

--
|+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++|
| "I don't have the chicken pox anymore daddy, see.  They didn't    |
|  like me and flew away!" My daughter, age 3                       |
|+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++|


Mon, 23 Sep 2002 03:00:00 GMT  
 Value too large for defined data type

Quote:

>Actually, this is a run-time problem. The input file contains genomic
>sequences (that's why it is so huge). When I first tried to take that
>file as the input for a standard genomic sequence analysis program
>(BLAST), it failed to run. I checked the error log and found that the
>program was not able to open the input file. At first I thought that
>there might be some error in the format of the input file which BLAST
>could not recognize, so I wrote a small Perl script to make sure the
>format was correct. The script runs well on smaller files (even files as
>big as 1.7 GB), but it failed on the larger input file and gave me the
>above error message. Then I wrote a simple C program, basically
>just open("inputfile", "r"); it compiled fine, but when I ran it, again
>the same problem. It seems that this is a standard Solaris error
>message; I have been trying to find a way to get around it. YC Tao

If your file is > 2GB, you need to compile your program following the
instructions on the lfcompile man page on Solaris 2.6 & later.

--
________________________________________________________________________

Univ. of California at Berkeley         http://soar.Berkeley.EDU/~alanc/



Mon, 23 Sep 2002 03:00:00 GMT  
 Value too large for defined data type

Quote:

> Actually, this is a run-time problem.

<snippety>

Quote:
>  I checked the error log and found that the program was not able to
>  open the input file.

<snip>

Quote:
> The script runs well on smaller files (even files as big as 1.7GB),
> but it failed on the larger input file, and it gives me the above
> error message. Then I wrote a simple C program, basically just
> open("inputfile", "r"),

You need to use open64 or the O_LARGEFILE flag to open files above
2 GB. See the man pages for open(2) and lf64(5).
This works at least on Solaris 7 and above. I have no easy access to a
2.6 machine right now so I'm not sure if it will work on that but it
probably will...

HTH,
--
                        /Stefan

Life - the ultimate practical joke



Wed, 25 Sep 2002 03:00:00 GMT  
 Value too large for defined data type
Would anyone happen to have any examples of this actually working?  I'm
working on a similar problem.  I can manage to open the file, but using
operations from string.h (e.g. strstr) I get a segmentation violation.
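One way around that (a sketch, not from the thread: the helper name and chunk size are made up) is to search in bounded chunks, carrying an overlap of strlen(pattern)-1 bytes between reads so a match straddling a chunk boundary is still found; strstr then only ever sees a NUL-terminated buffer of known size. This assumes text data with no embedded NUL bytes and a non-empty pattern:

```c
#include <stdio.h>
#include <string.h>

/* Return the file offset of the first occurrence of pat, or -1.
 * Reads in 1 MiB chunks so a multi-GB file never has to fit in memory. */
long long find_in_file(FILE *fp, const char *pat) {
    enum { CHUNK = 1 << 20 };
    static char buf[CHUNK + 1];   /* +1 for the terminating NUL */
    size_t plen = strlen(pat);    /* assumed non-empty */
    size_t keep = plen - 1;       /* overlap carried between chunks */
    size_t have = 0;              /* bytes already in buf */
    long long base = 0;           /* file offset of buf[0] */

    for (;;) {
        size_t n = fread(buf + have, 1, CHUNK - have, fp);
        size_t total = have + n;
        buf[total] = '\0';

        char *hit = strstr(buf, pat);
        if (hit)
            return base + (long long)(hit - buf);
        if (n == 0)
            return -1;            /* EOF and no match */

        /* Slide the last keep bytes to the front for the next round. */
        size_t shift = total > keep ? total - keep : 0;
        memmove(buf, buf + shift, total - shift);
        base += (long long)shift;
        have = total - shift;
    }
}
```

The offsets are returned as long long so they stay correct past 2 GB; for a file opened with the large-file interfaces the same loop works unchanged, since it never seeks.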

noah


Quote:

> > Actually, this is a run-time problem.

> <snippety>

> >  I checked the error log and found that the program was not able to
> >  open the input file.

> <snip>

> > The script runs well on smaller files (even files as big as 1.7GB),
> > but it failed on the larger input file, and it gives me the above
> > error message. Then I wrote a simple C program, basically just
> > open("inputfile", "r"),

> You need to use open64 or the flag O_LARGEFILE to open for files above
> 2GB. See the man-pages for open(2) and lf64(5).
> This works at least on Solaris 7 and above. I have no easy access to a
> 2.6 machine right now so I'm not sure if it will work on that but it
> probably will...

> HTH,
> --
> /Stefan

> Life - the ultimate practical joke



Fri, 22 Nov 2002 03:00:00 GMT  
 