Why "out of memory"? 
 Why "out of memory"?

I'm still trying to determine why my perl 5.6 program dies with an "Out of
memory!" error. It's creating some rather large data structures (probably
around 1.5GB if it completed). Its memory usage is about 965 MB when it
dies, even though another program on the same system (HP 5000, 4GB real
memory, 4GB swap) is running at just over 1GB. I checked all the relevant
kernel values and they are well above these numbers. I used the environment
variable PERL_DEBUG_MSTATS to get perl to dump some memory statistics; its
output is below. Does anyone know how to read the results? Does it point to
a particular kernel parameter that is being exceeded? In the past, increasing
maxdsiz fixed programs that had memory problems. Now that it is about
3.2GB, I'm surprised that a 965MB program has problems. Thanks for any help.
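
(For reference, the statistics below were produced with something like the
following; a sketch assuming a csh-style shell to match the prompts quoted
later, and "myprog.pl" is a stand-in for my program. A PERL_DEBUG_MSTATS
value above 1 makes perl dump statistics after compilation as well as at
exit:)

    setenv PERL_DEBUG_MSTATS 2
    perl myprog.pl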

Quote:
From /stand/system (kernel parameters):

* Tunable parameters

STRMSGSZ        65535
default_disk_ir 1
maxdsiz         3221225472
maxdsiz_64bit   3800000000
maxfiles        1024
maxfiles_lim    2048
maxssiz         83570688
maxssiz_64bit   83570688
maxswapchunks   4800
maxtsiz         1073741824
maxuprc         150
maxusers        250
ninode          5000
nproc           3000
nstrpty         60

Quote:
From "sysdef":

NAME                      VALUE       BOOT        MIN-MAX        UNITS   FLAGS
maxdsiz                  766029          -          0-655360     Pages   -
maxdsiz_64bit            927734          -        256-1048576    Pages   -
maxfiles                   1024          -         30-2048               -
maxfiles_lim               2048          -         30-2048               -
maxssiz                   20403          -          0-655360     Pages   -
maxssiz_64bit             20403          -        256-1048576    Pages   -
maxswapchunks              4800          -          1-16384              -
maxtsiz                  262144          -          0-655360     Pages   -
maxtsiz_64bit            262144          -        256-1048576    Pages   -
maxuprc                     150          -          3-                   -
maxvgs                       10          -           -                   -

Version :

trj5000c  (gkd) 9:56am ~ [504] % /trj5000a/users1/gnu/perl5.6/bin/perl -v

This is perl, v5.6.0 built for 9000/777-hpux

Copyright 1987-2000, Larry Wall

(just started program . . .)

Memory statistics :

Memory allocation statistics after compilation: (buckets 4(4)..16376(16384)
   32752 free:   191    74    34    23     2   1   3     2   1 0 0 1
              396    94    42    14     8
   80732 used:    64    53   470    70    14   7   9    12   2 2 1 0
              115    76   213   238    17
Total sbrk(): 117984/18:166. Odd ends: pad+heads+chain+tail: 1248+1204+0+2048.

(program running along . . .)

Read 1740000 lines
Read 1750000 lines
Read 1760000 lines
Out of memory!
Memory allocation statistics after execution:   (buckets 4(4)..8392696(8388608)
   11600 free:   152    37    11    28     5   6   3     0   0 0 0 0 0 0 0 0 0 0 0 0 0
              146     4    77    16    10
975739352 used: 877558 1285076 3356944 693473 107611 1722 4781 420038   3 1 1 3 0 0 0 0 0 1 1 0 1
            1281442 6823966 285948 4420274 452265
Total sbrk(): 987946208/312:459. Odd ends: pad+heads+chain+tail: 1248+12194008+0+0.



Tue, 14 Oct 2003 21:35:22 GMT  
 Why "out of memory"?

Quote:

>  my perl 5.6 program dies with an "Out of
> memory!" error. It's creating some rather large data structures (probably
> around 1.5GB if it completed). Its memory usage is about 965 MB when it
> dies

When perl resizes arrays it always increases the allotment by a
factor of two.  Is the data in arrays? What happens if you rewrite
it to use linked lists?
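
A minimal sketch of the kind of thing I mean (hypothetical; node layout
and input are made up for illustration):

    my @input = (1 .. 5);                 # stand-in for the real data
    # each node is a two-slot array ref: [ value, next-node ]
    my ($head, $tail);
    for my $val (@input) {
        my $node = [ $val, undef ];
        if ($tail) { $tail->[1] = $node } else { $head = $node }
        $tail = $node;
    }
    # walk the list from the front
    for (my $n = $head; $n; $n = $n->[1]) {
        print $n->[0], "\n";
    }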

--

                           Vostok lake is The Land That Time Forgot



Sat, 18 Oct 2003 04:49:06 GMT  
 Why "out of memory"?

Quote:

>>  my perl 5.6 program dies with an "Out of
>> memory!" error. It's creating some rather large data structures (probably
>> around 1.5GB if it completed). Its memory usage is about 965 MB when it
>> dies
> When perl resizes arrays it always increases the allotment by a
> factor of two.

That's not quite true. Hashes double. Arrays increase to a size equal
to ((orig_size * .2) + new_top_element). (Though I was under the doubling
misapprehension for a while too...)
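
To put rough numbers on that (my arithmetic from the rule above, not code
from the perl source):

    # extending a 100-slot array by one element (hypothetical sizes)
    my $orig_size = 100;
    my $new_top   = 101;                               # requested new size
    my $new_alloc = int($orig_size * 0.2) + $new_top;  # 121 slots, not 200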

                                        Dan



Sat, 18 Oct 2003 09:49:35 GMT  
 Why "out of memory"?


Quote:

>>  my perl 5.6 program dies with an "Out of
>> memory!" error. It's creating some rather large data structures (probably
>> around 1.5GB if it completed). Its memory usage is about 965 MB when it
>> dies

>When perl resizes arrays it always increases the allotment by a
>factor of two.  Is the data in arrays? What happens if you rewrite
>it to use linked lists?

Linked lists?

unix% perldoc -q linked
=head1 Found in /usr/local/lib/perl5/5.00502/pod/perlfaq4.pod

=head2 How do I handle linked lists?

In general, you usually don't need a linked list in Perl, since with
regular arrays, you can push and pop or shift and unshift at either end,
or you can use splice to add and/or remove an arbitrary number of elements
at arbitrary points.  Both pop and shift are O(1) operations on perl's
dynamic arrays.  In the absence of shifts and pops, push in general
needs to reallocate on the order of every log(N) times, and unshift will
need to copy pointers each time.

If you really, really wanted, you could use structures as described in
L<perldsc> or L<perltoot> and do just what the algorithm book tells you
to do.
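
For example (a quick sketch of those operations; not part of the FAQ text):

    my @a = (1, 2, 3);
    push    @a, 4;            # add at the end:   (1 2 3 4)
    unshift @a, 0;            # add at the front: (0 1 2 3 4)
    my $last  = pop   @a;     # removes the 4
    my $first = shift @a;     # removes the 0
    splice(@a, 1, 1, 8, 9);   # replace one middle element with two: (1 8 9 3)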
--
See http://www.inwap.com/ for PDP-10 and "ReBoot" pages.



Sat, 18 Oct 2003 13:16:53 GMT  
 Why "out of memory"?
I have 2 files I'm trying to compare. I load each file into its own hash
with the following structure. I wouldn't be surprised if this is not the
densest structure, but I still think my kernel parameters allow data sizes
up to about 3.8GB. I can probably rewrite to use arrays of the values
instead of individual containers for them. If that doesn't work, I might try
a linked list approach, although at that point I'll probably give up. Thanks
for the info.

$poleres{$net} = {    # entries exist for 300,000 nets
    WC => {
        PM => [ $opin, $cnear, $r, $cfar ],
        PR => { $ipin => [ $p1, $p2, $p3, $r1, $r2, $r3 ] },  # multiple PRs per net
    },
    BC => {
        # (same as under WC)
    },
};
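
One denser layout might pack the six PR values into a single scalar
instead of a six-element array of scalars, e.g. (untested sketch using the
variables from the structure above; 'd6' assumes the values are numeric
and that double precision is acceptable):

    # one packed scalar per input pin instead of one array ref + six SVs
    $poleres{$net}{WC}{PR}{$ipin} = pack 'd6', $p1, $p2, $p3, $r1, $r2, $r3;
    # unpack on access
    my ($p1, $p2, $p3, $r1, $r2, $r3) =
        unpack 'd6', $poleres{$net}{WC}{PR}{$ipin};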

Quote:



>Linked lists?

>unix% perldoc -q linked
>=head1 Found in /usr/local/lib/perl5/5.00502/pod/perlfaq4.pod

>=head2 How do I handle linked lists?

>In general, you usually don't need a linked list in Perl, since with
>regular arrays, you can push and pop or shift and unshift at either end,
>or you can use splice to add and/or remove an arbitrary number of elements
>at arbitrary points.  Both pop and shift are O(1) operations on perl's
>dynamic arrays.  In the absence of shifts and pops, push in general
>needs to reallocate on the order of every log(N) times, and unshift will
>need to copy pointers each time.

>If you really, really wanted, you could use structures as described in
>L<perldsc> or L<perltoot> and do just what the algorithm book tells you
>to do.
>--
>See http://www.inwap.com/ for PDP-10 and "ReBoot" pages.



Sat, 18 Oct 2003 19:53:43 GMT  
 Why "out of memory"?
In comp.lang.perl.moderated,

Quote:




> >>  my perl 5.6 program dies with an "Out of memory!" error. It's
> >> creating some rather large data structures (probably around 1.5GB
> >> if it completed). Its memory usage is about 965 MB when it dies

> >When perl resizes arrays it always increases the allotment by a
> >factor of two.  Is the data in arrays? What happens if you rewrite
> >it to use linked lists?

> Linked lists?

> unix% perldoc -q linked
> =head1 Found in /usr/local/lib/perl5/5.00502/pod/perlfaq4.pod

> =head2 How do I handle linked lists?

> In general, you usually don't need a linked list in Perl, [...]

In general, you don't encounter "Out of memory!" errors in Perl, and
in general, you don't deal with 1.5GB of data in Perl. I have a hard
time believing that the intent of that FAQ entry was to say "Do not
try using linked lists when debugging memory allocation problems".

  -Rich

--
Rich Lafferty ----------------------------------------
 Nocturnal Aviation Division, IITS Computing Services
 Concordia University, Montreal, QC



Sat, 18 Oct 2003 22:14:27 GMT  
 Why "out of memory"?
[A complimentary Cc of this posting was sent to Gregory K. Deal]

Quote:
> I have 2 files I'm trying to compare. I load each file into its own hash
> with the following structure. I wouldn't be surprised if this is not the
> densest structure, but I still think my kernel parameters allow data sizes
> up to about 3.8GB.

a) Newer perls provide much more verbose info on a failing malloc()
   (with Perl's malloc, which you are using);

b) The maximal allocated chunk in your program is 8M long, so even
   with doubling, the failing request is for 16M; peanuts;

c) I have a report that using Perl's malloc() instead of the system's
   saved 1.8G of memory image, so people *do* use Perl to work
   with memory images of circa 10G;

d) Linked lists in Perl would (by my estimate) use *much more*
   memory (5..10x?) than Perl arrays;

e) Why not *test* memory on your machine? (untested)
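
(The test one-liner itself was lost from this archive. Judging from the
output quoted in the follow-up, where each printed "1" marks another ~8 MB
appended, it was something of this general shape; a hypothetical
reconstruction, not Ilya's exact line:)

    perl -we '$|=1; $x = 1 x 8e6; while (1) { $x .= 1 x 8e6; print 1 }'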


Ilya

P.S.  I vaguely remember mention of Perl's malloc() failing on
      allocations of 2G-long chunks (explainable with older Perls).
      Does this happen with newer Perls?  Any details?



Sun, 19 Oct 2003 04:26:32 GMT  
 Why "out of memory"?
I tried the memory-test line from "e)" (quoted below) and got a similar
failure, dying at just under 1GB. Since I couldn't get anyone at HP to
diagnose this, I'm trying to determine what perl is actually doing when it
hits this error. Is it doing a 'malloc' for data memory, running out of
stack space, etc.? If I can determine which kernel parameter is being
exceeded, I might have a chance of modifying it or of discussing it with HP
support. Is there any way to use the PERL_DEBUG_MSTATS data provided
earlier to determine this? Thanks.

j5000c  (ip3pd) 8:11am /home/autks [79] % perl -we '$|=1; $x = 1 x 8e6;

1111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111Out of memory!

Quote:

>[A complimentary Cc of this posting was sent to Gregory K. Deal]



Quote:
>> I have 2 files I'm trying to compare. I load each file into its own hash
>> with the following structure. I wouldn't be surprised if this is not the
>> densest structure, but I still think my kernel parameters allow data sizes
>> up to about 3.8GB.
>e) Why not *test* memory on your machine? (untested)


>Ilya



Sun, 19 Oct 2003 20:22:18 GMT  
 Why "out of memory"?

Quote:

> I'm still trying to determine why my perl 5.6 program dies with an "Out of
> memory!" error.
> This is perl, v5.6.0 built for 9000/777-hpux

so this is perl on HP-UX 10.20 or 11?

The problem of malloc()ing large amounts of memory came up some time ago
on the HPUX-Devtools list.
Below is the message from that list.

HTH

Martin

Message from HPUX-Dev-Tools-List:

Hello Norbert,

Quote:
> I try to allocate all the memory in a loop (mem_eater) to test the
> behavior of another application.
> Using new or malloc(), the call fails after 943 MB even though 2 GB of
> memory is available.
> This is true on all our HP machines.
> What is the reason for this behavior? Any kernel parameter?
> I invoked sam to find this 943 MB limit.
> I attach a sample called 'mem_eat.cpp'

Look for the kernel parameters maxdsiz (heap) and maxssiz (stack) and
(if it is a 64-bit program) maxdsiz_64bit and maxssiz_64bit.

943 MB is close to 1 GB. If it is a 32-bit program it cannot allocate
more than 1 GB from the heap; see

  /usr/share/doc/proc_mgt.txt

You have 4 quadrants of 1 GB each: one for text, one for data, and
two for shared memory.

If you need more than 1 GB of data you can make your executable use
the text quadrant for data too with

  chatr -N <executable>

If you still don't have enough, try

  chatr +q3p enable <executable>

and

  chatr +q4p enable <executable>

See man chatr. And make sure maxdsiz is large enough then :-). Or
simply compile 64-bit, then you don't have the 1 GB limits.

Regards

Mario
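
Applied to the perl binary from earlier in this thread, that might look
like the following (hypothetical session; a bare "chatr <file>" just
prints the current settings):

    chatr /trj5000a/users1/gnu/perl5.6/bin/perl        # inspect current flags
    chatr -N /trj5000a/users1/gnu/perl5.6/bin/perl     # let data use the text quadrant
    chatr +q3p enable /trj5000a/users1/gnu/perl5.6/bin/perl   # if still not enough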



Sun, 19 Oct 2003 20:45:00 GMT  
 Why "out of memory"?
[A complimentary Cc of this posting was sent to Gregory K. Deal]

Quote:
> I tried the test memory line from "e)" below and I get a similar failure,
> dying at just under 1GB. Since I couldn't get anyone at HP to diagnose, I'm
> trying to determine what perl is actually doing when getting this error.

It is sbrk()ing.  I think it is sbrk()ing for 30M (3% of the
footprint) - but I also expect it to retry with 8M when 30M sbrk() fails.

Quote:
> Is
> it doing a 'malloc' for data memory, running out of stack size, etc? If I
> can determine which kernel parameter is being exceeded, I might have a
> chance with modifying them or discussing with HP support. Is there any way
> to use the PERL_DEBUG_MSTATS data provided before to determine this? Thanks.

Lemme see...  No, it is not retrying with a smaller size.  And the
newer Perls do not report the actual parameter to sbrk(), only the
(approximate) argument to malloc().

Recompile perl with -g, and stop on this line (in Perl's malloc.c):

    } else if (cp == (char *)-1) { /* no more room! */
        ovp = (union overhead *)emergency_sbrk(needed);
        if (ovp == (union overhead *)-1)
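
(If gdb or HP's wdb is available, something along these lines; NNNN is a
placeholder for wherever that test lands in your build of malloc.c, and
"myprog.pl" stands in for the real program:)

    gdb /trj5000a/users1/gnu/perl5.6/bin/perl
    (gdb) break malloc.c:NNNN      # the "no more room!" line above
    (gdb) run myprog.pl
    (gdb) print needed             # size of the request that failed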

Hope this helps,
Ilya



Mon, 20 Oct 2003 05:10:47 GMT  
 Why "out of memory"?

Quote:




> >>  my perl 5.6 program dies with an "Out of
> >> memory!" error. It's creating some rather large data structures (probably
> >> around 1.5GB if it completed). Its memory usage is about 965 MB when it
> >> dies

> >When perl resizes arrays it always increases the allotment by a
> >factor of two.  Is the data in arrays? What happens if you rewrite
> >it to use linked lists?

> Linked lists?

hell -- rewrite it in C with a very dense object.


Sun, 26 Oct 2003 11:46:00 GMT  
 