gnat and heap size 
 gnat and heap size

Hi,

We want to use large arrays (well, large for us: 10000 x 10000 complex
numbers). We are using gnat 3.13p on Solaris 7. We run out of heap (heap
exhausted) and have not yet found a way to increase it.

Thanks.

P.S. Please reply by e-mail.



Sun, 14 Mar 2004 02:29:48 GMT  
 gnat and heap size

Quote:
>We want to use large arrays (well, large for us: 10000 x 10000 complex
>numbers). We are using gnat 3.13p on Solaris 7. We run out of heap (heap
>exhausted) and have not yet found a way to increase it.

By my math, that's probably going to be using at least 800MB. Do you actually
have that much RAM in your system? It would have to be one kick'n system to have
nearly a gig of contiguous free RAM available on the heap...

---
T.E.D.    homepage   - http://www.telepath.com/dennison/Ted/TED.html



Sun, 14 Mar 2004 04:46:51 GMT  
 gnat and heap size
It would help to know what hardware/OS this is trying to run on since this
sounds like it is probably running up against some system limitation. Things
like VMS had virtual memory restricted by various settable parameters, so it
*might* run if the quotas were increased appropriately. But you'd need to
know the system to make the recommendations.

It wouldn't necessarily have to be a kick-a** machine to run something
needing 800MB of memory. Assuming it had virtual memory and sufficient disk
space, you might be able to get it to work. Admittedly, the page swapping would
probably kill you, but it might actually run in a finite amount of time.
Virtual memory can be a cool thing if you're not in a big rush.

If virtual memory didn't exist, we'd have to invent it.

MDC
--
Marin David Condic
Senior Software Engineer
Pace Micro Technology Americas    www.pacemicro.com
Enabling the digital revolution

Web:      http://www.mcondic.com/


Quote:

> >We want to use large arrays (well, large for us: 10000 x 10000 complex
> >numbers). We are using gnat 3.13p on Solaris 7. We run out of heap (heap
> >exhausted) and have not yet found a way to increase it.

> By my math, that's probably going to be using at least 800MB. Do you actually
> have that much RAM in your system? It would have to be one kick'n system to have
> nearly a gig of contiguous free RAM available on the heap...

> ---
> T.E.D.    homepage   - http://www.telepath.com/dennison/Ted/TED.html




Sun, 14 Mar 2004 05:15:31 GMT  
 gnat and heap size

Quote:

>It would help to know what hardware/OS this is trying to run on since this
...
>It wouldn't necessarily have to be a kick-a** machine to run something
>needing 800MB of memory. Assuming it had virtual memory and sufficient disk
>space, you might be able to get it to work. Admittedly, the page swapping would
>probably kill you, but it might actually run in a finite amount of time.

Well, he did say Solaris 7. That narrows it down a bit. If it's one of those new
Starcat servers, it might even have way more than that just in RAM alone....

---
T.E.D.    homepage   - http://www.telepath.com/dennison/Ted/TED.html



Sun, 14 Mar 2004 05:49:42 GMT  
 gnat and heap size

Quote:

> We want to use large arrays (well, large for us: 10000 x 10000 complex
> numbers). We are using gnat 3.13p on Solaris 7. We run out of heap (heap
> exhausted) and have not yet found a way to increase it.

I have had a similar problem, and raised it on this NG a few months ago...

I was trying to store the past five years of the US stock markets in a big array.
I wanted to reload the data from disk "instantly", so I could do statistics easily.

The solution I came up with was to use an "mmap" call to map the data.  That way,
I could get an access value to data as big as the OS could give.  This was big
enough for my data (about 0.5GB).  Mapping the data back in was extremely quick.
Unfortunately, it makes the code a bit less portable, but that wasn't too bad.

But there were problems with mapping big records, and taking 'Size attributes.
Since Ada works in bits, the sizes can overflow a (signed) 32-bit representation.
This makes handling data in excess of 256MB tricky, since locations of
record elements are wrong, as are size attributes (e.g. a problem in generics).
I solved this by calculating the size of each element, multiplying by the
number of elements, and adding in each record component's size.  {*filter*}.
But it was 10-100 times the speed of more obvious solutions.

I suspect you have a 64-bit architecture, and *may* not hit the 'size problems
I had with GNAT.  If you stick to simple arrays, you might be OK. I was using
GNAT 3.12p on a '686 processor.

I don't know what these other guys seem so surprised at.  Doesn't everybody
else have 1280MB of RAM nowadays, too?  ;-)  It's only a few hundred bucks!
--
Adrian Wrigley

(by the way... what do you use such large matrices for?)



Sun, 14 Mar 2004 07:10:20 GMT  
 gnat and heap size

Quote:

> By my math, that's probably going to be using at least 800MB. Do you actually
> have that much RAM in your system? It would have to be one kick'n system to have
> nearly a gig of contiguous free RAM available on the heap...

Why would it have to be contiguous? If I understand virtual memory right,
on almost any modern system, it'd just have to be a contiguous set of
virtual pages, which could then be placed individually throughout real
memory and swap space wherever.

--

Pointless website: http://dvdeug.dhis.org
When the aliens come, when the deathrays hum, when the bombers bomb,
we'll still be freakin' friends. - "Freakin' Friends"



Sun, 14 Mar 2004 06:40:01 GMT  
 gnat and heap size

Quote:

> By my math, that's probably going to be using at least 800MB. Do you actually
> have that much RAM in your system? It would have to be one kick'n system to have
> nearly a gig of contiguous free RAM available on the heap...

What an extraordinarily peculiar comment :-)

We are talking about contiguous *virtual* memory, not
physical memory. This is 2001, not 1965.



Sun, 14 Mar 2004 10:12:09 GMT  
 gnat and heap size

Quote:

> By my math, that's probably going to be using at least 800MB. Do you actually
> have that much RAM in your system? It would have to be one kick'n system to have
> nearly a gig of contiguous free RAM available on the heap...

And by the way (this being 2001), machines with 1-2 gig
of real memory are common (my little portable notebook
has 512 meg ...)


Sun, 14 Mar 2004 10:13:37 GMT  
 gnat and heap size

Quote:
>I don't know what these other guys seem so surprised at.  Doesn't everybody
>else have 1280MB of RAM nowadays, too?  ;-)  It's only a few hundred bucks!

No. The availability of huge amounts of computing and storage resources is the
main reason for decreasing programmer and program designer skills, resulting
in ever-decreasing software quality. :-(

Still using NeXT stations, i386/32MB, ...



Sun, 14 Mar 2004 17:09:07 GMT  
 gnat and heap size
Must have missed the Solaris 7 part. Sorry. Not too familiar with how that
restricts memory use, but I seem to recall a similar situation quite a while
back in which there was one parameter that could be changed & a system
reboot performed that altered how much memory was available to a process.

My recollection was that Solaris did not have many memory management
parameters - VMS certainly had considerably more control over process quotas
on all sorts of resources.

MDC
--
Marin David Condic
Senior Software Engineer
Pace Micro Technology Americas    www.pacemicro.com
Enabling the digital revolution

Web:      http://www.mcondic.com/


Quote:
> Well, he did say Solaris 7. That narrows it down a bit. If it's one of those new
> Starcat servers, it might even have way more than that just in RAM alone....



Sun, 14 Mar 2004 21:04:59 GMT  
 gnat and heap size

says...

Quote:


>> By my math, that's probably going to be using at least 800MB. Do you actually
>> have that much RAM in your system? It would have to be one kick'n system to have
>> nearly a gig of contiguous free RAM available on the heap...

>And by the way (this being 2001), machines with 1-2 gig
>of real memory are common (my little portable notebook
>has 512 meg ...)

Well, I'm perhaps not as impressed with that as you intended. After 7 or so
years of reading your posts Robert, one of the (perhaps few) things I've learned
is that you keep your laptop systems pretty close to state of the art. You of
course will dispute that, but compared to the P75 laptop I have here at my desk,
you are practically in a separate historical age. I'd wager that your laptop is
probably better than any of the 9 systems I have access to. :-)

---
T.E.D.    homepage   - http://www.telepath.com/dennison/Ted/TED.html



Sun, 14 Mar 2004 21:29:59 GMT  
 gnat and heap size

says...

Quote:


>> By my math, that's probably going to be using at least 800MB. Do you actually
>> have that much RAM in your system? It would have to be one kick'n system to have
>> nearly a gig of contiguous free RAM available on the heap...

>What an extraordinarily peculiar comment :-)

>We are talking about contiguous *virtual* memory, not
>physical memory. This is 2001, not 1965.

Ahh yes, I had forgotten that. Realize that I do mostly RTOS work these days,
where we typically don't have virtual memory, even here in 2001. But since he
said "Solaris 7", there definitely would be virtual memory involved.

---
T.E.D.    homepage   - http://www.telepath.com/dennison/Ted/TED.html



Sun, 14 Mar 2004 21:36:05 GMT  
 gnat and heap size

Quote:

>Must have missed the Solaris 7 part. Sorry. Not too familiar with how that
>restricts memory use, but I seem to recall a similar situation quite a while
>back in which there was one parameter that could be changed & a system
>reboot performed that altered how much memory was available to a process.

>My recollection was that Solaris did not have many memory management
>parameters - VMS certainly had considerably more control over process quotas
>on all sorts of resources.

VMS also had a limit of about 1GB if I remember right. Its memory space was
something like 4GB, divided into 4 sections. At least that's what I remember
from 12 years ago or so when I was using it.

---
T.E.D.    homepage   - http://www.telepath.com/dennison/Ted/TED.html



Sun, 14 Mar 2004 21:39:42 GMT  
 gnat and heap size
The application is an electromagnetic simulation. Yes, we could use sparse
matrix techniques, but part of the program requires a full matrix. Perhaps
we are hitting a low-level limit (is that what Adrian is saying?).

Here is a test program.

        procedure Matrix_Test is
           type Matrix is array (Integer range <>, Integer range <>) of Float;
           type Matrix_Access is access Matrix;
           A : Matrix_Access := new Matrix (1 .. 9000, 1 .. 9000);
        begin
           for I in 1 .. 9000 loop
              A (I, I) := 1.0;
           end loop;
        end Matrix_Test;

Quote:


> > We want to use large arrays (well, large for us: 10000 x 10000 complex
> > numbers). We are using gnat 3.13p on Solaris 7. We run out of heap (heap
> > exhausted) and have not yet found a way to increase it.
> Since Ada works in bits, the sizes can overflow (signed) 32-bit
> representation. This makes handling data in excess of 256MB tricky,
> since locations of record elements are wrong, as are size attributes
> (eg a problem in generics). I solved this by calculating the sizes of
> each element, multiplying by the number of elements, adding in each
> record component's size.  {*filter*}. But it was 10-100 times the speed of
> more obvious solutions.

> I suspect you have a 64-bit architecture, and *may* not hit the 'size
> problems I had with GNAT.  If you stick to simple arrays, you might be
> OK. I was using GNAT 3.12p on a '686 processor.
> (by the way... what do you use such large matrices for?)

--

Defence Research Establishment Ottawa (DREO)    (613) 998-4901  FAX 998-2675
3701 Carling Avenue, Ottawa, Ontario  K1A 0Z4         http://www.*-*-*.com/


Sun, 14 Mar 2004 21:44:41 GMT  
 
 [ 43 posts ]  Go to page: [1] [2] [3]

 

 
Powered by phpBB® Forum Software