Possible Bug: Multiple DBM Files at once 

OK, I give up. I thought I understood this language, but I have the
following problem. I stumbled over it in Perl 5.003 and Perl 5.004,
using either NDBM or DB databases, on a Sun Ultra 1 running
Solaris 2.5.1.

Basically I do the following thing:

tie %firsthash, "DB_File", "list of entries", O_CREAT|O_RDWR, 0664;

foreach $i (keys %firsthash)
{
        tie %secondhash, "DB_File", "$DATADIR/$i", O_CREAT|O_RDWR, 0664;
        [ modify some fields in %secondhash ]
        untie %secondhash;
}

untie %firsthash;

Although I have to iterate over about 2000 database files, I should
never have more than two open at once.

But after about 60 ties (DB) or 30 ties (NDBM) I run into "Too many
open files". Reproducible. And fuser tells me that my program does
indeed have 60+ files open, with a ulimit of 64. So the computer is
right (annoyingly :-( )

According to the man page, untie should destroy the database binding
and close the file(s). Why doesn't it? What am I doing wrong?
Or is this really a bug in Perl?

Is there a way to circumvent this limitation?
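A quick way to check where the descriptors go is to count them directly
around each tie/untie pair. The following is only a sketch of mine, not
code from this posting: it assumes a Linux-style /proc filesystem, so on
Solaris 2.5.1 you would use `pfiles <pid>` or fuser from the shell instead.

```perl
#!/usr/bin/perl -w
# Sketch only: count this process's open file descriptors via
# /proc/$$/fd. This assumes a Linux-style /proc; on Solaris use
# `pfiles <pid>` or `fuser` from the shell instead.
use strict;

sub open_fds {
    opendir(my $dh, "/proc/$$/fd") or die "cannot read /proc/$$/fd: $!";
    my $n = grep { /^\d+$/ } readdir $dh;   # numeric entries only
    closedir $dh;
    return $n - 1;       # don't count the fd opendir itself holds
}

my $before = open_fds();
# ... tie %secondhash, modify it, untie %secondhash here ...
my $after = open_fds();
print "descriptors leaked by this iteration: ", $after - $before, "\n";
```

If the number printed grows from iteration to iteration, untie is
demonstrably not releasing the underlying descriptors.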

        Kind regards
                Henning

[ P.S.: Please also reply via e-mail, as I am not a regular reader
        of c.l.p.m. ]

--

Regionalkontakt Nuernberg Kommunikationsnetz Franken e.V.



Tue, 07 Dec 1999 03:00:00 GMT  
 Possible Bug: Multiple DBM Files at once

Quote:

> OK, I give up. I thought I understood this language, but I have the
> following problem. I stumbled over it in Perl 5.003 and Perl 5.004,
> using either NDBM or DB databases, on a Sun Ultra 1 running
> Solaris 2.5.1.

> Basically I do the following thing:

> tie %firsthash, "DB_File", "list of entries", O_CREAT|O_RDWR, 0664;

> foreach $i (keys %firsthash)
> {
>         tie %secondhash, "DB_File", "$DATADIR/$i", O_CREAT|O_RDWR, 0664;
>         [ modify some fields in %secondhash ]
>         untie %secondhash;
> }
> untie %firsthash;

> Although I have to iterate over about 2000 database files, I should
> never have more than two open at once.

> But after about 60 ties (DB) or 30 ties (NDBM) I run into "Too many
> open files". Reproducible. And fuser tells me that my program does
> indeed have 60+ files open, with a ulimit of 64. So the computer is
> right (annoyingly :-( )

Henning,

I have come across a similar problem in my work.  I posted it to
c.l.p.m. about six months ago and have yet to receive a response.  I
store 0 to 200 records daily in my DBM database, and have had to use
Perl 4.036 to do it correctly.  I am also running Perl 5.003 under
Solaris 2.5.1 on a Sun Ultra 1 system.  I have tried tie/untie with
each of the five DBM modules listed in the latest Camel book, and each
gives me the same problem: after about 25-30 records are processed,
every tie and system call returns the error "Too many open files".

Due to compliance directives from on high, we are to remove Perl 4.036
from our system in the near future.

Has anyone responded yet to your request (by e-mail, perhaps)?  If not,
would someone out there PLEASE look into this?  It is definitely a
repeatable and consistent error on my platform.

A snippet of code for reference:

use Fcntl;
use NDBM_File;

[snip ... building hashes here]
[foreach record]

    tie( %DB_OBJECT, "NDBM_File", "$pdis_db/$ctrl",
         O_RDWR | O_CREAT | O_EXCL, 0644
       ) or die "tie $ctrl: $!";

    %DB_OBJECT = %text_obj;

    untie( %DB_OBJECT );
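If the leak really is inside the DBM layer, one blunt but reliable
workaround is to do each tie/untie in a forked child: whatever
descriptors the library fails to close are reclaimed by the kernel when
the child exits. The sketch below is my suggestion, not something tested
on the poster's setup; SDBM_File and a temp directory stand in for
NDBM_File and $pdis_db only so the example is self-contained, and the
pattern is identical for NDBM_File or DB_File.

```perl
#!/usr/bin/perl -w
# Workaround sketch: run each tie/untie in a forked child so any
# descriptor the DBM layer fails to close is reclaimed by the kernel
# at child exit. SDBM_File is used here only because it ships with
# core Perl; the same pattern applies to NDBM_File and DB_File.
use strict;
use Fcntl;
use SDBM_File;
use File::Temp qw(tempdir);
use POSIX qw(_exit);

my $dir = tempdir(CLEANUP => 1);   # stands in for $pdis_db
for my $i (1 .. 5) {               # stands in for the per-record loop
    defined(my $pid = fork) or die "fork: $!";
    if ($pid == 0) {               # child: do the possibly leaky work
        tie(my %db, 'SDBM_File', "$dir/file-$i",
            O_RDWR | O_CREAT, 0644) or die "tie file-$i: $!";
        $db{'pid'}  = $$;
        $db{'time'} = time;
        untie %db;
        _exit(0);   # skip END blocks; the kernel closes every fd here
    }
    waitpid($pid, 0);              # parent: wait, then next record
    die "child for file-$i failed\n" if $? != 0;
}
print "processed 5 files; no descriptors accumulate in the parent\n";
```

The fork-per-record cost is noticeable over thousands of files, but the
parent's descriptor table stays flat no matter what the library leaks.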

Thanks in advance.

--


EDS/GM Truck Engineering             (248) 753-4739 (8-238)



Sat, 11 Dec 1999 03:00:00 GMT  
 Possible Bug: Multiple DBM Files at once



Quote:
>> Basically I do the following thing:

>> tie %firsthash, "DB_File", "list of entries", O_CREAT|O_RDWR, 0664;

>> foreach $i (keys %firsthash)
>> {
>>         tie %secondhash, "DB_File", "$DATADIR/$i", O_CREAT|O_RDWR, 0664;
>>         [ modify some fields in %secondhash ]
>>         untie %secondhash;
>> }
>> untie %firsthash;

>> Although I have to iterate over about 2000 database files, I should
>> never have more than two open at once.

>> But after about 60 ties (DB) or 30 ties (NDBM) I run into "Too many
>> open files". Reproducible. And fuser tells me that my program does
>> indeed have 60+ files open, with a ulimit of 64. So the computer is
>> right (annoyingly :-( )

>Henning,

>I have come across a similar problem in my work.  I posted the problem
>about 6 months ago to c.l.p.m. and have yet to receive a response.  I
>store 0 to 200 records daily to my DBM database, and have had to use
>perl 4.036 to do it correctly.  I am also running Perl 5.003, using
>Solaris 2.5.1 on a Sun Ultra 1 system.  I have tried using the tie/untie
>with each of the five DBM modules listed in the latest Camel book, and
>each gives me the same problem.  After about 25-30 records are
>processed, all tie commands and system calls return an error "Too many
>open files".

>Due to compliance directives from on high, we are to remove Perl 4.036
>from our system in the near future.

>Has anyone responded yet to your request (by e-mail, perhaps)?  If not,
>would someone out there PLEASE look into this.  It is definitely a
>repeatable and consistent error on my platform.

How does this chunk of code work for you?  I'm using perl 5.004_01 on
Linux (kernel 2.0.30):

$ ulimit -a
core file size (blocks)  1000000
data seg size (kbytes)   unlimited
file size (blocks)       unlimited
max memory size (kbytes) unlimited
stack size (kbytes)      8192
cpu time (seconds)       unlimited
max user processes       256
pipe size (512 bytes)    8
open files               256
virtual memory (kbytes)  2105343
$ cat try.pl
#!/usr/local/bin/perl -w

use DB_File;
use Fcntl;

tie %i, 'DB_File', "file-0", O_CREAT|O_RDWR, 0664;
$i{'pid'} = $$;
$i{'time'} = time;
for ($i = 1; $i < 5000; $i++) {
  unless (tie %h, 'DB_File', "file-$i", O_CREAT|O_RDWR, 0664) {
    die "failed at $i\n";
  }
  $h{'test'} = 'set';
  $h{'pid'} = $$;
  $h{'time'} = time;
  untie %h;

}

untie %i;

__END__
$ time ./try.pl
6.93user 19.30system 7:35.21elapsed 5%CPU (0avgtext+0avgdata 0maxresident)k
0inputs+0outputs (20300major+209minor)pagefaults 0swaps

That seems like reasonable behaviour to me; if it were leaking fds it
should have died at some point...
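The "it would have died" argument can be tightened into a direct
measurement: record the descriptor count before the loop and fail on the
first iteration where it grows. This is a sketch of mine under stated
assumptions: /proc/$$/fd is Linux-specific, and SDBM_File (which ships
with core Perl) stands in for DB_File.

```perl
#!/usr/bin/perl -w
# Sketch: fail fast on the first leaked descriptor instead of waiting
# for "Too many open files". Assumes a Linux /proc filesystem;
# SDBM_File stands in for DB_File since it ships with core Perl.
use strict;
use Fcntl;
use SDBM_File;
use File::Temp qw(tempdir);

sub fd_count {
    opendir(my $dh, "/proc/$$/fd") or die "no /proc/$$/fd: $!";
    my $n = grep { /^\d+$/ } readdir $dh;
    closedir $dh;
    return $n - 1;                 # ignore opendir's own descriptor
}

my $dir      = tempdir(CLEANUP => 1);
my $baseline = fd_count();
for my $i (1 .. 200) {
    tie(my %h, 'SDBM_File', "$dir/file-$i",
        O_CREAT | O_RDWR, 0664) or die "tie failed at $i: $!";
    $h{'test'} = 'set';
    untie %h;
    die "leaked a descriptor at iteration $i\n"
        if fd_count() > $baseline;
}
print "no descriptor leak after 200 tie/untie cycles\n";
```

On a build where untie closes properly this runs to completion; on the
broken 5.003/5.004 DBM layers described above it should die within the
first few dozen iterations rather than at the ulimit.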

Mike

--




Sat, 11 Dec 1999 03:00:00 GMT  
 