Performance of Perl scripts vs. compiled C code
 Performance of Perl scripts vs. compiled C code

I am about to start a project where I will use a web server as the interface to
a database.  I will be running Linux as my OS.

My question is: which is faster and has lower overhead, having 20-30 users
hit the Perl script or 20-30 users executing the compiled C program?  The Perl
script or compiled C program would be a database search engine that outputs
HTML code.

These 20-30 users would be submitting forms and then running either the Perl
or C code.

I plan to have 5 copies of the C program so that different users would access
different programs for load balancing.  My hope with this method is that I
could theoretically get 5-10 users per program copy * 5 copies = up to 50
simultaneous users executing the search program and searching the database.
Or, can Linux really handle 50 instances of one program running? (maybe I
should ask this question in unix.wizards *grin*)

If I went the Perl route, would there be a lot of system overhead for each
user that executes the Perl script?

I am new to the arena of Linux/Unix and CGI.  I have done a lot of research in
the area and have seen the various programs and scripts that I can model my
CGI script on.  At this point, I need some direction as to which path I should
be concentrating on... either Perl scripts or a compiled C program.

TIA,
Brandon



Tue, 18 Nov 1997 03:00:00 GMT  
 Performance of Perl scripts vs. compiled C code

Quote:

>I am about to start a project where I will use the web server to interface a
>database.  I will be running Linux as my OS.
>My question is, which is faster and has a lower overhead, having 20-30 users
>hit the perl script or 20-30 users executing the compiled C program?

If you're good at writing C, you can beat Perl in execution speed.  But it
takes a lot longer to develop.  Is the extra development time worth the
speed difference?  Are you sure the speed of the search engine dominates
the response time of your server?

Quote:
>The perl script or compiled C program would be a database search engine
>that outputs HTML code.
>These 20-30 users would be submitting forms and then running either the perl
>or C code.
>I plan to have 5 copies of the C program so that different users would access
>the different programs for load balancing.  My hope with this method is that I
>could theoretically get 5-10 users per program copy * 5 copies = up to 50
>simultaneous users executing the search program and searching the database.
>Or, can linux really handle 50 versions of one program running? (maybe i
>should ask this question in unix.wizards *grin*  )

I'd be surprised if Linux couldn't handle a measly 50 processes,
especially if they're all running the same code.

Quote:
>If I went the Perl route, would there be a lot of system overhead for each
>user that executes the perl script?

I'd assume Linux is clever enough to share the Perl interpreter code among
those 50 processes.

Quote:
>I am new in the arena of linux/unix and CGI.  I have done a lot of research in
>the area and have seen the various programs and scripts that I can model my
>CGI script.  At this point, I need some direction as to which path I should be
>concentrating on...either perl scripts or C compiled program.

If I were you, I'd write a prototype in Perl.  Tweak it until it has all
the features you want, and then see if it's fast enough.  If so, you're done.
Otherwise, your choice is between rewriting it in C or buying a faster machine.
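A Perl prototype along those lines can be very short.  The sketch below is
illustrative only: the form field `term`, the stand-in record list, and the
helper subs `url_decode` and `search_to_html` are invented for the example,
not part of any standard API.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Decode one URL-encoded form value ("+" and %XX escapes).
sub url_decode {
    my ($s) = @_;
    $s =~ tr/+/ /;
    $s =~ s/%([0-9A-Fa-f]{2})/chr(hex($1))/ge;
    return $s;
}

# Build an HTML list of every record containing $term.
sub search_to_html {
    my ($term, @records) = @_;
    my $html = "<ul>\n";
    for my $rec (@records) {
        $html .= "<li>$rec</li>\n" if index($rec, $term) >= 0;
    }
    return $html . "</ul>\n";
}

# CGI entry point: parse QUERY_STRING, search, print the page.
my %form;
for my $pair (split /&/, $ENV{QUERY_STRING} || '') {
    my ($k, $v) = split /=/, $pair, 2;
    $form{$k} = url_decode(defined $v ? $v : '');
}
my @records = ("apples 12", "oranges 7", "apple pie 3");  # stand-in data
print "Content-type: text/html\n\n";
print "<html><body>\n",
      search_to_html($form{term} || '', @records),
      "</body></html>\n";
```

Once this works, any piece that profiling shows to be slow is a candidate for
a C rewrite; chances are none of it will be.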

--
Hope this helps,

HansM



Sat, 22 Nov 1997 03:00:00 GMT  
 Performance of Perl scripts vs. compiled C code

: I am about to start a project where I will use the web server to interface a
: database.  I will be running Linux as my OS.

: My question is, which is faster and has a lower overhead, having 20-30 users
: hit the perl script or 20-30 users executing the compiled C program?
With very little information, I would guess that it depends on the
following two things: disk access and the search method.  The actual
language will probably make less of a difference than these two
things.

Obviously, the less disk access you perform, the faster the program
will be.  Toward this end, you should use a caching mechanism of some
kind.  Writing your own is probably a bad idea, since there are many
freely available.  I've recently come to like mSQL, which is written
in C and also has a perl4 and perl5 API.  Of course, dbm files in
either C or Perl may also be efficient and appropriate.
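As a rough sketch of the dbm-as-cache idea in Perl (the file name
"cachedb" and the `expensive_search` routine are placeholders for your
real data and query code):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Tie a hash to an on-disk dbm file; repeated queries are then served
# from the file instead of redoing the search every time.
my %cache;
dbmopen(%cache, "cachedb", 0644) or die "dbmopen: $!";

# Stand-in for the real (slow) database search.
sub expensive_search {
    my ($key) = @_;
    return "result for $key";
}

sub lookup {
    my ($key) = @_;
    return $cache{$key} if exists $cache{$key};  # cache hit
    my $result = expensive_search($key);         # cache miss: do the work
    $cache{$key} = $result;                      # remember for next time
    return $result;
}

print lookup("foo"), "\n";   # first call: computed and cached
print lookup("foo"), "\n";   # second call: read back from the dbm file
dbmclose(%cache);
```

Because the cache lives in a file, it also survives across the separate CGI
processes that each form submission spawns.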

The search method is also critical.  mSQL will provide you with a
search method for the data.  Perl has excellent facilities for
scanning and modifying text, better than you could code in C
quickly.

Processing the text (input and output) can be done efficiently with perl.
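For instance, a couple of regexes handle both the field scanning and the
HTML-quoting in a few lines.  This is just an illustrative sketch; the record
format and the `html_escape` helper are made up for the example.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Escape the three characters that are special in HTML ("&" first,
# so the generated entities don't themselves get re-escaped).
sub html_escape {
    my ($s) = @_;
    $s =~ s/&/&amp;/g;
    $s =~ s/</&lt;/g;
    $s =~ s/>/&gt;/g;
    return $s;
}

# Pull two fields out of a record with one regex, then emit a table row.
my $record = "name=<Smith & Sons>; qty=42";
if ($record =~ /name=(.*?); qty=(\d+)/) {
    printf "<tr><td>%s</td><td>%d</td></tr>\n", html_escape($1), $2;
}
```

Doing the same quoting and field extraction in C means hand-rolled string
walking or lex/yacc; in Perl it is two substitutions and a match.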

In short, it's your algorithm that's important, not the language you
write it in.  Chances are that you will want to use associative arrays
somewhere for efficiency, and if this is the case, Perl will clearly
win.
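A small example of the associative-array approach: build a hash index over
the records once, and each query becomes a single lookup instead of a scan.
The record format and field values here are invented for illustration.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Build a hash index over the records once; after that, each query is
# a single associative-array lookup rather than a scan of every record.
my @records = (
    "1001:smith:accounting",
    "1002:jones:shipping",
    "1003:nguyen:engineering",
);

my %by_id;
for my $rec (@records) {
    my ($id) = split /:/, $rec;    # first colon-separated field is the key
    $by_id{$id} = $rec;
}

print $by_id{"1002"}, "\n";   # one hash lookup instead of a linear search
```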

--peter

--

=== PETER KOCKS =====================================================



Tue, 25 Nov 1997 03:00:00 GMT  
 
 [ 3 posts ] 


 

 