database / website performance benchmarking 
 database / website performance benchmarking

Hi,

I am managing a number of different PHP and database-driven websites
on a couple of servers, and I would like to ask what tools are
available for monitoring how the websites behave under heavy load,
etc. Something like a program that could be run on a couple of client
machines, where each instance of the program would emulate a number of
simultaneous users. And then something on the server side that could
record the memory and CPU usage and plot a nice graph. Perhaps
something like top|gplot with the appropriate flags would work, but
that's just a wild idea. It would also be nice to have a graph showing
how long a page takes to load depending on the number of
simultaneous users.

I've seen many graphs like that, showing page load times depending on
the number of users, especially in the context of database benchmarks.
But how does one produce them? It would be really nice to slap one into
a PowerPoint at a management meeting; it really gives the impression
that things are under control...

Any and all ideas would be appreciated!!

// Mikael



Sun, 31 Jul 2005 18:57:44 GMT  
 database / website performance benchmarking
(Cross-posted to c.i.w.s.u and follow-ups set to that group)

Quote:

> I am managing a number of different PHP and database-driven websites
> on a couple of servers, and I would like to ask what tools are
> available for monitoring how the websites behave under heavy load,
> etc. Something like a program that could be run on a couple of client
> machines, where each instance of the program would emulate a number of
> simultaneous users. And then something on the server side that could
> record the memory and CPU usage and plot a nice graph. Perhaps
> something like top|gplot with the appropriate flags would work, but

If you have Apache, http://localhost/server-status could be useful for
general monitoring.
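
If it isn't enabled already, here's a minimal httpd.conf sketch
(assuming mod_status is compiled in or loaded; /server-status is just
the conventional path):

# ExtendedStatus adds per-request detail at a small cost
ExtendedStatus On
<Location /server-status>
    # hand requests for this URL to mod_status
    SetHandler server-status
</Location>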

For your question, I'd suggest a combination of wget on the clients to
spider through the site and generate a series of requests to a range
of pages, and a short shell script that does an infinite loop of

uptime | awk -F 'load average: ' '{print $2}' >> usage.stats ; sleep 20

to get the load data on the server.  You probably just want the load
average numbers.
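
For the wget side, something along these lines should generate the
request stream (a sketch; the URL and depth are placeholders):

# recursively fetch the site three levels deep, discarding the
# downloaded files as we go
wget -r -l 3 --delete-after http://www.example.com/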

You should be able to put the usage.stats data into a graphing package
with a little bit of text-processing in sed or perl.
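
Roughly like this, say (a sketch that assumes the usage.stats format
produced by the loop above, with gnuplot doing the plotting):

# strip the commas, number each 20-second sample, and plot the
# 1-minute load average over time
sed 's/,//g' usage.stats | awk '{print NR*20, $1}' > load.dat
gnuplot -persist -e "plot 'load.dat' with lines title '1-min load'"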

Quote:
> that's just a wild idea. It would also be nice to have a graph showing
> how long a page takes to load depending on the number of
> simultaneous users.

That will be more difficult, since simultaneous users are hard to
measure. If you count active Apache processes then you can get a
rough idea, though ("ps aux | grep -c [a]pache" should work; the [a]
is to stop the grep itself showing up). Measuring page load times is
also going to be tricky, though I *think* you can modify the server
config to log this.
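
With Apache, mod_log_config should do it: %T logs the number of
seconds taken to serve each request (Apache 2 also has %D for
microseconds). The log name and format here are just examples:

LogFormat "%h %l %u %t \"%r\" %>s %b %T" timed
CustomLog logs/timed_log timed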

--
Chris



Sun, 31 Jul 2005 19:46:32 GMT  
 database / website performance benchmarking

Quote:

> I am managing a number of different PHP and database-driven websites
> on a couple of servers, and I would like to ask what tools are
> available for monitoring how the websites behave under heavy load,
> etc. Something like a program that could be run on a couple of client
> machines, where each instance of the program would emulate a number of
> simultaneous users. And then something on the server side that could
> record the memory and CPU usage and plot a nice graph.

I'm not sure whether you've already looked into ApacheBench (or whether
you're even using Apache); your description seems very specific, as if
you're intentionally excluding it as a possibility because it doesn't
meet all of the requirements. If that was your intention, I apologize.
However, if you haven't looked into it, you may find it useful. It
doesn't produce pretty graphs or give you memory and CPU usage, but it
does report other useful information. It would be easy enough to write
a shell script that performs multiple tests and presents the results in
an easier-to-read fashion.
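
For instance, something like this (only a sketch; the URL, request
count and concurrency levels are placeholders) would collect the mean
response time at each concurrency level in a plot-ready file:

# run ab at increasing concurrency levels and record
# "concurrency  mean-ms-per-request" pairs
for c in 1 5 10 20 50 ; do
    ms=`ab -n 1000 -c $c http://localhost/phpinfo.php 2>/dev/null \
        | awk '/^Time per request:.*\(mean\)/ {print $4}'`
    echo "$c $ms" >> loadtimes.dat
done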

Following is sample output from a run I just did against my
phpinfo.php (-n is the number of requests, -c the number of
concurrent requests).

$ ab -n 1000 -c 10 http://localhost/phpinfo.php
This is ApacheBench, Version 1.3d <$Revision: 1.67 $> apache-1.3
Copyright (c) 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Copyright (c) 1998-2002 The Apache Software Foundation, http://www.apache.org/

Benchmarking localhost (be patient)
Completed 100 requests
<snip>
Finished 1000 requests

<snip server info>

Concurrency Level:      10
Time taken for tests:   10.757 seconds
Complete requests:      1000
Failed requests:        0
Broken pipe errors:     0
Total transferred:      32266512 bytes
HTML transferred:       32100357 bytes
Requests per second:    92.96 [#/sec] (mean)
Time per request:       107.57 [ms] (mean)
Time per request:       10.76 [ms] (mean, across all concurrent requests)
Transfer rate:          2999.58 [Kbytes/sec] received

Connnection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    13    7.2     12    36
Processing:    30    91   38.1     86   481
Waiting:       18    84   38.8     79   481
Total:         30   104   36.9     98   486

Percentage of the requests served within a certain time (ms)
  50%     98
  66%    102
  75%    104
  80%    107
  90%    113
  95%    130
  98%    162
  99%    380
 100%    486 (last request)
END ab output

HTH



Mon, 01 Aug 2005 07:02:19 GMT  
 database / website performance benchmarking

Quote:
> I am managing a number of different PHP and database-driven websites
> on a couple of servers, and I would like to ask what tools are
> available for monitoring how the websites behave under heavy load,
> etc. Something like a program that could be run on a couple of client
> machines, where each instance of the program would emulate a number of
> simultaneous users. And then something on the server side that could
> record the memory and CPU usage and plot a nice graph. Perhaps
> something like top|gplot with the appropriate flags would work, but
> that's just a wild idea. It would also be nice to have a graph showing
> how long a page takes to load depending on the number of
> simultaneous users.

> I've seen many graphs like that, showing page load times depending on
> the number of users, especially in the context of database benchmarks.
> But how does one produce them? It would be really nice to slap one into
> a PowerPoint at a management meeting; it really gives the impression
> that things are under control...

We use LoadRunner by Mercury Interactive for this. It really is great
for complex load-test scenarios and for producing management-compatible
reports and charts.
Unfortunately, it costs an arm and a leg.

joachim



Mon, 01 Aug 2005 07:31:22 GMT  
 database / website performance benchmarking
No, I had not heard about ApacheBench before, and it's pretty much
exactly what I was looking for. I've asked around a bit in other
places and got the same response, so it seems to be the "standard"
way to do this. Thanks! My very specific description was just my
guess at what I was looking for...

// Mikael

Quote:


> > I am managing a number of different PHP and database-driven websites
> > on a couple of servers, and I would like to ask what tools are
> > available for monitoring how the websites behave under heavy load,
> > etc. Something like a program that could be run on a couple of client
> > machines, where each instance of the program would emulate a number of
> > simultaneous users. And then something on the server side that could
> > record the memory and CPU usage and plot a nice graph.

> I'm not sure whether you've already looked into ApacheBench (or whether
> you're even using Apache); your description seems very specific, as if
> you're intentionally excluding it as a possibility because it doesn't
> meet all of the requirements. If that was your intention, I apologize.
> However, if you haven't looked into it, you may find it useful. It
> doesn't produce pretty graphs or give you memory and CPU usage, but it
> does report other useful information. It would be easy enough to write
> a shell script that performs multiple tests and presents the results in
> an easier-to-read fashion.

> <snip ab sample output>

> HTH



Mon, 01 Aug 2005 22:33:05 GMT  
 database / website performance benchmarking

Quote:

> > I am managing a number of different PHP and database-driven websites
> > on a couple of servers, and I would like to ask what tools are
> > available for monitoring how the websites behave under heavy load,
> > etc. Something like a program that could be run on a couple of client
> > machines, where each instance of the program would emulate a number of
> > simultaneous users. And then something on the server side that could
> > record the memory and CPU usage and plot a nice graph. Perhaps
> > something like top|gplot with the appropriate flags would work, but
> > that's just a wild idea. It would also be nice to have a graph showing
> > how long a page takes to load depending on the number of
> > simultaneous users.

> > I've seen many graphs like that, showing page load times depending on
> > the number of users, especially in the context of database benchmarks.
> > But how does one produce them? It would be really nice to slap one into
> > a PowerPoint at a management meeting; it really gives the impression
> > that things are under control...

> We use LoadRunner by Mercury Interactive for this. It really is great
> for complex load-test scenarios and for producing management-compatible
> reports and charts.
> Unfortunately, it costs an arm and a leg.

You may want to go to the forum at www.qaforums.com; most of the
industry-standard test tools are discussed there.

We use LoadRunner as well for generating load, but we use Quest SQLab
Vision in conjunction with LoadRunner to monitor databases, isolate
database bottlenecks, and produce reports.

Hope this helps.

Thanks, Mohit




Sat, 06 Aug 2005 01:16:13 GMT  
 