Apache ships with a great load-testing tool, ApacheBench (ab), that rivals the best out there. It lets you test with a configurable number of total requests and concurrent connections, and it gives understandable results – actually useful! Anyone who develops wants to know how much stress (how many visitors) their application can withstand and how it will perform under a real-world load.
My OS is Ubuntu (Natty atm), and apt makes it easy to install by just typing:
gmilby@mini64:~/Dropbox$ sudo apt-get install apache2-utils
1. Number of simulated connections – this simulates the total number of visits to your web app. If there are bugs or inefficient code, they will show up here as poor performance, the same way the app would slow down in real deployment when a lot of users hit your website.
2. Number of simulated concurrent users – the number of users hitting the application at exactly the same time. On a very low-traffic website, concurrency is almost zero, because the probability that two users make a request at exactly the same moment is very small. On high-traffic websites it becomes an issue: concurrent users can slow the application down and put a significant load on the web server. Some web hosts even limit the number of concurrent connections (typically on free hosting plans).
Giving the average developer access to a tool like this makes them aware of potential problems early, and leaves enough time to research alternative coding approaches. It can also shape the development itself: if you start with a huge framework and realize how much bulk is being carried along just to deploy a small app, it may be best to scale down and look for a more practical approach. Knowing how your site performs at every stage can save valuable time and embarrassment.
ab -n [number of simulated connections] -c [number of simulated concurrent users] [url]
ab -n 3000 -c 3 http://mall.syrbot.com:81/ > main-mall-results.txt (redirecting to a file is not required, but I like to keep a copy of my test results to refer to)
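Beyond -n and -c, a few other ab options are worth knowing. This is just a cheat-sheet of invocations (the URL is my test server from above – substitute your own), not commands from the original run:

```shell
# Reuse connections with HTTP KeepAlive instead of opening a new one per request
ab -k -n 3000 -c 3 http://mall.syrbot.com:81/

# Run for a fixed amount of time (30 seconds) instead of a fixed request count
ab -t 30 -c 3 http://mall.syrbot.com:81/

# Write a CSV of response-time percentiles, handy for spreadsheets
ab -n 3000 -c 3 -e percentiles.csv http://mall.syrbot.com:81/

# Write gnuplot-friendly data for graphing each request's timing
ab -n 3000 -c 3 -g timings.tsv http://mall.syrbot.com:81/
```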
This is ApacheBench, Version 2.3
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/
Benchmarking mall.syrbot.com (be patient)
Server Software: nginx/0.7.67
Server Hostname: mall.syrbot.com
Server Port: 81
Document Path: /
Document Length: 16447 bytes
Concurrency Level: 3
Time taken for tests: 160.446 seconds
Complete requests: 3000
Failed requests: 0
Write errors: 0
Total transferred: 49821000 bytes
HTML transferred: 49341000 bytes
Requests per second: 18.70 [#/sec] (mean)
Time per request: 160.446 [ms] (mean)
Time per request: 53.482 [ms] (mean, across all concurrent requests)
Transfer rate: 303.24 [Kbytes/sec] received
Connection Times (ms)
min mean[+/-sd] median max
Connect: 39 53 95.0 49 3051
Processing: 81 108 88.1 107 4670
Waiting: 41 53 10.5 53 335
Total: 125 160 129.9 157 4720
Percentage of the requests served within a certain time (ms)
100% 4720 (longest request)
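It's worth seeing how ab derives its summary numbers from the raw counts. Using the figures from the run above, a quick awk check reproduces the requests-per-second and both "Time per request" lines:

```shell
# Sanity-check ab's derived metrics from the raw counts in the report above:
#   3000 complete requests, 160.446 s total, concurrency 3
awk 'BEGIN {
  n = 3000       # complete requests
  t = 160.446    # time taken for tests (seconds)
  c = 3          # concurrency level
  printf "req/sec: %.2f\n", n / t                      # requests per second (mean)
  printf "ms/req (mean): %.3f\n", c * t * 1000 / n     # time per request, per simulated user
  printf "ms/req (across all): %.3f\n", t * 1000 / n   # time per request, across all concurrent requests
}'
```

Note that "Time per request (mean)" equals the total wall-clock time only by coincidence of this run's numbers: it is concurrency × total time ÷ requests, expressed in milliseconds.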
These results show that the longest render of the website took 4720 milliseconds, which is not bad considering it is a bottom-of-the-line VPS pushing a ton of jQuery, Python, and CSS for each page render (all 3000 requests!). The median was 157 milliseconds – so my choice of NGINX to serve, with bjoern as a pure-C WSGI server behind it, turned out to be a good one this time.
You can bet I will be using this from here on out. It helps me gauge whether new technologies are reliable – it's easier to trust a technology when you know the load it can stand.