
Benchmarking Performance of node.js (cluster) with mysql pools : Lighttpd + PHP?

Posted by: admin July 12, 2020


Edit(2): Now using db-mysql with the generic-pool module. The error rate has dropped significantly and hovers around 13%, but throughput is still around 100 req/sec.

Edit(1): After someone suggested that ORDER BY RAND() would make MySQL slow, I removed that clause from the query. Node.js now hovers around 100 req/sec, but the server still reports ‘CONNECTION error: Too many connections’.

Node.js or Lighttpd with PHP?

You probably saw many “Hello World” benchmarks of node.js… but “hello world” tests, even those that were delayed by 2 seconds per request, are not even close to real world production usage. I also performed those variations of the “Hello World” test using node.js and saw a throughput of about 800 req/sec with a 0.01% error rate. However, I decided to run some tests that were a bit more realistic.

Maybe my tests are not complete; most likely something is REALLY wrong with node.js or my test code, so if you’re a node.js expert, please help me write some better tests. My results are published below. I used Apache JMeter to do the testing.

Test Case and System Specs

The test is pretty simple: a MySQL query selects all users, ordered randomly. The first user’s username is retrieved and displayed. The MySQL database connection is made through a Unix socket. The OS is FreeBSD 8+, with 8 GB of RAM and an Intel Xeon quad-core 2.x GHz processor. I tuned the Lighttpd configuration a bit before I even came across node.js.

Apache JMeter Settings

Number of threads (users): 5000 (I believe this is the number of concurrent connections)

Ramp-up period (in seconds): 1

Loop count: 10 (this is the number of requests per user)

Apache JMeter End Results

Label                  | # Samples | Average  | Min   | Max      | Std. Dev. | Error % | Throughput | KB/sec | Avg. Bytes
HTTP Requests Lighttpd | 49918     | 2060ms   | 29ms  | 84790ms  | 5524      | 19.47%  | 583.3/sec  | 211.79 | 371.8
HTTP Requests Node.js  | 13767     | 106569ms | 295ms | 292311ms | 91764     | 78.86%  | 44.6/sec   | 79.16  | 1816

Result Conclusions

Node.js was so bad I had to stop the test early. [Fixed: tested completely]

Node.js reports “CONNECTION error: Too many connections” on the server. [Fixed]

Most of the time, Lighttpd had a throughput of about 1200 req/sec.

However, node.js had a throughput of about 29 req/sec. [Fixed: now at 100 req/sec]

This is the code I used for node.js (using MySQL pools):

var cluster = require('cluster'),
    http = require('http'),
    mysql = require('db-mysql'),
    generic_pool = require('generic-pool');

var pool = generic_pool.Pool({
    name: 'mysql',
    max: 10,
    create: function(callback) {
        new mysql.Database({
            socket: "/tmp/mysql.sock",
            user: 'root',
            password: 'password',
            database: 'v3edb2011'
        }).connect(function(err, server) {
            callback(err, this);
        });
    },
    destroy: function(db) {
        db.disconnect();
    }
});

var server = http.createServer(function(request, response) {
    response.writeHead(200, {"Content-Type": "text/html"});
    pool.acquire(function(err, db) {
        if (err) {
            return response.end("CONNECTION error: " + err);
        }
        db.query('SELECT * FROM tb_users').execute(function(err, rows, columns) {
            pool.release(db); // return the connection to the pool
            if (err) {
                return response.end("QUERY ERROR: " + err);
            }
            response.write(rows.length + ' ROWS found using node.js<br />');
            response.end(rows[0].username); // first user's username
        });
    });
});

cluster(server)
    .set('workers', 5)
    .listen(8080); // port not shown in the original snippet

This is the code I used for PHP (Lighttpd + FastCGI):

<?php
  $conn = new mysqli('localhost', 'root', 'password', 'v3edb2011');
  if ($conn) {
    $result = $conn->query('SELECT * FROM tb_users ORDER BY RAND()');
    if ($result) {
      echo ($result->num_rows).' ROWS found using Lighttpd + PHP (FastCGI)<br />';
      // display the first user's username
      $row = $result->fetch_assoc();
      echo $row['username'];
    } else {
      echo 'Error : DB Query';
    }
  } else {
    echo 'Error : DB Connection';
  }
Answers:

This is a bad benchmark comparison. In node.js you’re selecting the whole table and putting it in an array. In PHP you’re only fetching the first row. So the bigger your table is, the slower node will look. If you made PHP use mysqli_fetch_all it would be a similar comparison. While db-mysql is supposed to be fast, it’s not very full featured and lacks the ability to make this a fair comparison. Using a different node.js module like node-mysql-libmysqlclient should allow you to process only the first row.
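A plain-JavaScript sketch of the asymmetry described above (stand-in arrays, no real driver — the names `fetchAll` and `fetchFirst` are illustrative): the Node code materializes every row, the PHP code touches only the first, so the Node side’s cost grows with table size:

```javascript
// Stand-in for a query result set; no real database involved.
var table = [];
for (var i = 0; i < 100000; i++) {
    table.push({ username: 'user' + i });
}

// What the node.js code effectively does: pull every row into an array.
function fetchAll(rows) {
    return rows.slice();
}

// What the PHP code effectively does: read just the first row.
function fetchFirst(rows) {
    return rows.slice(0, 1);
}

console.log(fetchAll(table).length);   // 100000 rows materialized
console.log(fetchFirst(table).length); // 1 row materialized
```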


100 connections is the default setting for MySQL maximum number of connections.

So somehow your connections aren’t being reused across requests. Probably you already have one query running on each connection.

Maybe the node.js MySQL library you are using won’t queue queries on the same MySQL connection, but instead tries to open another connection and fails.
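A minimal plain-JavaScript sketch of that failure mode (this is an illustrative toy pool, not the actual generic-pool/db-mysql code): if acquired connections are never released, the pool’s slots run out and every further request errors — the same shape as “Too many connections”:

```javascript
// Illustrative toy pool: callers that never release() what they acquire()
// exhaust the cap, and every later request fails.
function Pool(max) {
    this.max = max;
    this.inUse = 0;
}
Pool.prototype.acquire = function(callback) {
    if (this.inUse >= this.max) {
        return callback(new Error('Too many connections'));
    }
    this.inUse++;
    callback(null, { id: this.inUse }); // stand-in for a MySQL connection
};
Pool.prototype.release = function() {
    this.inUse--;
};

var pool = new Pool(10);
var errors = 0;

// Simulate 15 concurrent requests that never release their connection:
for (var i = 0; i < 15; i++) {
    pool.acquire(function(err, conn) {
        if (err) errors++;
        // BUG (as in the benchmark): pool.release() is never called here
    });
}
console.log(errors + ' of 15 requests failed'); // 5 fail once the 10 slots are gone
```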


Correct me if I’m wrong, but I feel like you are overlooking something: Node uses a single process to handle every request (handling them through events, still in the same process), while PHP gets a new process (thread) for every request.

The problem with this is that the one Node process sticks to one core of the CPU, while PHP gets to scale across all four cores through multi-threading. I would say that with a quad-core 2.x GHz processor, PHP would definitely have a significant advantage over Node simply by being able to utilize the extra resources.
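A small self-contained sketch of that single-process behaviour — everything, including timers, waits on the one thread while CPU-bound work runs:

```javascript
// One Node process runs JavaScript on a single thread/core. The timer
// below is due in 10 ms, but it cannot fire until the synchronous
// CPU-bound loop finishes and hands control back to the event loop.
var timerFired = false;
setTimeout(function() {
    timerFired = true;
}, 10);

var x = 0;
for (var i = 0; i < 1e8; i++) {
    x += i; // pure CPU work, hogging the one core
}

// The 10 ms deadline has long passed, yet the callback still has not run:
console.log('timer fired during the loop? ' + timerFired); // → false
```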

There is another discussion giving some information about how to scale Node over multiple cores, but that has to be done explicitly through coding. Again, correct me if I’m wrong, but I don’t see any such code in the example above.

I’m pretty new to Node myself, but I hope this helps you improve your test 🙂


Have you enabled APC with PHP?

Can you try to enable persistent connections with PHP?

$conn = new mysqli('p:localhost', 'root', 'password', 'v3edb2011');


Aren’t you using 10 maximum MySQL connections in Node.js, and 5000 maximum MySQL connections via PHP?

While you run your tests on either system, I would take a look at MySQL’s “SHOW FULL PROCESSLIST”.


One thing to consider is the driver – database performance can be very tied to the specific driver you are using. The most popular MySQL driver, and the one that is most actively maintained, is https://github.com/felixge/node-mysql. You might get different results with that.

But if you are stuck at 100 connections, it sounds like connections are not being properly closed. I would add a console.log statement in the pool’s destroy handler to make sure it is really executing.


This is a bad benchmark; it should have been a simple “hello world”, like the thousands of benchmarks that prove node.js is the fastest “hello world” server of all time 😀