
testing – best way to measure (and refine) performance with PHP?

Posted by: admin July 12, 2020


A site I am working with is starting to get a little sluggish, and I would like to refine it. I think the problem is with the PHP, but I can’t be sure. How can I see how long functions are taking to perform?

Answers:

If you want to test the execution time:

    $startTime = microtime(true);
    // Your content to test
    $endTime = microtime(true);
    $elapsed = $endTime - $startTime;
    echo "Execution time: $elapsed seconds";


Try the profiler feature in XDebug or Zend Debugger?
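If you go the Xdebug route, the profiler is switched on from php.ini. The settings below are for Xdebug 2; in Xdebug 3 the option names changed to xdebug.mode and xdebug.output_dir:

```ini
; Xdebug 2.x: write a cachegrind profile file for every request
xdebug.profiler_enable = 1
xdebug.profiler_output_dir = /tmp

; Xdebug 3.x equivalent:
; xdebug.mode = profile
; xdebug.output_dir = /tmp
```

The resulting cachegrind files can then be opened in a viewer such as KCachegrind/QCachegrind to see per-function timings.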


Two things you can do.

The first is to place microtime() calls everywhere, although that is not convenient if you want to test more than one function. If you want to time many functions – which I assume you do – a simpler and better solution is to write a small timer class and use it everywhere, rather than scattering microtime() calls around your code.
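A minimal sketch of such a timer class might look like this (the Timer class and its method names are illustrative, not from any particular tutorial):

```php
<?php
// Minimal timing helper: call start()/stop() around the code you want to
// measure, then dump all accumulated timings at once with report().
class Timer
{
    private $timings = [];
    private $running = [];

    public function start($name)
    {
        $this->running[$name] = microtime(true);
    }

    public function stop($name)
    {
        $this->timings[$name] = microtime(true) - $this->running[$name];
    }

    public function report()
    {
        foreach ($this->timings as $name => $seconds) {
            printf("%s: %.4f seconds\n", $name, $seconds);
        }
    }
}

// Usage: time a block of work under a label
$timer = new Timer();
$timer->start('md5 loop');
for ($i = 0; $i < 10000; $i++) {
    md5($i);
}
$timer->stop('md5 loop');
$timer->report(); // prints something like: md5 loop: 0.0042 seconds
```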


The second thing you can do to optimize your script is to take a look at its memory usage. By observing the memory usage of your scripts, you may be able to optimize your code better.

PHP has a garbage collector and a pretty complex memory manager, so the amount of memory being used by your script can go up and down during the execution of a script. To get the current memory usage, we can use the memory_get_usage() function, and to get the highest amount of memory used at any point, we can use the memory_get_peak_usage() function.

    echo "Initial: ".memory_get_usage()." bytes \n";
    /* prints
    Initial: 361400 bytes
    */

    // let's use up some memory
    for ($i = 0; $i < 100000; $i++) {
        $array[] = md5($i);
    }

    // now let's remove the items again
    for ($i = 0; $i < 100000; $i++) {
        unset($array[$i]);
    }

    echo "Final: ".memory_get_usage()." bytes \n";
    /* prints
    Final: 885912 bytes
    */

    echo "Peak: ".memory_get_peak_usage()." bytes \n";
    /* prints
    Peak: 13687072 bytes
    */
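As a side note, both functions accept an optional $real_usage flag: false (the default) reports the memory the script itself is using, while true reports the total memory PHP has allocated from the system. A quick sketch:

```php
<?php
// false (default): memory used by the script via PHP's own allocator
// true: total memory actually allocated from the operating system
$used = memory_get_usage();
$real = memory_get_usage(true);
$peak = memory_get_peak_usage(true);
printf("used: %d, real: %d, peak: %d bytes\n", $used, $real, $peak);
```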




  • You can also do it manually, by recording microtime() values in various places, like this:

    // some code
    $query = "SELECT ...";
    $TIMER['before q'] = microtime(TRUE);
    $res = mysql_query($query);
    $TIMER['after q'] = microtime(TRUE);
    while ($row = mysql_fetch_array($res)) {
        // some code
    }
    $TIMER['array filled'] = microtime(TRUE);
    // some code
    // and so on

and then visualize it:

if ('' === $_SERVER['REMOTE_ADDR']) {  // put your own IP address here
  echo "<table border=1><tr><td>name</td><td>so far</td><td>delta</td><td>per cent</td></tr>";
  $start = $prev = reset($TIMER);
  $total = end($TIMER) - $start;
  foreach ($TIMER as $name => $value) {
    $sofar = round($value - $start, 4);
    $delta = round($value - $prev, 4);
    $percent = round($delta / $total * 100);
    $prev = $value;
    echo "<tr><td>$name</td><td>$sofar</td><td>$delta</td><td>$percent</td></tr>";
  }
  echo "</table>";
}

The IP address check means we can keep this profiling code on the working site: only requests coming from the developer's own IP address will see the timing table.

  • Though I doubt it’s PHP itself. Most likely it’s the database. So, pay the most attention to query execution timing.

  • However, “site” is a very broad term. It also includes JS, CSS, images and so on. So, I’d suggest starting from Firebug’s Net panel to see which part of the whole page takes the most time.

Of course, refining can be done only after analyzing the profiling results; it cannot be advised here without them.


Your best bet is Xdebug. I’m happy, as it comes bundled with my PhpED IDE. I can get profiler data at the click of a button.

So maybe you could consider that.


I had similar issues, so I created two new tables in the database and two new functions: one was audit_sql and the other was audit_code. Because I used an SQL abstraction class, it was easy to time every single SQL call (I used PHP’s microtime(), as others have suggested). So I called microtime() before and after each SQL call and stored the results in the database.
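That approach can be sketched as follows, assuming a PDO connection and an audit_sql table with query, seconds and logged_at columns (the table layout and the timed_query name are my own illustration, not the original code):

```php
<?php
// Wrap each query so its wall-clock time is recorded in an audit table.
// Assumes $pdo is an open PDO connection and an audit_sql table
// (query TEXT, seconds DOUBLE, logged_at TEXT) already exists.
function timed_query(PDO $pdo, $sql, array $params = [])
{
    $start = microtime(true);
    $stmt = $pdo->prepare($sql);
    $stmt->execute($params);
    $elapsed = microtime(true) - $start;

    // Store the timing alongside the query text for later analysis
    $audit = $pdo->prepare(
        'INSERT INTO audit_sql (query, seconds, logged_at) VALUES (?, ?, ?)'
    );
    $audit->execute([$sql, $elapsed, date('Y-m-d H:i:s')]);

    return $stmt;
}
```

Afterwards a simple SELECT on audit_sql, ordered by seconds, shows the slowest queries first.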

Similarly with pages: I called microtime() at the start and end of each page and, if necessary, at the start and end of functions, divs – whatever I thought might be a culprit.

The general results were:

  1. SQL calls to MySQL were almost instantaneous and were not a problem at all. The only thing I would say is that even I was surprised at the number being executed! The site is generated from the database – even the menus, permissions etc. To produce the home page, the SQL calls were measured in the 100s.

  2. PHP was not the culprit. This was even more instantaneous than MySQL.

  3. The culprit was…. (big build up!) calls to YouTube and Picasa and other sites like that. I host videos and photo albums on the site (well, I don’t actually store them – they are stored on YT etc.) and on the home page are thumbnails that are extracted from YouTube and the like via the YouTube PHP API/Zend Framework. Because this is all HTTP-based to the other sites, each one was taking 1, 2 or 3 seconds. This was causing the divs containing them to take between 6 and 12 seconds, and the home page up to 17 seconds.

The solution – store all thumbnails on my server. The first time, a thumbnail has to be served from the remote site (YT, Picasa etc.), so do that and then store it on your own site. From then on, check whether you already have it and, if so, always serve it from your own server. That cuts the page load time down to 2–3 seconds tops. Granted, the first person to view the home page after someone has added more videos/images will see a slow load, but not thereafter. People will put a one-off slow page load down to their connection or the internet in general. Too many slow loads of your site, though, and they will stop visiting!
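A rough sketch of that caching logic, with illustrative paths and parameter names (not the actual code from the site):

```php
<?php
// Serve a remote thumbnail from a local cache, fetching it once on a miss.
// $cacheDir and the URL handling are assumptions for illustration.
function cached_thumbnail($videoId, $remoteUrl, $cacheDir = 'cache/thumbs')
{
    // basename() keeps the id from escaping the cache directory
    $local = $cacheDir . '/' . basename($videoId) . '.jpg';

    if (!file_exists($local)) {
        // First request: fetch from the remote host (YouTube, Picasa, ...)
        if (!is_dir($cacheDir)) {
            mkdir($cacheDir, 0755, true);
        }
        $data = file_get_contents($remoteUrl); // slow, but happens only once
        if ($data !== false) {
            file_put_contents($local, $data);
        }
    }

    // Every later request is answered from our own server
    return $local;
}
```

The slow HTTP round-trip to the remote site is thus paid once per thumbnail instead of on every page view.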

I hope that helps somewhat.