
performance – Most expensive operations in PHP?

Posted by: admin July 12, 2020


What are some of the most expensive operations in PHP? I know things like overusing the @ operator can be expensive. What else would you consider?

Answers:
  • serialize() is slow, as is eval(), create_function(), and spawning additional processes via system() and related functions.
  • beware of anything APC can’t cache — conditional includes, eval()ed code, etc.
  • Opening database connections. Always cache your connections and re-use them.
  • Object cloning
  • Regular expressions. Always use the normal string operations over a regular expression operation if you don’t need the functionality of a regexp, e.g. use str_replace() over preg_replace() where possible.
  • Logging and disk writes can be slow – eliminate unnecessary logging and file operations
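The str_replace() point can be sketched like this (a minimal example, not a benchmark; the sample string is made up):

```php
<?php
// For a literal substring replacement, str_replace() produces the same
// result as preg_replace() without invoking the regex engine.
$subject = 'Hello world, hello PHP';

$viaStr  = str_replace('hello', 'goodbye', $subject);
$viaPreg = preg_replace('/hello/', 'goodbye', $subject);

echo $viaStr, PHP_EOL; // identical to $viaPreg
```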

Some micro-optimizations that are good practice, but won’t make much difference to your bottom line performance:

  • Using echo is faster than print
  • Concatenating variables is faster than using them inline in a double-quoted string.
  • Using echo with a list of arguments is faster than concatenating the arguments. Example: echo 'How are you ',$name,' I am fine ',$var1 is faster than echo 'How are you '.$name.' I am fine '.$var1
  • Develop with Notices and Warnings turned on. Making sure they don’t get triggered saves PHP from having to run error control on them.
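The echo-with-arguments point looks like this in practice (a minimal sketch; $name and $var1 are placeholder values):

```php
<?php
$name = 'Alice';
$var1 = 'thanks';

// echo with a comma-separated argument list writes each piece out
// directly, with no intermediate concatenated string.
echo 'How are you ', $name, ' I am fine ', $var1, PHP_EOL;

// The concatenation form builds one temporary string first, then echoes it.
$line = 'How are you ' . $name . ' I am fine ' . $var1;
echo $line, PHP_EOL;
```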


Rather than trying to figure out potential areas that are slow, use a profiling tool. Installing xDebug was probably one of the easiest and best things I’ve done to improve the code I write. Install with WinCacheGrind (or the correct version for your OS) for best results.
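For reference, a hypothetical php.ini fragment enabling the Xdebug profiler (Xdebug 3 settings; the extension path and output directory are examples, adjust for your setup):

```ini
; Load Xdebug and switch it into profiling mode.
zend_extension=xdebug.so
xdebug.mode=profile
; Cachegrind output files land here, ready for WinCacheGrind/KCachegrind.
xdebug.output_dir=/tmp/cachegrind
```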


The interpolated form

 "Hello $name"

is slower than the concatenated form

'Hello ' . $name

Also, __get(), __set(), __call(), etc. are slow.

And if performance matters that much to you, you can use the optimized data structures from the SPL.
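For instance, SplFixedArray (one of the SPL structures) is a fixed-size, integer-indexed array that is typically leaner than a plain PHP array; a minimal sketch:

```php
<?php
// SplFixedArray: fixed size, integer keys only -- less overhead than
// a general-purpose PHP array when the size is known up front.
$fixed = new SplFixedArray(3);
$fixed[0] = 'a';
$fixed[1] = 'b';
$fixed[2] = 'c';

echo $fixed->getSize(), PHP_EOL; // 3
echo implode(',', $fixed->toArray()), PHP_EOL;
```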


Anything that goes through a network connection, like calling a web service, will generally take more time than doing the same operation locally.

(Even if it doesn’t cost much CPU, it’ll cost time)


I’d say SQL queries inside loops. Such as this:

foreach ($db->query('SELECT * FROM categories') as $cat)
    foreach ($db->query('SELECT * FROM items WHERE cat_id = ' . $cat['cat_id']) as $item)

Which, for the record, could be shortened into something like this:

$sql = 'SELECT c.*, i.*
          FROM categories c
     LEFT JOIN items i USING (cat_id)
      ORDER BY c.cat_order';

foreach ($db->query($sql) as $row)


curl_exec() is very slow, compared to typical operations. Also, most str_* operations are faster than regex operations.


  • json_encode() is faster than serialize()
  • Concatenating in a loop is faster than implode()
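The json_encode()/serialize() comparison can be sketched like this (the sample data is made up):

```php
<?php
$data = ['id' => 1, 'tags' => ['php', 'performance']];

// json_encode() is generally faster and produces a more compact,
// portable payload than serialize().
$json = json_encode($data);
$ser  = serialize($data);

// Both round-trip the same structure.
echo strlen($json), ' vs ', strlen($ser), ' bytes', PHP_EOL;
```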

People may think that @ is expensive only because that claim is so widespread on the web.

Quoting from http://www.php.net/manual/en/language.operators.errorcontrol.php#102543:

If you’re wondering what the performance impact of using the @
operator is, consider this example. Here, the second script (using
the @ operator) takes 1.75x as long to execute…almost double the
time of the first script.

So while yes, there is some overhead, per iteration, we see that the @
operator added only .005 ms per call. Not reason enough, imho, to
avoid using the @ operator.

real 0m7.617s user 0m6.788s sys 0m0.792s


real 0m13.333s user 0m12.437s sys 0m0.836s

You can hardly “overuse” an operator, and it is often worth the cost if it performs the operation you want.
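If you want to measure this yourself, here is a rough micro-benchmark sketch (timings will vary by machine and PHP version; the point is only that the per-call cost of @ is tiny):

```php
<?php
$n = 100000;
$a = ['key' => 1];

// Baseline: plain array access guarded by isset().
$t0 = microtime(true);
for ($i = 0; $i < $n; $i++) {
    $x = isset($a['key']) ? $a['key'] : null;
}
$plain = microtime(true) - $t0;

// Same access under the @ error-suppression operator.
$t0 = microtime(true);
for ($i = 0; $i < $n; $i++) {
    $x = @$a['key'];
}
$suppressed = microtime(true) - $t0;

printf("plain: %.4fs, with @: %.4fs\n", $plain, $suppressed);
```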


foreach() statements, especially nested ones, are frequently expensive; though that’s as much the fault of my naive (and occasionally poorly planned) approach to programming as it is PHP’s.

Though I think that’s also true of JS and other languages, so it’s almost certainly my fault. =/


From my own experience, the most expensive operation in real terms is the echo statement. Try to join all strings together before outputting them to the browser. That’s followed by database calls, especially joins!

Code can also sometimes get a 10x performance increase simply by refactoring your algorithms and data structures. Take any program and try to halve its length; then ask, can you halve it again?


uniqid() is stupidly expensive. Don’t use it to generate lots of unique identifiers.
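If you need lots of identifiers, random_bytes() (PHP 7+) is a stronger alternative, since uniqid() is time-based and can collide; a minimal sketch:

```php
<?php
// 16 random bytes rendered as 32 hex characters -- a collision-resistant
// identifier, unlike the timestamp-derived output of uniqid().
$id = bin2hex(random_bytes(16));

echo $id, PHP_EOL;
```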