Optimizing PHP code via runtime benchmarking is straightforward: keep track of $start and $end times via microtime() around a code block. However, I am not looking for an answer that involves microtime() usage.
What I would like to do is measure the time it takes PHP to get prepared to run its code, i.e. the code-parsing/opcode-tree-building time. My reasoning is that while it is easy to just include() every class you might need on every page of a site, the CPU overhead can't be "free". I'd like to know how "expensive" parse time really is.
I am assuming that an opcode cache such as APC is not part of the scenario.
Would I be correct that measurement of parse time in PHP is something that would have to take place in mod_php?
EDIT: If possible, taking into account
$_SERVER['DOCUMENT_ROOT'] usage in code would be helpful. Command-line solutions might take a bit of tinkering to handle this (but would still be valuable answers).
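Regarding the EDIT: the CLI SAPI normally copies the environment into $_SERVER (with the default variables_order setting), so code that reads $_SERVER['DOCUMENT_ROOT'] can often be satisfied from the command line by exporting that variable first. A sketch, where the path and script name are hypothetical:

```shell
# Assumption: PHP's CLI SAPI populates $_SERVER from the environment
# (true with the default variables_order), so exporting DOCUMENT_ROOT
# makes $_SERVER['DOCUMENT_ROOT'] available to the script under test.
# The path and file name below are placeholders.
export DOCUMENT_ROOT=/var/www/example
time php test.php
```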
Yes, there is a way. You can make PHP parse a file without really executing it, using one of these two variants:

<?php return; rest of the code ?>
<?php whole code; $%^parse error@#! ?>

The first returns immediately after the whole file has been parsed; the second ends in deliberate garbage, so the whole file is parsed but compilation fails before anything runs.
Then compare the time it takes to run an empty script:
time php empty.php
with the time it takes to run (or fail to run) the regular script with one of the additions mentioned above:
time php test.php
I’ve used this method on large files: PHP 5.3 on a 2.4 GHz Core2Duo can parse between 1.5 and 4.5 MB of PHP code per second (it depends very much on the complexity of the code, of course).
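The steps above can be automated. A sketch that builds the empty baseline and the parse-only variant from a script; the file names are hypothetical, and the sed invocation assumes GNU sed:

```shell
# A tiny script standing in for real application code (placeholder).
cat > test.php <<'PHP'
<?php
echo "real work here";
PHP

# An empty script to measure PHP's fixed startup cost.
printf '<?php\n' > empty.php

# Inject `return;` right after the opening tag, so the whole file is
# parsed but nothing after the return executes (GNU sed `0,/re/` range).
sed '0,/<?php/s//<?php return;/' test.php > test-parse-only.php

# Parse cost is then roughly the difference between:
# time php empty.php
# time php test-parse-only.php
```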
For the detailed level of analysis you're seeking, implementing Xdebug (http://xdebug.org/) would give you the most information without having to chop your code into segments.
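If you go the Xdebug route, the profiler is switched on in php.ini; the setting names below are the Xdebug 2 ones (the generation current in the PHP 5.3 era), so treat this as a sketch and check your installed version's docs:

```ini
; Xdebug 2 profiler settings (sketch); output is written in
; cachegrind format, viewable with KCachegrind or similar tools.
xdebug.profiler_enable = 1
xdebug.profiler_output_dir = /tmp
```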
One method could be timing the execution of the script from the command line.
In Linux for example:
$ time php5 -r 'echo "Hello world";'
Hello world

real    0m1.565s
user    0m0.036s
sys     0m0.024s
Does that help at all? It may be useful for discovering relative times between different scripts, but it may not be indicative of the time taken when running through the Apache module.
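To make such relative comparisons less noisy, one could average several runs instead of eyeballing a single `time` invocation. A sketch relying on GNU date's nanosecond format (Linux-specific); the php5 command at the end is an assumption about your setup:

```shell
# Time a command several times and report the average wall-clock cost.
# Relies on GNU date's %N (nanoseconds), so this is Linux-specific.
avg_time() {
  runs=$1; shift
  total=0
  for _ in $(seq "$runs"); do
    start=$(date +%s%N)
    "$@" > /dev/null 2>&1
    end=$(date +%s%N)
    total=$(( total + end - start ))
  done
  echo "average: $(( total / runs / 1000000 )) ms"
}

# Example (assumes php5 is on PATH):
# avg_time 5 php5 -r 'echo "Hello world";'
```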