I could say that my question is related to PHP, but what I’m really concerned with is the proper programming logic for situations where a function’s execution can go on indefinitely.
What is the proper way of monitoring the time it takes to execute a function, and how do I stop that execution and move on with the rest of the program?
OK, I know that there is, for example, a set_time_limit() function, but it raises a fatal error, and I don’t want that; I want my code to simply continue after x seconds, or perhaps to throw an exception once the time limit is exceeded, catch it, and do something else.
Is writing some kind of a “watchdog” function the solution, and how is that done?
Thank you for any help you can provide: any link or article that addresses this problem the way it “should” be done.
PHP doesn’t provide a general way to time out a function. But many components where this problem is common let you define a timeout.
The HTTP stream wrapper allows you to specify a timeout option:
file_get_contents('http://example.com', false, stream_context_create(array('http' => array('timeout' => 10 /* seconds */))));
PDO (the database abstraction layer) allows you to set a timeout using the PDO::ATTR_TIMEOUT attribute (note that this attribute may mean different things with different database drivers):
$pdo->setAttribute(PDO::ATTR_TIMEOUT, 10 /* seconds */);
You can set a connection timeout when using FTP:
$ftp = ftp_connect('example.com', 21, 10 /* seconds */);
Similarly, all other extensions that access potentially remote resources provide such timeout parameters or options.
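For instance, the cURL extension exposes timeout options for both the connection phase and the whole transfer. A minimal sketch (the URL is just a placeholder):

```php
<?php
// Limit the connection attempt and the whole transfer separately;
// curl_exec() returns false and curl_error() describes the timeout.
$ch = curl_init('http://example.com');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);  // seconds to establish the connection
curl_setopt($ch, CURLOPT_TIMEOUT, 10);        // seconds for the entire request
$body = curl_exec($ch);
if ($body === false) {
    echo 'Request failed: ' . curl_error($ch);
}
curl_close($ch);
```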
In PHP in particular, I don’t think you have a valid way to control the execution time of your functions from outside of them.
Like you said, things like set_time_limit() will throw a Fatal Error and kill your script.
I would suggest measuring your execution time from within your functions, using something like microtime(), and throwing an exception if the time limit is exceeded; an exception that you’ll be able to catch outside and then act on accordingly.
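A minimal sketch of that pattern (the function and exception names here are made up for illustration): the long-running function checks the elapsed time on each loop iteration and throws once the deadline has passed, so the caller can catch and carry on.

```php
<?php
// Hypothetical exception type for the timeout case.
class TimeLimitExceededException extends RuntimeException {}

// Example long-running task that polices its own deadline.
function processItems(array $items, float $maxSeconds)
{
    $start = microtime(true); // high-resolution timestamp in seconds
    $done  = array();
    foreach ($items as $item) {
        if (microtime(true) - $start > $maxSeconds) {
            throw new TimeLimitExceededException(
                'Time limit of ' . $maxSeconds . 's exceeded'
            );
        }
        $done[] = $item * 2; // stand-in for the real work
    }
    return $done;
}

try {
    $result = processItems(range(1, 1000000), 2.0);
} catch (TimeLimitExceededException $e) {
    // The function gave up; continue with the rest of the program.
    error_log($e->getMessage());
}
```

Note that this only works if the function checks the clock often enough; a single blocking call (a slow query, a network read) between checks can still overrun the limit, which is why the per-component timeouts from the other answer remain important.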