
javascript – Setting time interval in HTML5 server sent events

Posted by: admin, July 12, 2020

Questions:

I want to send regular updates from the server to the client. For that I used server-sent events. I am pasting the code below:

Client side

Getting server updates

<div id="result"></div>

<script>
if (typeof(EventSource) !== "undefined") {
    var source = new EventSource("demo_see.php");
    source.onmessage = function(event) {
        document.getElementById("result").innerHTML = event.data + "<br>";
    };
} else {
    document.getElementById("result").innerHTML = "Sorry, your browser does not support server-sent events...";
}
</script>
</body>
</html>

Server side

<?php
    header('Content-Type: text/event-stream');
    header('Cache-Control: no-cache');
    $x=rand(0,1000);
    echo "data:{$x}\n\n";
    flush();
?>

The code works fine, but it only sends an update every 3 seconds. I want to send updates at millisecond intervals. I tried sleep(1) after flush(), but that only increases the interval by a further second. Does anyone have an idea how I can accomplish this?

Also, can I send images using server-sent events?

Answers:

As discussed in the comments above, running a PHP script in an infinite loop with a sleep or a usleep is incorrect for two reasons:

  • The browser will not see any event data while that script is still running (presumably it waits for the connection to close first). I recall that early browser implementations of SSE allowed this, but it is no longer the case.
  • Even if it did work browser-side, you would still be faced with a PHP script that runs excessively long (until the php.ini timeout settings kick in). If this happens once or twice, it is OK. If a few thousand browsers simultaneously request the same SSE from your server, it will bring your server down.

The right way to do things is to have your PHP script respond with event-stream data and then terminate gracefully, as it normally would. Provide a retry value, in milliseconds, if you want to control when the browser tries again. Here is some sample code:

function yourEventData(&$retry)
{
    // Do your own stuff here and return your event data.
    // You may also want to set $retry (in milliseconds) so the
    // browser knows when to try again (instead of the default 3000 ms).
}

header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
header('Access-Control-Allow-Origin: *'); // optional

$data = yourEventData($retry);

echo "retry: {$retry}\n";
echo "data: {$data}\n\n";

As an answer to the original question this is a bit late, but nevertheless, in the interests of completeness:

What you get when you poll the server in this way is just data. What you do with it afterwards is entirely up to you. If you want to treat that data as an image and update an image displayed on your web page, you would simply do:

document.getElementById("imageID").src = "data:image/png;base64," + event.data; // inside the onmessage handler
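On the server side, one possible way to produce such a payload (just a sketch, with a made-up file name) is to base64-encode the image before writing it to the stream:

<?php
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

// "current.png" is a hypothetical file; replace it with your own image source.
$image = base64_encode(file_get_contents("current.png"));

// base64_encode() returns a single line, so it is safe to put on one data: line.
// The browser-side handler can assign event.data to an <img> src once it is
// prefixed with the data-URI header shown above.
echo "data: {$image}\n\n";
flush();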

So much for the principles. I have on occasion forgotten that retry has to be in milliseconds and ended up returning, for example, retry:5\n\n, which, much to my surprise, still worked. However, I would hesitate to use SSE to update a browser-side image at 100 ms intervals. A more typical usage would be along the following lines:

  • The user requests a job on the server. That job either gets queued behind other jobs or is likely to take quite a bit of time to execute (e.g. creating a PDF or an Excel spreadsheet and sending it back).
  • Instead of making the user wait with no feedback, and risking a timeout, you can fire up an SSE that tells the browser the ETA for the job to finish, with a retry value set so the browser knows when to look again for a result (a sketch of such a response follows this list).
  • The ETA is used to provide the user with some feedback.
  • At the end of the ETA the browser will look again (browsers do this automatically, so you need do nothing).
  • If for some reason the job is not completed by the server, it should indicate that in the event stream it returns, e.g. data: {"code":-1}\n\n, so browser-side code can deal with the situation gracefully.
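A minimal sketch of what such a status response could look like follows; the job_status() helper, the "job" parameter, and the JSON fields are purely illustrative, not from the original answer:

<?php
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

// job_status() is a hypothetical helper: it would return whether the job is
// finished and an estimated number of seconds remaining.
list($done, $eta) = job_status($_GET["job"]);

if ($done) {
    // Job finished: tell the browser-side code where to pick up the result.
    echo "data: " . json_encode(array("code" => 1, "url" => "result.php?job=" . $_GET["job"])) . "\n\n";
} else {
    // Not finished yet: report the ETA (using the {"code":-1} convention
    // mentioned above) and set retry, in milliseconds, so the browser
    // reconnects at roughly the right moment.
    echo "retry: " . ($eta * 1000) . "\n";
    echo "data: " . json_encode(array("code" => -1, "eta" => $eta)) . "\n\n";
}
flush();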

There are other usage scenarios: updating stock quotes, news headlines, etc. Updating images at 100 ms intervals feels (a purely personal view) like a misuse of the technology.

Answer:

The reason for this behavior (a message every 3 seconds) is explained here:

The browser attempts to reconnect to the source roughly 3 seconds after each connection is closed

So one way to get a message every 100 milliseconds is to change the reconnect time (in the PHP script):

echo "retry: 100\n\n";

This is not very elegant, though; a better approach would be an endless loop in PHP that sleeps for 100 milliseconds on each iteration. There is a good example here; just change sleep() to usleep() to support milliseconds:

while (1) {
    $x = rand(0, 1000);
    echo "data: {$x}\n\n";
    flush();
    usleep(100000); // 100,000 microseconds = 0.1 seconds (1,000,000 = 1 second)
}

Answer:

I believe that the accepted answer may be misleading. Although it answers the question correctly (how to set up a 1-second interval), it is not true that an infinite loop is a bad approach in general.

SSE is used to get updates from the server when there actually are updates, as opposed to Ajax polling, which constantly checks for updates at some interval even when there are none. This can be accomplished with an infinite loop that keeps the server-side script running all the time, constantly checking for updates and echoing them only when there are changes.

It is not true that:

The browser will not see any event data while that script is still running.

You can run the script on the server and still send updates to the browser without ending the script execution, like this:

while (true) {
  echo "data: test\n\n";
  ob_flush(); // flush PHP's output buffer...
  flush();    // ...then push it out to the client
  sleep(1);
}

Doing it by sending a retry parameter without an infinite loop will end the script and then start it again, end it, start it again, and so on. This is similar to Ajax polling: checking for updates even if there are none, which is not how SSE is intended to work. Of course there are some situations where this approach is appropriate, as listed in the accepted answer (for example, waiting for the server to create a PDF and notifying the client when it's done).

Using the infinite-loop technique will keep the script running on the server the whole time, so you should be careful with a lot of users, because you will have a script instance for each of them, and that could lead to server overload. On the other hand, the same issue can arise even in a simple scenario where you suddenly get a bunch of users on the website (without SSE), or if you were using WebSockets instead of SSE. Everything has its own limitations.

Another thing to be careful about is what you put in the loop. For example, I wouldn't recommend putting a database query in a loop that runs every second, because then you are also putting the database at risk of overload. I would suggest using some kind of cache (Redis, or even a simple text file) in this case.
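To make that concrete, here is a rough sketch (the file name, JSON shape, and one-second poll interval are all illustrative choices) of a loop that watches a plain text file used as a cheap cache and only pushes an event when its contents change:

<?php
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

set_time_limit(0);            // let the loop run past the default execution limit
$cache = "latest_update.txt"; // hypothetical cache file written by whatever produces updates
$last  = null;

while (true) {
    // Stop the loop once the browser has gone away, so the instance
    // does not linger on the server.
    if (connection_aborted()) {
        exit;
    }

    $current = @file_get_contents($cache);

    // Push an event only when the cached value has actually changed.
    if ($current !== false && $current !== $last) {
        $last = $current;
        echo "data: " . json_encode(array("result" => $current)) . "\n\n";
        @ob_flush();
        flush();
    }

    sleep(1); // check the cache once per second instead of hammering the database
}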

Answer:

SSE is an interesting technology, but one that comes with a choking side effect on implementations using an Apache/PHP backend.

When I first found out about SSE I got so excited that I replaced all my Ajax polling code with an SSE implementation. Only a few minutes after doing this I noticed my CPU usage went up to 99/100, and the fear that my server would soon be brought down forced me to revert back to the friendly old Ajax polling. I love PHP, and even though I knew SSE would work better on Node.js, I just wasn't ready to go that route yet!

After a period of critical thinking, I came up with an SSE Apache/PHP implementation that could work without literally choking my server to death.

I'm going to share my SSE server-side code with you; hopefully it helps someone overcome the challenges of implementing SSE with PHP.

<?php
/* This script fetches the latest posts in the news feed */
header("Content-Type: text/event-stream");
header("Cache-Control: no-cache");

// prevent direct access
if ( ! defined("ABSPATH") ) die("");

/* Push the current user's session data into the global space so
   we can release the session lock. */
$GLOBALS["exported_user_id"]  = user_id();
$GLOBALS["exported_user_tid"] = user_tid();

/* Now release the session lock, having exported the session data
   into the global space. If we don't do this, no other script will
   run for this user, causing the website to lag even when opened
   in a new tab. */
session_commit();

/* How long should this connection be maintained? We want to wait
   on the server long enough for an update, but holding the
   connection forever burns CPU resources. Depending on the server
   resources you have available, you can tweak this higher or lower.
   The higher it is, the closer your implementation stays to true SSE;
   the lower it is, the closer it gets to Ajax polling. However, a
   higher value burns more CPU, especially when there are more users
   on the website. */
$time_to_stay = strtotime("1 minute 30 seconds"); // absolute timestamp at which to stop

/* If the data we need was not sent, abort the connection. You can
   use this to bail out whenever a value required for the script to
   operate is not passed along. Typically SSE reconnects after
   3 seconds. */
if ( ! isset($_GET["id"]) ) {
    exit;
}

/* If "HTTP_LAST_EVENT_ID" is set, this is a continuation of a
   previously terminated run of the script. This matters if your SSE
   maintains state: the header gives you the last event ID sent. */
$last_postid = isset($_SERVER["HTTP_LAST_EVENT_ID"])
    ? intval($_SERVER["HTTP_LAST_EVENT_ID"])
    : intval($_GET["id"]);

/* Keep the connection active until there is data to send to the client. */
while (true) {
    /* You can assume this function performs some database operations
       to fetch the latest posts. */
    $data = fetch_newsfeed($last_postid);

    /* If the data is not empty, there are new posts to push to the
       client. */
    if ( ! empty(trim($data)) ) {
        /* With SSE it is my common practice to JSON-encode all data,
           because I have noticed that not doing so sometimes causes
           SSE to lose part of the data packet and deliver only a
           fraction of it to the client. That is bad here, since we
           are returning structured HTML data, and losing part of it
           would break the page when the data is inserted. */
        $data = json_encode(array("result" => $data));

        echo "id: $last_postid\n"; // this is the lastEventID
        echo "data: $data\n\n";    // our data
        /* Flush so we don't wait for the script to terminate;
           make sure the calls are in this order. */
        @ob_flush();
        flush();
    }

    // how much of the allotted time is left on this connection
    $time_left = intval(floor($time_to_stay) - time());
    /* If we have stayed longer than the allotted time, abort the
       connection to free up CPU resources. */
    if ($time_left <= 0) {
        exit;
    }

    /* Sleep 5 seconds and continue from the top. We don't want to
       keep pounding the DB in a tight loop. */
    sleep(5);
}