How do I limit my number of PHP hits per second? – PHP

Posted by: admin February 22, 2020

Q(Question):

I want to protect my site from someone with a fast connection hammering it. It’s not denial-of-service attacks, but offline downloaders (which of course don’t identify themselves as offline downloaders in the user agent, so I can’t filter them by that). My main issue is that my site is PHP, so if they hammer it, all the PHP files get executed and that overwhelms the CPU. I’d like to be able to refuse requests after a certain number of hits per second on my index.php.

I can’t find how to do that. Can it be done in PHP, .htaccess, etc.?

Any ideas?

A(Answer):

"Nu" <[email protected]
viestissä:uE********************@bgtnsc05-news.ops.worldnet.att.net…

>I want to protect my site from someone with a fast connection hammering it. It’s not denial-of-service attacks, but offline downloaders

Even though I have not dealt with this specific issue, I want to help by asking these questions:

1) What info do offline downloaders bring to phpinfo()?

A(Answer):

Nu <[email protected]:

I want to protect my site from someone with a fast connection hammering it. […] I’d like to be able to refuse requests after a certain number of hits per second on my index.php. I can’t find how to do that. Can it be done in PHP, .htaccess, etc.?

I’d say this would have to be done at server level; anything in PHP would still need/eat quite some resources.

May I suggest you ask this on alt.apache.configuration?

Rik Wasmus

A(Answer):

Nu wrote:

I want to protect my site from someone with a fast connection hammering it. […] I’d like to be able to refuse requests after a certain number of hits per second on my index.php. Can it be done in PHP, .htaccess, etc.? Any ideas?

Can’t be done. You cannot control what other people on the web
do. You can only control how you react.

Any measure you take against the dishonest folks, you also take
against the honest ones. To that end, there are services out
there who will gladly charge you thousands of dollars to sell
you service packages for several thousand per month. And some
of those might even help to track down your abusive user.

But your best bet is to just make sure you have capacity to
handle peak loads, and that overloaded systems throttle down
gracefully.

A(Answer):

Rik wrote:

I’d say this would have to be done at server level; anything in PHP would still need/eat quite some resources.

Personally, I do it by primarily serving up static HTML pages,
instead of PHP. I reserve PHP for active content and such.

You can still get hammered, but the PHP system isn’t going wild.
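If restructuring the whole site as static HTML isn’t practical, a crude approximation is to cache index.php’s output to a file and serve that for a short while. A minimal sketch, assuming a writable cache location (the path and the 60-second lifetime here are made up):

<?php
$cacheFile = '/tmp/index_cache.html'; // hypothetical location
$cacheTtl  = 60;                      // seconds to keep serving the cached copy

// Serve the cached copy if it is still fresh - no includes, no MySQL.
if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $cacheTtl) {
    readfile($cacheFile);
    exit;
}

// Otherwise build the page normally, capturing the output.
ob_start();

// ... the usual index.php work: includes, MySQL queries, HTML output ...

$html = ob_get_contents();
ob_end_flush();                       // still send the page to this visitor
file_put_contents($cacheFile, $html); // keep a copy for the next requests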

A(Answer):

Sanders Kaufman" <bu***@kaufman.netwrote in message
news:yW******************@newssvr27.news.prodigy.n et…

But your best bet is to just make sure you have capacity to handle peak loads, and that overloaded systems throttle down gracefully.

Actually, my site goes to index.php and then index.php digs around in other
PHPs and MySQL. If I stop it right at index.php, I can keep my account from
overloading the CPU.

A(Answer):

"P Pulkkinen" <pe*************************@POISTATAMA.elisanet.f iwrote in
message news:2N****************@reader1.news.saunalahti.fi …

"Nu" <[email protected]
viestissä:uE********************@bgtnsc05-news.ops.worldnet.att.net…

I want to protect myself from if someone with a fast connection hammers

my

site. It’s not denial of service attacks, but offline downloaders

Even I have not dealt with this specific issue, I want help by asking

these

questions:

1) What info offline downloaders bring to phpinfo():

I don’t understand that question.

A(Answer):

"Nu" <[email protected]
viestissä:uE********************@bgtnsc05-news.ops.worldnet.att.net…

>I want to protect my site from someone with a fast connection hammering it. It’s not denial-of-service attacks, but offline downloaders

Sorry if I misunderstand or miss something. I understood that you mean people who use some batch process to fetch the _output_ of your script, perhaps automatically on a timed basis, but not _download_ it in the sense of ftp/scp.

1) Can you use $_SERVER["REMOTE_ADDR"] to tell downloaders apart?

2) Does it really matter if they are online or offline, if the POINT is
that some people (or machines) execute your index.php or other script
_too_often_?

3) How about this scenario:
You have two database tables:
DOWNLOADS
– download_id
– filepath
– remote_ip
– timestamp
TROUBLEMAKERS
– remote_ip
– filepath

In the END of every script execution you add an entry to the downloads table. You also check whether that filepath/remote_ip combination has become bad enough to be inserted into the troublemakers table. You use some mathematics to define the characteristics of an evil downloader.

In the BEGINNING of every script, you query the troublemakers table, and if the current filepath/remote_ip combination is there, stop execution immediately.

The downside here is that MySQL traffic increases, even if PHP traffic decreases. If there were a way to check evil filepath/remote_ip combinations on the Apache side, the troublemakers table could of course be replaced with a troublemakers file, or a file with that data inside that Apache could use directly.
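A rough PHP/MySQL sketch of that scenario, assuming a mysqli connection in $db, the two tables above (with a unique key on remote_ip/filepath), and an arbitrary “bad enough” rule of more than 30 hits in the last 60 seconds:

<?php
// At the BEGINNING of the script: refuse known troublemakers.
$ip   = $_SERVER['REMOTE_ADDR'];
$path = $_SERVER['SCRIPT_NAME'];

$stmt = $db->prepare('SELECT 1 FROM TROUBLEMAKERS WHERE remote_ip = ? AND filepath = ?');
$stmt->bind_param('ss', $ip, $path);
$stmt->execute();
$stmt->store_result();
if ($stmt->num_rows > 0) {
    exit('Too many requests.');   // stop immediately
}
$stmt->close();

// ... normal page processing ...

// At the END of the script: log this hit.
$stmt = $db->prepare('INSERT INTO DOWNLOADS (filepath, remote_ip, timestamp) VALUES (?, ?, NOW())');
$stmt->bind_param('ss', $path, $ip);
$stmt->execute();

// Promote to TROUBLEMAKERS if this ip/path pair was hit "too often" recently.
$stmt = $db->prepare('SELECT COUNT(*) FROM DOWNLOADS WHERE remote_ip = ? AND filepath = ?
                      AND timestamp > NOW() - INTERVAL 60 SECOND');
$stmt->bind_param('ss', $ip, $path);
$stmt->execute();
$stmt->bind_result($hits);
$stmt->fetch();
$stmt->close();

if ($hits > 30) {
    $stmt = $db->prepare('INSERT IGNORE INTO TROUBLEMAKERS (remote_ip, filepath) VALUES (?, ?)');
    $stmt->bind_param('ss', $ip, $path);
    $stmt->execute();
}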

A(Answer):

Nu wrote:

Sanders Kaufman" <bu***@kaufman.netwrote in message

>But your best bet is to just make sure you have capacity to
handle peak loads, and that overloaded systems throttle down
gracefully.

Actually, my site goes to index.php and then index.php digs around in other
PHPs and MySQL. If I stop it right at index.php, I can keep my account from
overloading the CPU.

In that case – you just have to choose one or more methods among
the several (labor-intensive) ones out there.

You can exit based on IPs – but they can be spoofed. You can
exit based on other headers – but they can be spoofed, too.

This is why developers talk so much about "scalability". If
your site isn’t designed to handle peak loads, and to exit
gracefully during overload – all of the other measures won’t help.

That’s usually an OK design flaw behind a firewall, but not out
in open water.

A(Answer):

"Sanders Kaufman" <bu***@kaufman.netwrote in message
news:yp*******************@newssvr27.news.prodigy. net…

This is why developers talk so much about "scalability". If your site isn’t designed to handle peak loads, and to exit gracefully during overload – all of the other measures won’t help.

That’s usually an OK design flaw behind a firewall, but not out in open water.

So how do I handle peak loads and exit gracefully during overloads?

Basically something like X hits per 10 seconds to index.php sounds simple
enough. I can’t find out how to do that, though.

A(Answer):

I am trying to limit how often index.php gets run. Index.php calls lots of other stuff. I want (even in index.php) something where, if it’s run too often within so many seconds, it just stops, and that’s enough for now. It’s not about a complicated IP-tracking thing, just a simple thing.
"P Pulkkinen" <pe*************************@POISTATAMA.elisanet.f iwrote in
message news:tc****************@reader1.news.saunalahti.fi …

"Nu" <[email protected]
viestissä:uE********************@bgtnsc05-news.ops.worldnet.att.net…

I want to protect myself from if someone with a fast connection hammers

my

site. It’s not denial of service attacks, but offline downloaders

Sorry, if I misunderstand or miss something. I understood that you mean
persons that use some batch to fetch the _output_ of your script, perhaps
automaticly on timely basis. But not _download_ it in sense of ftp/scp.

1) Can you use $_SERVER["REMOTE_ADDR"] to identify downloaders from each
other?

2) Does it really matter if they are online or offline, if the POINT is
that some people (or machines) execute your index.php or other script
_too_often_?

3) How about this scenario:
You have two database tables:
DOWNLOADS
– download_id
– filepath
– remote_ip
– timestamp
TROUBLEMAKERS
– remote_ip
– filepath

In the END of every script execution you add an entry to downloads table.
You also check, if that filepath/remote_id-combination has become bad

enough

to be inserted into troublemakers table. You use some mathematics to

define

characteristics of being evil downloader.

In the BEGINNING of every script, you make a database query to

troublemakers

table and if current filepath/remote_id-combination is there, stop the
execution immediately.

Downside here is that mysql traffic increases, even php traffic may
decrease. If they was a way to check evil filepath/remote_id-combinations

in

apache side, of course troublemakers table could be replaced with
troublemakers-file as well or a file that would be apache magik with that
data inside.

A(Answer):

Nu wrote:

"Sanders Kaufman" <bu***@kaufman.netwrote in message

>That’s usually an OK design flaw behind a firewall, but not out
in open water.

So how do I handle peak loads and exit gracefully during overloads?

Basically something like X hits per 10 seconds to index.php sounds simple
enough. I can’t find out how to do that, though.

Now THAT is a question a coder can answer!!!
There are several approaches.

I would use a timestamp/hitcount $_SESSION[] variable to track
their usage.

Then, each session will be aware of how often its client is
hitting you – aborting the connection (but not the session!)
when they’re outside of your desired frequency.

Me, personally, I wouldn’t abort the connection. I’d put them
to sleep. There’s a sleep() function in PHP that will let you
pause the processing for a period of time. (You might want to
build a wrapper around it for your own sleepy purposes.)

This will also force bots/agents into throttling down their requests. Since the connection isn’t broken, they won’t issue a zillion connection requests. They’ll just think you’ve got one seriously bogged down machine.

It might even trick them into thinking they DoS’d you – when in
fact, you DoS’d them.

You can’t force people to behave any certain way on the web –
but you can trick their software!

Rule #1 of dealing with coders – don’t ask *them* for the spec.
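A minimal sketch of that idea – the window, limit and sleep values here are arbitrary, and as the follow-ups below point out, it only works for clients that actually keep the session cookie:

<?php
session_start();

$window  = 10;  // seconds
$maxHits = 20;  // allowed hits per window
$penalty = 5;   // seconds to sleep when over the limit

// Start a fresh window when the old one has expired.
if (!isset($_SESSION['win_start']) || time() - $_SESSION['win_start'] > $window) {
    $_SESSION['win_start'] = time();
    $_SESSION['hits'] = 0;
}

$_SESSION['hits']++;

if ($_SESSION['hits'] > $maxHits) {
    // Don't drop the connection - just make the client wait.
    session_write_close();   // release the session lock while sleeping
    sleep($penalty);
}

// ... rest of index.php ...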

A(Answer):

$_SESSION[] is pretty much dependent on cookies, right?

"Sanders Kaufman" <bu***@kaufman.netwrote in message
news:Ek***************@newssvr17.news.prodigy.net. ..

Nu wrote:

"Sanders Kaufman" <bu***@kaufman.netwrote in message

That’s usually an OK design flaw behind a firewall, but not out
in open water.

So how do I handle peak loads and exit gracefully during overloads?

Basically something like X hits per 10 seconds to index.php sounds

simple

enough. I can’t find out how to do that, though.

Now THAT is a question a coder can answer!!!
There are several approaches.

I would use a timestamp/hitcount $_SESSION[] variable to track
their usage.

Then, each session will be aware of how often its client is
hitting you – aborting the connection (but not the session!)
when they’re outside of your desired frequency.

Me, personally, I wouldn’t abort the connection. I’d put them
to sleep. There’s a sleep() function in PHP that will let you
pause the processing for a period of time. (You might want to
build a wrapper around it for your own sleepy purposes.)

This will also force bot/agents into throttling down their
requests. Since the connection isn’t broken, they won’t issue a
zillion connection requests. They’ll just thing you’ve got one
seriously bogged down machine.

It might even trick them into thinking they DoS’d you – when in
fact, you DoS’d them.

You can’t force people to behave any certain way on the web –
but you can trick their software!

Rule #1 of dealing with coders – don’t ask *them* for the spec.

A(Answer):

Nu wrote:

>I’d like to be able to refuse requests after a certain number of hits per second on my index.php. I can’t find how to do that. Can it be done in PHP, .htaccess, etc.?

You can use a database for it, but it is only a partial solution.
Create a MySQL table ‘requests’ with these fields:
remote_addr varchar(20)
http_via varchar(100)
http_forwarded varchar(100)
http_x_forwarded_for varchar(100)
x_http_forwarded_for varchar(100)
x_forwarded_for varchar(100)
nexttime datetime

The field names correspond to the uppercase HTTP header fields, except the last one. You cannot get all of these; only remote_addr is always available. At the beginning of your script you try to read these fields as $_SERVER["REMOTE_ADDR"], $_SERVER["HTTP_VIA"] etc.
Then you search for a record in the table where all fields are the same. If you find one, you check whether the current time is equal to or greater than the value stored in the nexttime field.
If the current time is less than the stored value, you can show some error message or redirect to www.microsoft.com 🙂
If the current time is equal or greater, you display the requested page.

At the end of your script you must
1) update the nexttime field (store the current time plus some addition before the user can access the page again) if you found a record at the beginning of the script,

2) or create a new record if you did not find one at the beginning.

Petr Vileta, Czech republic
(My server rejects all messages from Yahoo and Hotmail. Send me your mail
from another non-spammer site please.)
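A rough sketch of that scheme, assuming a mysqli connection in $db and the ‘requests’ table above (only some of the header fields are shown, and the 2-second delay is an arbitrary choice):

<?php
// Collect the identifying headers (most will simply be empty).
$ip  = $_SERVER['REMOTE_ADDR'];
$via = isset($_SERVER['HTTP_VIA']) ? $_SERVER['HTTP_VIA'] : '';
$fwd = isset($_SERVER['HTTP_FORWARDED']) ? $_SERVER['HTTP_FORWARDED'] : '';
$xff = isset($_SERVER['HTTP_X_FORWARDED_FOR']) ? $_SERVER['HTTP_X_FORWARDED_FOR'] : '';

// Look for an existing record with exactly these values.
$stmt = $db->prepare('SELECT nexttime FROM requests WHERE remote_addr = ?
                      AND http_via = ? AND http_forwarded = ? AND http_x_forwarded_for = ?');
$stmt->bind_param('ssss', $ip, $via, $fwd, $xff);
$stmt->execute();
$stmt->bind_result($nexttime);
$found = $stmt->fetch();
$stmt->close();

// Too early? Show an error (or redirect somewhere harmless) and stop.
if ($found && time() < strtotime($nexttime)) {
    exit('Please slow down.');
}

// ... display the requested page ...

// Record when this client may come back (now + 2 seconds, say).
if ($found) {
    $stmt = $db->prepare('UPDATE requests SET nexttime = NOW() + INTERVAL 2 SECOND
                          WHERE remote_addr = ? AND http_via = ?
                          AND http_forwarded = ? AND http_x_forwarded_for = ?');
} else {
    $stmt = $db->prepare('INSERT INTO requests
                          (remote_addr, http_via, http_forwarded, http_x_forwarded_for, nexttime)
                          VALUES (?, ?, ?, ?, NOW() + INTERVAL 2 SECOND)');
}
$stmt->bind_param('ssss', $ip, $via, $fwd, $xff);
$stmt->execute();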

A(Answer):

"Nu" <[email protected] in message
news:uE********************@bgtnsc05-news.ops.worldnet.att.net…

I want to protect myself from if someone with a fast connection hammers my
site. It’s not denial of service attacks, but offline downloaders (of

course

that don’t show they’re offline downloaders in the useragent so I can’t
filter them by that). My main issue is my site is PHP so if they hammer

it,

it gets all the PHP files executing and overwhelms the CPU. I’d like to be
able to after a certain amount of hits on my index.php per second, so just
refuse.

I can’t find how to do that. Can it be done in PHP, htaccess, etc.

Any ideas?

I’ve heard this called a "hit limit".

A(Answer):

>Me, personally, I wouldn’t abort the connection. I’d put them to sleep.
There’s a sleep() function in PHP that will let you pause the processing
for a period of time. (You might want to build a wrapper around it for
your own sleepy purposes.)

I wish there was a real life wrapper for this… (no doona jokes please) 🙂

A(Answer):

Nu wrote:

$_SESSION[] is pretty much dependant on cookies, right?

Yes – optionally a query string, but that won’t work for your
purposes.

If you want to monitor the activity of your clients from one
connection to the next, you need persistent client-side data.

There’s no way around that… except maybe the Honor system.

A(Answer):

Petr Vileta wrote:

You can use a database for it, but it is only a partial solution.

Tee hee. The idea was to *prevent* that kind of activity.

A(Answer):

On Wed, 31 Jan 2007 17:37:57 -0800, Nu <[email protected]> wrote:

"Sanders Kaufman" <bu***@kaufman.netwrote in message
news:Ek***************@newssvr17.news.prodigy.net. ..

>Nu wrote:

"Sanders Kaufman" <bu***@kaufman.netwrote in message

>That’s usually an OK design flaw behind a firewall, but not out
in open water.

So how do I handle peak loads and exit gracefully during overloads?

Basically something like X hits per 10 seconds to index.php sounds

simple

enough. I can’t find out how to do that, though.

Now THAT is a question a coder can answer!!!
There are several approaches.

I would use a timestamp/hitcount $_SESSION[] variable to track
their usage.

Then, each session will be aware of how often its client is
hitting you – aborting the connection (but not the session!)
when they’re outside of your desired frequency.

Me, personally, I wouldn’t abort the connection. I’d put them
to sleep. There’s a sleep() function in PHP that will let you
pause the processing for a period of time. (You might want to
build a wrapper around it for your own sleepy purposes.)

This will also force bot/agents into throttling down their
requests. Since the connection isn’t broken, they won’t issue a
zillion connection requests. They’ll just thing you’ve got one
seriously bogged down machine.

It might even trick them into thinking they DoS’d you – when in
fact, you DoS’d them.

You can’t force people to behave any certain way on the web –
but you can trick their software!

Rule #1 of dealing with coders – don’t ask *them* for the spec.

$_SESSION[] is pretty much dependent on cookies, right?

No, sessions will work with the query string when session cookies can’t be
set. Sanders Kaufman’s idea seems pretty sound, in that it uses sleep. I
like it.


Curtis, http://dyersweb.com

A(Answer):

Curtis wrote:

No, sessions will work with the query string when session cookies can’t
be set. Sanders Kaufman’s idea seems pretty sound, in that it uses
sleep. I like it.

The only problem with it is if you go cookieless.

In the query string way, you’d need the user to type in the
query string when they go back to the page in order to retain
the session data.

A(Answer):

On Wed, 31 Jan 2007 20:48:19 -0800, Sanders Kaufman <bu***@kaufman.net>
wrote:

Curtis wrote:

>No, sessions will work with the query string when session cookies can’t
be set. Sanders Kaufman’s idea seems pretty sound, in that it uses
sleep. I like it.

The only problem with it is if you go cookieless.

In the query string way, you’d need the user to type in the query string
when they go back to the page in order to retain the session data.

Yeah, that’s true, but if they’re navigating within the site, PHP will (if
enabled in php.ini) append the SID to the end of links and form actions
transparently.


Curtis, http://dyersweb.com
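For reference, the relevant settings (shown here via ini_set(), though depending on the PHP version they may need to go in php.ini or .htaccess instead); whether relying on URL-based session IDs is a good idea is another question:

<?php
// Fall back to putting the session ID in URLs when the client refuses cookies.
ini_set('session.use_trans_sid', '1');
ini_set('session.use_only_cookies', '0');
session_start();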

A(Answer):

"Sanders Kaufman" <bu***@kaufman.netwrote in message
news:xO******************@newssvr21.news.prodigy.n et…

Nu wrote:

$_SESSION[] is pretty much dependent on cookies, right?

Yes – optionally a query string, but that won’t work for your
purposes.

If you want to monitor the activity of your clients from one
connection to the next, you need persistent client-side data.

There’s no way around that… except maybe the Honor system.

Cookies wouldn’t work on some email grabber bot or offline downloader. And a
query string isn’t how my site’s software works.

I’m basically looking for something that simply would be able to say
index.php can’t be called by anyone more than say so many times per second.
After that, it’ll do a sleep command or output a "too many connections
page."

A(Answer):

// One counter file per minute: the filename is the Unix timestamp of the
// current minute (seconds zeroed out).
$currentmin = mktime(date("H"), date("i"), 0, date("m"), date("d"), date("Y"));

$maxPerMinute = 10; // max limit per minute

if (is_file($currentmin)) {

    // get the content of the file
    $hits = (int) file_get_contents($currentmin);

    if ($hits >= $maxPerMinute) {
        // over the limit: display some message / redirect ...
        header('HTTP/1.1 503 Service Unavailable');
        exit('Too many requests, please try again in a minute.');
    }

    // increment the value read from the file and write it back
    file_put_contents($currentmin, $hits + 1);

    // process MySQL operations ...

} else {
    // create the file $currentmin and write the content "1" to it
    file_put_contents($currentmin, "1");
}
// you will need to delete the older $currentmin files with some cleanup script

A(Answer):

"Manish" <ye*********@gmail.comwrote in message
news:11**********************@v33g2000cwv.googlegr oups.com…

That idea will work. Thanks.

A(Answer):

Nu wrote:

I’m basically looking for something that simply would be able to say
index.php can’t be called by anyone more than say so many times per second.

See – that’s where your spec is flawed. You want to act based
on a visitor’s identity… without identifying visitors.

After that, it’ll do a sleep command or output a "too many connections
page."

A(Answer):

Manish wrote:

//create a file $currentmin
//write the content "1" to the file

//you will need to delete the older $currentmin files by some scripts

The only problem with that is that *preventing* such processing
was the primary goal.

This "solution" guarantees a whole extra level of creating and
deleting files to every single page.

A(Answer):

"Sanders Kaufman" <bu***@kaufman.netwrote in message
news:%n****************@newssvr17.news.prodigy.net …

Nu wrote:

I’m basically looking for something that simply would be able to say
index.php can’t be called by anyone more than say so many times per

second.

>
See – that’s where your spec is flawed. You want to act based
on a visitor’s identity… without identifying visitors.

No I want something based on a filename, not visitors.

A(Answer):

Nu wrote:

"Sanders Kaufman" <bu***@kaufman.net> wrote in message

>>I’m basically looking for something that simply would be able to say index.php can’t be called by anyone more than say so many times per second.

>See – that’s where your spec is flawed. You want to act based on a visitor’s identity… without identifying visitors.

No, I want something based on a filename, not visitors.

Oh – I took the "by anyone" literally.

No flame – but this stuff gets a LOT easier when you state the
problem correctly and completely. That’s why it’s called a
specification.

A(Answer):

Following on from Nu’s message. . .

>I am trying to limit how often index.php gets run. Index.php calls lots of other stuff. I want (even in index.php) something where, if it’s run too often within so many seconds, it just stops, and that’s enough for now. It’s not about a complicated IP-tracking thing, just a simple thing.

OK.[1] You simply want to say "I will only allow index.php to be run
10(etc) times per second."
(1) I don’t know if there is an Apache way to do this so I’ll pass on
that one.
(2) If you can’t stop the page running, you can stop it doing lots of complicated work by keeping the data you need to persist across all sessions in a shared resource such as a database or mutex file.
(3) There are various things you can do in the "do I allow another proper execution or fail[2]" step:
– Every 100 runs reset the mutex to 0 // stops errors building up and wedging the system
– Check a counter of how many are ‘in progress’
– If OK then add 1 to the mutex and do the real work, or fail
– At the end subtract 1 from the mutex[3]

[1] Personally I think it is better to get to the root cause of the problem, and if you can’t ask people ‘not to do that’, then to enforce it /for those people/. If you’re the national rail site that falls over when there’s a bit of snow and everyone wants to see what a shambles the railways are in, then get more server power.

[2] You need to give very careful thought to how you ‘fail’. Delay, 500
message, bandwidth exceeded graphic, absolutely no output, Forbidden?,
redirect to static data in an HTML page …

[3] Notice I have bent your spec in this scheme and looked at the number
of sessions currently running rather than an arbitrary time between
calls.

PETER FOX Not the same since the bolt company screwed up
pe******@eminent.demon.co.uk.not.this.bit.no.html
2 Tees Close, Witham, Essex.
Gravity beer in Essex <http://www.eminent.demon.co.uk>
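A minimal sketch of that ‘in progress’ counter using a lock file – the path and limit are made up, error handling is skimped, and as noted above it counts concurrent executions rather than hits per second:

<?php
$counterFile   = '/tmp/index_inprogress.cnt';  // hypothetical path
$maxConcurrent = 10;                           // arbitrary limit

if (!file_exists($counterFile)) {
    file_put_contents($counterFile, '0');
}

// Check and bump the "in progress" counter under an exclusive lock.
$fp = fopen($counterFile, 'r+');
if ($fp && flock($fp, LOCK_EX)) {
    $inProgress = (int) trim(fgets($fp));
    if ($inProgress >= $maxConcurrent) {
        flock($fp, LOCK_UN);
        fclose($fp);
        header('HTTP/1.1 503 Service Unavailable');
        exit('Server busy - please try again shortly.');
    }
    rewind($fp);
    ftruncate($fp, 0);
    fwrite($fp, (string) ($inProgress + 1));
    flock($fp, LOCK_UN);
    fclose($fp);
}

// ... the real work: includes, MySQL queries, page output ...

// Subtract 1 again at the end (a periodic reset, as suggested above,
// guards against counts left behind by scripts that died mid-way).
$fp = fopen($counterFile, 'r+');
if ($fp && flock($fp, LOCK_EX)) {
    $inProgress = (int) trim(fgets($fp));
    rewind($fp);
    ftruncate($fp, 0);
    fwrite($fp, (string) max(0, $inProgress - 1));
    flock($fp, LOCK_UN);
    fclose($fp);
}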

A(Answer):

"Nu" <[email protected] in message
news:uE********************@bgtnsc05-news.ops.worldnet.att.net…

I want to protect myself from if someone with a fast connection hammers my
site. It’s not denial of service attacks, but offline downloaders (of

course

that don’t show they’re offline downloaders in the useragent so I can’t
filter them by that). My main issue is my site is PHP so if they hammer

it,

it gets all the PHP files executing and overwhelms the CPU. I’d like to be
able to after a certain amount of hits on my index.php per second, so just
refuse.

I can’t find how to do that. Can it be done in PHP, htaccess, etc.

Any ideas?

It was faster to make than I thought, and I’ve got code running now throttling my site. I’ll never have to have an offline downloader trying to crash the server again.

A(Answer):

Nu wrote:

I can’t find how to do that. Can it be done in PHP, .htaccess, etc.?

Firstly, are you on a shared host? If so, then ensuring quality of
service is really your hosting provider’s job. If one site on a server is
using up a massive portion of the server’s capacity (with regards to
bandwidth, CPU, memory or disk space) then this impacts *all* the sites on
that particular server, so it’s their responsibility to either throttle that site or request that its administrator purchase a more expensive hosting package so that it can be moved onto a server with fewer other sites.

If you’re on a dedicated host and don’t have root access, then get root
access (change your hosting provider if need be).

If you’re on a dedicated host with root access, then probably the best
option is to use Apache’s mod_cband module <http://mod-cband.com/>. With
this you can add to your httpd.conf something like this:

<VirtualHost>

# limit speed of this vhost to 1Mbit/s, 10 request/s, 30 open connections
CBandSpeed 1Mbps 10 30
# in addition every remote host connecting to this vhost
# will be limited to 100kbit/s, 3 request/s, 3 open connections
CBandRemoteSpeed 100kbps 3 3
</VirtualHost>


Toby A Inkster BSc (Hons) ARCS
Contact Me ~ http://tobyinkster.co.uk/contact
Geek of ~ HTML/CSS/Javascript/SQL/Perl/PHP/Python*/Apache/Linux

* = I’m getting there!

A(Answer):

Following on from Nu’s message. . .

>It was faster to make than I thought, and I’ve got code running now throttling my site. I’ll never have to have an offline downloader trying to crash the server again.

Hold on! If all you’ve done is throttle everyone, then offline downloaders are effectively a DoS. They still bash away. It’s like they put 5 people in the queue for every one ordinary user.

If you consider these events rare then perhaps that’s OK, but otherwise you’re shutting out everyone. In fact the ‘correct’ response from a bot is to *increase* the fire rate to get an acceptable percentage of hits, i.e. this is as much an issue of equitable rationing as of limiting your server usage.


PETER FOX Not the same since the bolt company screwed up
pe******@eminent.demon.co.uk.not.this.bit.no.html
2 Tees Close, Witham, Essex.
Gravity beer in Essex <http://www.eminent.demon.co.uk>