Hi,
I've got a small problem with one of my users' pages.
It turns out he's getting massive spider attacks from Romanian sites, but all via an American spider
called 80legs.
It opens up to 80 connections or more at a time, driving the server's CPU usage nearly into the red.
The funny thing is, it uses IPs from Romania, etc.,
but of course we can't block it IP by IP; by the time you finish your list, it's back with new IPs. And it looks like there's no way to block it by spider ID (tried that, but it's still there).
Any idea how to get rid of this?
Edit:
Searching the 80legs website, I found this:
Any idea how to add this spider to the robot block list,
either via .htaccess or robots.txt?
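For what it's worth, here's a minimal sketch of both approaches. It assumes the crawler identifies itself with the user-agent token "008" (the token 80legs documents for its bot; check your own access logs to confirm what string it actually sends).

robots.txt (only works if the crawler honors it):

```
User-agent: 008
Disallow: /
```

.htaccess (Apache with mod_rewrite enabled; rejects the requests outright, which is more reliable since it doesn't depend on the bot's cooperation):

```apache
# Return 403 Forbidden for any request whose User-Agent contains "008"
# ("008" is the token 80legs documents for its crawler -- verify in your logs)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} 008 [NC]
RewriteRule .* - [F,L]
```

Note that a 403 still costs the server a little work per request; if the connection flood itself is the problem, blocking at the firewall level may also be worth looking into.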