Well, I looked at my code for stopping unlimited access to the scans and, oh dear... I don't know what I was thinking when I wrote it, because I clearly didn't think it through.
Looking at it now, it's obvious how web crawlers were able to get past it easily.
But no more!!
I even tested it with a web crawler, trying to grab all the scans (which is where most of the bandwidth has been going), and all it got back was page after page of the same error message.
That should sort the bastards out!! Now I'm going to edit my server config files again to block other stuff.
p.s. if you have any problems viewing scans with the new code, please let me know! I've set a limit of 60 scan views per 4 hours.
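For anyone curious how a limit like that can work, here's a minimal sketch of a rolling-window rate limiter in Python: 60 views per 4-hour window, keyed by client IP. All the names and the structure are my own assumptions; the site's actual code isn't shown here.

```python
import time
from collections import defaultdict, deque

# Assumed limits matching the post: 60 scan views per rolling 4-hour window.
WINDOW_SECONDS = 4 * 60 * 60
MAX_VIEWS = 60

# Hypothetical in-memory store: client IP -> timestamps of recent views.
_views = defaultdict(deque)

def allow_scan_view(ip, now=None):
    """Return True if this IP may view another scan, False if over the limit."""
    now = time.time() if now is None else now
    q = _views[ip]
    # Drop timestamps that have aged out of the 4-hour window.
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()
    if len(q) >= MAX_VIEWS:
        return False
    q.append(now)
    return True
```

A sliding window like this is kinder to real visitors than a fixed 4-hour bucket, since old views expire continuously instead of all at once.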
Offline Website grabbers / browsers
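Blocking offline grabbers is usually done by matching their User-Agent strings. A rough sketch of that check in Python, with an illustrative (not complete) list of well-known grabber names I've chosen myself:

```python
# Hypothetical list of User-Agent substrings for common offline grabbers;
# the post's actual server-config rules are not shown, so this is a guess
# at the general technique, not the site's real block list.
BLOCKED_AGENTS = ("httrack", "wget", "webcopier", "teleport", "offline explorer")

def is_grabber(user_agent):
    """Crude check: does the User-Agent look like an offline website grabber?"""
    ua = (user_agent or "").lower()
    return any(name in ua for name in BLOCKED_AGENTS)
```

In practice this lives in the web server config (e.g. rewrite rules) rather than application code, and determined grabbers can fake their User-Agent, which is why the rate limit above matters too.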
- Professor Brian Strain