George M. Jones inscribed thusly:
> Over the weekend I got to mulling over the state of scanners and how
> to protect one's site from them, and I got to thinking, "Gee, wouldn't
> it be nice if scanners obeyed some convention similar to the web's
> /robots.txt to allow hosts/domains to exclude themselves from scanning?"
> Now, I know that this won't work in the case of the determined person
> with source, but it might help in the case of, say, ISS or other
> scanners distributed in binary form, or even in the case of things
> distributed with source where the user is too dumb/lazy/ignorant to
> make changes.
> Thoughts?
Several and various - many relating to security policy and who's
in charge. Any sysadmin wishing to "opt out" of a scan should have the
right to do so. IN PERSON. With lots of justification. Any security
dude wanting to scan systems without warning people first should be kicked
in the fanny - except it would probably blind them! Bottom line - if your
security people and your administrators are not communicating, you have
problems that a "robots.txt" file will not solve. If it's not security vs.
admin, I doubt you would get anything honored anyways. (D*MN hackers just
WILL NOT COOPERATE! THE NERVE!)
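	That said, if such a convention existed, the courtesy check itself
would be trivial to write. Here's a minimal sketch, assuming a purely
hypothetical /noscan.txt served over HTTP in robots.txt-style
User-agent/Disallow form - the file name, the format, and the
scan_permitted() helper are all invented here for illustration, not
anybody's actual tool:

    # Hypothetical only: no such convention exists.  Assumes the target
    # serves /noscan.txt over HTTP in a robots.txt-like format, e.g.:
    #
    #     User-agent: *            (or a scanner name such as "ISS")
    #     Disallow: /              (meaning: do not scan this host)

    import urllib.request

    def scan_permitted(host, scanner_name, timeout=5):
        """Return False if the host's (hypothetical) exclusion file has
        a Disallow entry for this scanner or for the wildcard '*'."""
        try:
            with urllib.request.urlopen("http://%s/noscan.txt" % host,
                                        timeout=timeout) as resp:
                text = resp.read().decode("ascii", errors="replace")
        except OSError:
            return True          # no exclusion file reachable: scan away

        agents = []              # User-agent lines of the current record
        for line in text.splitlines():
            line = line.split("#", 1)[0].strip()
            if not line:         # blank line ends a record, per robots.txt
                agents = []
                continue
            field, _, value = line.partition(":")
            field, value = field.strip().lower(), value.strip()
            if field == "user-agent":
                agents.append(value)
            elif field == "disallow" and value:
                if "*" in agents or scanner_name in agents:
                    return False
        return True

Easy to write - and just as easy for a hostile scanner to ignore, or
worse, to read as a list of the hosts you care about most.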
> Anybody from ISS listening?
	Yes, actually we are. This particular message got a real ROTFL,
as a matter of fact. Fact is, if we can get the flippen "robots.txt" file
from your system telling us to leave your system alone, we've already
got it by da balls anyways! :-) :-) Catch 22!
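	And the Catch-22 holds up in code, too: merely fetching that
hypothetical /noscan.txt means completing a TCP handshake with the
target, so the "polite" check has already confirmed the host is alive
and port 80 is open before any courtesy kicks in. A rough sketch, same
invented convention as above:

    # Why the check is self-defeating: reading the exclusion file is
    # itself a probe.  The connect below already reveals that the host
    # is up and port 80 is open, before any courtesy takes effect.
    # (Same invented /noscan.txt convention as above.)

    import socket

    def fetch_exclusion_file(host, port=80, timeout=5):
        with socket.create_connection((host, port), timeout=timeout) as s:
            # Reconnaissance accomplished: the handshake succeeded.
            s.sendall(b"GET /noscan.txt HTTP/1.0\r\nHost: "
                      + host.encode("ascii") + b"\r\n\r\n")
            chunks = []
            while True:
                data = s.recv(4096)
                if not data:
                    break
                chunks.append(data)
        return b"".join(chunks)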
> ---George Jones
(... net in my other life at Internet Security Systems)
Michael H. Warfield | (770) 985-6132 | mhw @ wittsend.com
(The Mad Wizard) | (770) 925-8248 | http://www.wittsend.com/mhw/
NIC whois: MHW9 | An optimist believes we live in the best of all
PGP Key: 0xDF1DD471 | possible worlds. A pessimist is sure of it!