Over the weekend I got to mulling over the state of scanners and how
to protect one's site from them, and I got to thinking: "Gee, wouldn't
it be nice if scanners obeyed some convention, similar to the web's
/robots.txt, that let hosts/domains exclude themselves from scanning?"
Now I know that this won't work in the case of the determined person
with source, but it might help in the case of, say, ISS or other
scanners distributed in binary form, or even in the case of things
distributed with source where the user is too dumb/lazy/ignorant to
remove the check.
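To make the idea concrete, here's a rough sketch of what a cooperating
scanner might do before probing a host. Everything here is invented for
illustration: the /noscan.txt filename, the Scanner:/Disallow: directives
(borrowed from robots.txt's User-agent:/Disallow:), and the assumption
that the opt-out file is served over plain HTTP.

# Hypothetical scanner opt-out check, modeled on robots.txt.
# Filename and directive names are made up for illustration.

import urllib.request

def host_opts_out(host, scanner_name="example-scanner", timeout=5):
    """Return True if the host publishes a /noscan.txt (hypothetical
    convention) excluding this scanner, or all scanners."""
    url = "http://%s/noscan.txt" % host
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            body = resp.read().decode("utf-8", errors="replace")
    except OSError:
        # No opt-out file reachable; treat as no exclusion.
        return False

    applies = False
    for line in body.splitlines():
        line = line.split("#", 1)[0].strip()   # drop comments
        if not line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "scanner":                 # like robots.txt User-agent
            applies = value == "*" or value.lower() == scanner_name
        elif field == "disallow" and applies:
            if value in ("all", "*", "/"):
                return True
    return False

if __name__ == "__main__":
    target = "192.0.2.10"                      # example address (RFC 5737)
    if host_opts_out(target):
        print("%s asks not to be scanned; skipping." % target)
    else:
        print("No opt-out published for %s." % target)

The point is just that the check is cheap and the format trivial; the
real question is whether scanner vendors would build it in and whether
hosts would bother to publish such a file.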
Anybody from ISS listening?