Great Circle Associates List-Managers
(March 2001)
 


Subject: Re: robots.txt
From: Tim Pierce <twp @ rootsweb . com>
Date: Thu, 1 Mar 2001 11:04:28 -0500
To: Chuq Von Rospach <chuqui @ plaidworks . com>
Cc: JC Dill <inet-list @ vo . cnchost . com>, List-Managers <list-managers @ GreatCircle . COM>
In-reply-to: <B6C2B066.5DC0%chuqui@plaidworks.com>; from chuqui@plaidworks.com on Wed, Feb 28, 2001 at 01:40:55PM -0800
References: <5.0.0.25.2.20010228115334.02e015e0@pop3.vo.cnchost.com> <B6C2B066.5DC0%chuqui@plaidworks.com>
User-agent: Mutt/1.2.5i

On Wed, Feb 28, 2001 at 01:40:55PM -0800, Chuq Von Rospach wrote:
> On 2/28/01 12:02 PM, "JC Dill" <inet-list@vo.cnchost.com> wrote:
> 
> > It's pretty hard to write a spider that can intelligently go
> > through a search interface, so most email harvester robots don't
> > try.  There's enough of the web out there that they can't spider through
> > all of it anyway.
> 
> Security through obscurity is a bad idea.

This is sort of a non sequitur.  "Security through obscurity"
traditionally refers to leaving data lying around and simply hoping
that nobody thinks to look there.  Attempting to secure a system by
running telnet on a funny-numbered port is a classic example of
security through obscurity.

By contrast, there are actual technical reasons why data behind an
HTTP POST interface is unlikely to be spidered by even an aggressive
search engine.  Conceivable, yes.  But it isn't a problem an attacker
can solve by running a simple port scan.

In fact, it's not unlike putting the data behind a passworded web
page.  The difference is that there are a lot of "passwords" (search
terms) which are likely to yield access.  But the problem space is
also so much larger than traditional password access that there is
no motivation for the harvesting spiders to try to solve it.
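The distinction above can be illustrated with a sketch.  The snippet
below is a hypothetical, stripped-down model of how a naive harvesting
spider discovers pages: it collects only href links from anchor tags
and issues plain GET requests.  A page reachable only by submitting a
POST form never appears in its link set, so the spider has nothing to
follow.  (The page markup and class names here are invented for
illustration; no real harvester is being quoted.)

```python
# Minimal sketch of naive-spider link discovery (hypothetical example).
# A simple harvester walks <a href="..."> links with GET requests; it
# never constructs and submits a POST body, so form-gated search
# results stay invisible to it.
from html.parser import HTMLParser

class NaiveSpider(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []  # GET-reachable URLs the spider would fetch next

    def handle_starttag(self, tag, attrs):
        # Only anchors contribute crawlable URLs.  A <form method="POST">
        # would require building a request body from its inputs, which a
        # simple link-following spider does not do.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Invented sample page: one ordinary link, one POST-only search form.
page = """
<a href="/surnames.html">Browse surnames</a>
<form method="POST" action="/search">
  <input name="surname"><input type="submit" value="Search">
</form>
"""

spider = NaiveSpider()
spider.feed(page)
print(spider.links)  # the /search results are unreachable from here
```

Running this yields only the anchor URL; the data behind /search is
"obscure" to the spider not because it is hidden, but because reaching
it requires work (synthesizing plausible form submissions) that the
economics of harvesting don't justify.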

-- 
Regards,
Tim Pierce
RootsWeb.com lead system admonsterator
and Chief Hacking Officer


