yes, i might be a blonde, but this is what i typed to budge last night:
>Date: Fri, 15 Mar 2002 01:08:08 -0500
>To: "budge" <xxxxxx.com>
>From: Kelley <kwalker2 at gte.net>
>Subject: Re: Dicks n' Dough
>
><...>
>for whatever reason, i often need to use them. noticed it about a month
>ago, and just thought the box was down b/c marco said he'd had problems.
>he mentioned something about hosing a hard drive. so, the only thing i can
>imagine is that they were offline long enough that the spiders returned to
>see if the page had changed but found nothing. now that they think nothing
>is there, they won't return for a while, not until they start turning up
>with akamai or get indexed somewhere else. but the process would be pretty
>slow, i'd imagine, since google indexes by links to a site.
><...>
>a site search, entered as site:nuance.dhs.org search term, turns up zilch
>for anything prior to 11/01
>
>i don't see any "no robots" directive in the html so go figger.
>"spiderciding", as you colourfully putting, involves creating a robots.txt
>page which prevents the robot from crawling the server.
well, they don't always behave, those robots.
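fwiw, the whole "spidercide" thing is only a couple of lines. something like this in a robots.txt at the root of the site asks every crawler to stay out (just a sketch; swap Googlebot in for * if you only want to shoo google away):

User-agent: *
Disallow: /

and if you'd rather do it per-page, the usual trick is a <meta name="robots" content="noindex"> in the html head. either way it's only a polite request, which is why the badly behaved bots ignore it.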
>If a not
>insignificant number of people would like me to create a robots.txt page
>and keep the archive from being indexed by google, then they can let me
>know. Surprisingly enough, I've heard not a peep in 2 years.
>
>m.
not surprising, since 1. it's doug's list and no one can do anything about it, and 2. most people don't know that you host the archives for doug, as evidenced by the frequent questions about the archive addressed to the list, not to you.
maybe include a header with the site archives and your email addy, to keep it all straight there, marco.
kelley