I think you should define rules for bots in a robots.txt file, listing
what you want robots to index and what you don't.
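For example, a minimal robots.txt served from the site root might look like
this (the /file/ path is only an assumption taken from the request dump
below; adjust it to whatever you actually want to keep crawlers out of):

```text
# Allow all crawlers, but keep them out of the /file/ directory
User-agent: *
Disallow: /file/

# Or, to keep Googlebot off the whole site instead:
# User-agent: Googlebot
# Disallow: /
```

Note that robots.txt is advisory: well-behaved crawlers like Googlebot honor
it, but it is not access control.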
See http://www.robotstxt.org/robotstxt.html

On 29 January 2010 11:14, Edgar J. De Cleene <
[hidden email]> wrote:
> Folks:
>
> These days I am working on HVNaughtieWiki, and have sent info for feedback to squeakros
> and to aida list.
>
> Besides some Squeakers looking at this work in progress, this morning I got
> the attention of Google.
>
> This is the print string of the request.
>
> HttpRequest (URL=/file/squeak.sts; protocol=HTTP/1.1; header=a
> Dictionary('accept'->'*/*' 'accept-encoding'->'gzip,deflate'
> 'connection'->'Keep-alive' 'from'->'googlebot(at)googlebot.com'
> 'host'->'190.193.89.80:8085' 'user-agent'->'Mozilla/5.0 (compatible;
> Googlebot/2.1; +
http://www.google.com/bot.html)' ); getFields=a
> Dictionary(); postFields=a Dictionary())
>
> All of this is Comanche + HV2 and some things on top.
>
> How do I deal with this?
>
> Refuse the connection?
> Send an error page?
>
> Any hints are welcome.
>
> Edgar
>
>
>
--
Best regards,
Igor Stasenko AKA sig.