Dear Woo,
deaf, dumb and blind - that is what objects are currently. I very
much agree.
I have learned that people teach OO by blindfolding the pupils to
simulate objects - maybe we had better introduce sensors to the objects
rather than remove them from the pupils... ;-)
So one answer is to look at Kedama, where the turtles can sniff for
high concentrations of pheromone in their surroundings and change their
heading towards the highest concentration.
See the Kedama tutorials at
http://www.squeakland.org/fun_projects/kedama/kedma_welcome.htm

I did a small extension for Etoys once and gave the objects some sniffing
ability:
http://www.squeakland.org/project.jsp?http://www.emergent.de/pub/smalltalk/squeak/projects/sniffer.003.pr
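
To make the gradient-following idea concrete, here is a rough sketch in
plain Python (not actual Kedama or Etoys code - the Turtle class, the
field function and all the numbers are made up for illustration): each
step the turtle samples the pheromone field a short distance ahead-left,
ahead and ahead-right, turns toward the strongest sample, and moves
forward a little.

import math

class Turtle:
    def __init__(self, x, y, heading=0.0):
        # heading in degrees, 0 = along the x axis (a convention for
        # this sketch only, not Kedama's)
        self.x, self.y, self.heading = x, y, heading

    def forward(self, distance):
        a = math.radians(self.heading)
        self.x += distance * math.cos(a)
        self.y += distance * math.sin(a)

def sniff_and_turn(turtle, field, sniff_distance=1.0,
                   sample_angles=(-45, 0, 45)):
    # Sample the field ahead-left, ahead and ahead-right, then turn
    # toward the strongest sample.
    best_offset, best_value = 0, float("-inf")
    for offset in sample_angles:
        a = math.radians(turtle.heading + offset)
        value = field(turtle.x + sniff_distance * math.cos(a),
                      turtle.y + sniff_distance * math.sin(a))
        if value > best_value:
            best_offset, best_value = offset, value
    turtle.heading += best_offset

def field(x, y):
    # A single pheromone "hill" around (10, 5).
    return math.exp(-((x - 10) ** 2 + (y - 5) ** 2) / 50.0)

t = Turtle(0, 0)
for _ in range(40):
    sniff_and_turn(t, field)
    t.forward(0.5)
print(round(t.x, 1), round(t.y, 1))   # should end up near the hill at (10, 5)

The same trick works for "seeing" another object: instead of a pheromone
field, sample (or simply compute) the direction to the other object's
x-y position and turn toward it.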
Hope that helps.
Cheers,
Markus
On Jun 13, 2006, at 9:08 AM, Woo 우한림 wrote:
> Hello everyone,
>
> I want to ask if we can program the objects
> to see each other. There is already a feature
> called "moveTowardto", but I want to compute the
> movement of the object myself, and I want it to see
> the other objects, as robots do.
>
> It's like a robot that can see the ball and approach it,
> perceiving the position of the post. It also sees the other robot
> and moves according to the movement of the other player.
>
> I think this is the way to AI programming.
> If we can make the objects see each other (I mean
> knowing the x-y position and object type), we can
> program them like real-world objects.
>
>
> Thanks for helping,
> ---
> 우한림
> 雨韓林
>
>