Impressive, soon-to-come brain control headset


Impressive, soon-to-come brain control headset

Florent THIERY-2
Hi,

Sorry to disturb this peaceful mailing list, just figured this can be
interesting:

"The Emotiv EPOC uses a set of sensors to tune into electric signals
naturally produced by the brain to detect player thoughts, feelings
and expression. It connects wirelessly with all game platforms from
consoles to PCs. The Emotiv neuroheadset now makes it possible for
games to be controlled and influenced by the player's mind."

http://emotiv.com/corporate/2_0/2_1.htm

"if the player smiles, winks, grimaces the headset can detect the
expression and translate it to the avatar in game."
"The $299 headset has a gyroscope to detect movement and has wireless
capabilities to communicate with a USB dongle plugged into a
computer."
"Emotiv said the headset could detect more than 30 different
expressions, emotions and actions."
"The headset could be used to improve the realism of emotional
responses of AI characters in games."
"They include excitement, meditation, tension and frustration; facial
expressions such as smile, laugh, wink, shock (eyebrows raised), anger
(eyebrows furrowed); and cognitive actions such as push, pull, lift,
drop and rotate (on six different axes)."

http://news.bbc.co.uk/1/hi/technology/7254078.stm

Regards,

Florent

Re: Impressive, soon-to-come brain control headset

Darius Clarke
I think fix8 might be better for now. Only needs a web cam.

http://fix8.com/

Re: Impressive, soon-to-come brain control headset

Janet Plato
I think fix8 is solving a different problem.  Emotiv appears to be
reading EEG information for the purpose of letting people control
keyboards and mice; fix8 appears to be for automating the mirroring of
an avatar's face, based on images taken of the user.

Assuming Emotiv reads EEG information and uses traditional methods,
you buy and wear the headset, then work with the training software to
learn which specific mental actions produce measurable inputs, and
then map those inputs to events you care about, such as keystrokes or
mouse movements.  The hardware is not particularly complex
(differential voltage amplifiers attached to electrodes in contact
with the scalp), but if they succeed in making easy-to-use software,
this could be very transformative.  Mice and keyboards are not very
usable as inputs for wearable computers, and the disabled have trouble
with mice and keyboards as well.
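A minimal sketch of that last mapping step, with invented action names and a placeholder keystroke function; no real Emotiv API or SDK call is assumed here:

```python
# Hypothetical sketch: trained mental "actions" arriving from a
# headset classifier are dispatched to ordinary input events.
# send_key and the action names are illustrative inventions.

def send_key(key):
    """Placeholder for an OS-level keystroke injector."""
    print(f"keystroke: {key}")

# User-configured mapping from trained mental actions to events.
ACTION_MAP = {
    "push": lambda: send_key("w"),      # move forward
    "pull": lambda: send_key("s"),      # move back
    "lift": lambda: send_key("space"),  # jump
}

def dispatch(action, confidence, threshold=0.7):
    """Fire the mapped event only when the classifier is confident."""
    if confidence >= threshold and action in ACTION_MAP:
        ACTION_MAP[action]()
        return True
    return False

dispatch("push", 0.9)   # confident: injects "w"
dispatch("pull", 0.4)   # below threshold: ignored
```

The confidence threshold is the part the training software would tune per user.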

On Wed, Feb 20, 2008 at 3:47 PM, Darius Clarke <[hidden email]> wrote:
> I think fix8 might be better for now. Only needs a web cam.
>
>  http://fix8.com/
>

Janet

RE: Impressive, soon-to-come brain control headset

Paul Sheldon-2
In reply to this post by Florent THIERY-2
I have an eyeglass monitor without a gyroscope. I haven't checked whether it does stereo with interleave to each eye.

Since it has no gyroscope, I might script Maya to see
how difficult it would be to slew in VR based on head turning.

The cost of my eyeglass monitors jumps to $10,000 with a gyro.

Basically I have an app that controls which eye the shutter glasses
show, and at the same time shifts the camera POV via AppleScript into a Maya VR MEL script.

Now, I would have to use their SDK to have a gyro reading
aim the camera. It might work with the new eye monitors even without split interleave.

Sorry, I can't click the link on my iPhone to see whether there is a Mac SDK.

I don't know how hard it would be to make a device interface in Squeak.

There's a guy who posted about an intelligent webcam
that could control avatar expression,
but long ago I wrote Carolyn Rose about mind control
of computers for her carpal tunnel mailing list.

Very, very interesting. I am wondering about getting
pixel interleave by learning to jerk my head around
to get hi-res VR. Would that work?

If I spend $300 I must get at programming
to amortize the investment. I've been shirking
reading Americo book 2. VR is a stepping stone to AI
in his writing. But there is nothing like immersion
to stimulate any kind of intelligence.

Thanks for the post.

Florent wrote:

> [snip]


Re: Impressive, soon-to-come brain control headset

Howard Stearns-3
In reply to this post by Janet Plato
The way I think of fix8 (and the Logitech orbit cam) as comparable
to Emotiv is that they all make use of a lot of user-input
information that is otherwise thrown away.  The camera products
don't happen to be doing anything semantic with those gestures
right now, but they are capturing them.

Emotiv can potentially capture more (and for more people, and in more
physical contexts), but then again, optical gesture tracking works
now and doesn't require special hardware, nor anything touching you.


On Feb 20, 2008, at 4:57 PM, Janet Plato wrote:

> [snip]


Re: Impressive, soon-to-come brain control headset

deadgenome -.,.-*`*-.,.-*`*-
I have been thinking about using non-invasive myoelectrics in bands on
the forearms to detect the electrical impulses of the muscles that
move the fingers. AFAIK, this technology works even when there is no
perceptible movement, which is therefore effectively thought control.
NASA have also developed a system that does this for the throat
muscles, so they can pick up what they refer to as subvocalisation,
enabling the reading of strong verbal thought (very useful in noisy
environments such as rockets during launch).

When looking into this technology, one description of the arrangement
of electrodes that I read was having two silver bars, 10mm long and
1mm in diameter, 10mm apart and placed on the skin above the muscle
being read, perpendicular to the alignment of the muscle fibers.

The reason you have two bars is that the electrical noise on the skin
is very high compared to the muscle signal you are trying to measure,
but this noise is very similar for two points 10mm apart, whereas the
muscle signal will be different. So what you are doing is using an
op-amp to subtract the signal from one bar from the signal from the
other, so that you are left with the difference between the muscle
signals for those two points.

It was also mentioned that it is best to have the op-amp (and a second
amplifying op-amp) on the actual electrode patch attached to the skin,
rather than at the end of some long wires, so as to reduce capacitively
coupled noise. This sounds very, very cheap.

A combination of finger and throat muscle myoelectrics should give you
a very cheap and flexible system that requires virtually no training
to use and is thought control in all but name.

A more down-to-earth look at input devices is being taken by a friend
of mine called Sam, who has been developing a weird controllery thing
called jedipad, which is quite similar in many respects to the
wii-remote - http://jpad.wikispaces.com/ - it is a hand-held device
covered in trackpoints and contains gyros, accelerometers, a squidgy
pressure sensor thingy and a microphone.

On 20/02/2008, Howard Stearns <[hidden email]> wrote:

> [snip]

Re: Impressive, soon-to-come brain control headset

Florent THIERY-2
To me, the most impressive part of this product is its
*price*/features ratio. If a real community forms around it, it
may well become a killer, widespread product.

Same thing as the wiimote, which itself really is a high-tech VR
product for a very low price: a high-res infrared sensor (1024x768)
with an embedded near-real-time image processing engine. Not to mention
accelerometers, and easy-to-parse data. No wonder so many libs
have been implemented.

Sadly, there's no such thing as an affordable, high-res HMD yet.

-FLo

Eyetracking VS BCI (was: Impressive, soon-to-come brain control headset)

Aaron Brancotti
In reply to this post by deadgenome -.,.-*`*-.,.-*`*-

Hi all from Italy and from a list newbie (but a long-time addicted VR
developer; I was working with W-Industries Virtuality stuff back in 1992),

just my 2 cents:

sure, using electrical signals directly from our nervous system is
fascinating, but I think it is still too cumbersome. As for the EPOC and
BCI (Brain Computer Interface) stuff: YES, you can do that, and it
somewhat works, but not right out of the box. You need HW, and you must
wear it, and you need training, and...
> A more down to earth look at input devices is being done by a friend
> of mine called Sam [snip] and contains gyros, accellerometers, a squidgy
> pressure sensor thingy and a microphone.
>  
your friend's HW work is great, but it will not be a question of
hardware alone. Sure, some low-cost, feature-packed stuff like that (or
the wiimote) can help in making a computer more USABLE (keep in mind
this word! :) ), but IMHO the real shift will come from non-intrusive,
"natural" technologies like eyetracking and voice recognition (as your
friend states at the bottom of the page). And in this, I find Fix8 much
more interesting than others seem to:
>> The way that I think of fix8 (and Logitech orbitcam) as being
>>  comparable to emotiv is that they all make use of a lot of user-input
>>  information that is otherwise being thrown away.  The camera stuff
>>  doesn't happen to be doing anything semantic with those gestures
>>  right now, but they are capturing stuff.
>>    
well, you are probably right, in that Fix8 probably just does some
feature recognition (eye position, nose position, etc.) and translates
the movements of those features 1:1 onto a 3D model, but applying
semantics is quite near to that...

>>  > I think fix8 appears to be solving a different problem.  [snip]  Mice and keyboards are
>>  > not very usable as inputs for wearable computers, and the disabled
>>  > have trouble with mice and keyboards as well.
>>    
some kinds of disabilities, like amyotrophic lateral sclerosis, would
pose problems even for electrode-based stuff like the EPOC: such
disabilities completely block your body, and even wearing something
on your head would be cumbersome (you must always rest your head),
not to mention that it would require specialized knowledge and
assistance and IMHO could be "scary" for a lot of people. On the other
hand, technologies like eyetracking are completely unobtrusive and can
be used effectively (with a COMPLETE REDESIGN of the user interface,
obviously) to control a computer with just your eyes. The company I
work for develops and sells a whole SW suite comprising a mail
client, a web browser (Mozilla-based), some specialized on-screen
keyboards, a Skype client, vocal synthesis, an e-book reader, a small
"write" app, a simplified "file system" and some more stuff. The problem
now is the HW (the eyetracker), which is VERY expensive, because the
makers are market leaders. But the technology itself is no secret; it is
just that they were first and, so far, still have the best algorithms.
In its different declinations, using eyetracking plus voice recognition
to achieve multimodal interaction is ALREADY an impressive application,
and IMHO absolutely superior to neurological/biofeedback solutions right
now. I can hardly imagine someone writing an email and surfing the Web
with just a BCI device after ten minutes of setup and training.
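One standard trick behind eye-controlled interfaces of the kind described above is "dwell clicking": holding your gaze inside a target for a set time counts as a click. A minimal sketch, with invented gaze samples and target geometry (no particular eyetracker API is assumed):

```python
# Dwell-click selection: gaze must stay inside the target for
# DWELL_SECONDS of consecutive samples to trigger it.

DWELL_SECONDS = 1.0
SAMPLE_DT = 0.1           # assumed eyetracker sample period

def inside(target, x, y):
    tx, ty, w, h = target
    return tx <= x <= tx + w and ty <= y <= ty + h

def dwell_select(samples, target):
    """Return True once gaze stays in the target for DWELL_SECONDS."""
    held = 0.0
    for x, y in samples:
        if inside(target, x, y):
            held += SAMPLE_DT
            if held >= DWELL_SECONDS:
                return True
        else:
            held = 0.0        # gaze left the target: restart the timer
    return False

button = (100, 100, 80, 40)   # x, y, width, height of an on-screen key
steady = [(120, 115)] * 12    # 1.2 s of gaze inside the button
flicker = [(120, 115)] * 5 + [(0, 0)] + [(120, 115)] * 5

print(dwell_select(steady, button))   # True
print(dwell_select(flicker, button))  # False: the dwell was interrupted
```

The dwell time is exactly the kind of parameter a redesigned UI has to expose, since it trades selection speed against accidental "Midas touch" clicks.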

In this scenario, Fix8 will NOT even bother the professional eyetracking
hardware market with $300 SW running on a webcam, but they are
developing knowledge in a very promising direction. After all,
eyetracking is a matter of models, and filters, and fast image
recognition, and maths... MAYBE one day they will develop some low-cost
stuff able to do something similar to what you now have to pay $20K to
get done. We are used to seeing software work miracles, are we not?

Well, on the other hand, the last frontier of neurological approach is
that Matrix jack behind your neck, sure... but I will NOT beta-test it. :)

Aaron Brancotti aka Babele Dunnit


PS: I re-discovered the Croquet environment after some years; I read the
FAQ and Overview and I immediately joined the list... man, you are doing
it right. Hats off.

My two cents also about virtual worlds (so this sums up to FOUR cents...
you're getting rich, huh?)

http://www.nextverso.org/?p=6

My friend and I wrote this a week BEFORE I re-discovered Croquet and
saw how many steps forward you have taken. I will definitely post
something on NextVerso about Croquet, and why Croquet is the right way
to go to clear our cyberspaces of that messy SecondLife stuff and build
something really good.







Re: Eyetracking VS BCI (eyetracking could make hi res HMD)

Paul Sheldon-2

--- Aaron Brancotti <[hidden email]> wrote a comparison of
eyetracking vs BCI, where I thought putting them together might be
interesting.

I think $300 for just the gyro is good? Is that right?

If camera pointing in VR copied eyetracking, then the limited pixels
of a lo-res HMD could scan through the VR scene and you might get the
impression of a hi-res HMD.

If the eye tracker could be head mounted, then the user could afford
to turn his head and still be on the tracker camera. Then, with that
added degree of freedom, for the hi-res trick to work, you would also
need a gyro to see how the head turned as well.

Re: Impressive, soon-to-come brain control headset, etc....

Howard Stearns-3
In reply to this post by Florent THIERY-2
Very nice low budget do-it-yourself head tracking by Johnny Lee at  
CMU: http://youtube.com/watch?v=Jd3-eiid-Uw



Re: Eyetracking VS BCI (eyetracking could make hi res HMD)

Aaron Brancotti
In reply to this post by Paul Sheldon-2

hi psheldon,
> [hidden email] said:
> If camera pointing in VR copied eyetracking then the limited pixels
> of lo res HMD could scan through VR and you might get the impression
> of hi res HMD. [snip]

I have not fully understood... anyway, "commercial" HMD FoV is so small
that eyetracking inside it is nearly useless. When wearing an HMD you
turn your head to look around; you (nearly) don't use your eye
movements. Think of what you do when wearing a diving mask... if you use
your eyes to look around, you see the inside of the mask!

And I don't understand how you could multiply the perceived resolution
of the HMD... hardware pixels are hardware pixels.

ciao
Aaron

Re: Eyetracking VS BCI (eyetracking could make hi res HMD)

deadgenome -.,.-*`*-.,.-*`*-
you move the projection source to physically match the direction your
eye is looking... off the top of my head I can think of several
arrangements of optics that could accomplish this.

On 21/02/2008, Aaron Brancotti <[hidden email]> wrote:

> [snip]

Re: Eyetracking VS BCI (eyetracking could make hi res HMD)

Paul Sheldon-2

--- "deadgenome -.,.-*`*-.,.-*`*-"
<[hidden email]> wrote:

> you move the projection source to physically match the direction
> your eye is looking...

Yes, that is what I meant to convey.

> off the top of my head I can think of several arrangements of optics
> that could accomplish this.

Fantastic. I think you understand the idea I was trying to convey.
The eye would track the image behind the sampling screen, and the
pixels would interleave, precluding the need for a high-resolution
sampling screen by time-multiplexing interleaved pixels on the retina
(if you want to think about retinas, or more difficult stuff like
vision systems in the brain, as Bell Labs did).

Re: Eyetracking VS BCI (eyetracking could make hi res HMD)

deadgenome -.,.-*`*-.,.-*`*-
If you know of the single-fibre endoscopes that work by vibrating the
fibre in a scan pattern, using either piezos or a charged/magnetised
fibre combined with electromagnets (I believe they probably use piezos,
as the latter would have problems around other medical apparatus, such
as MRI scanners), then you are close to the ideas I have been looking
at.

I was thinking that a combination of overlaid signals, one being the
simple scan pattern and the other being driven by the input from the
eye tracking could be used to control a vibrating fibre mounted at the
side or above/below the eye, that is then directed at a curved
reflector in front of the eye. This would also need a very, very fast
light modulator on the other end of the fibre, or alternatively to
have it being vibrated again, thus scanning it across a DLP or
something similar in a base unit that could be worn on the belt. A
more bulky/less complex system could be done by having a fibre bundle,
but I prefer the single fibre approach.

This configuration also makes the head mounted parts be lightweight
and the whole system easy to make shock-proof and water-proof, which
would make it useful in augmented reality systems as well as VR ones.

Another idea was that if the fibre used was capable of transmitting
infrared, then IR could be sent up the fibre from the base unit and
any reflected IR could be analysed to work out the location of the
pupil. This would work by the fibre being first sent on a search
pattern to locate the pupil and then once locked on, the reflected IR
being constantly analysed to work out where the eye is moving, much
like a heat seeking missile.
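The search-then-track loop can be sketched as a toy simulation. The "scene" below is a synthetic intensity map standing in for reflected IR; all values are illustrative:

```python
# Sweep a coarse grid until reflected-IR intensity exceeds a lock
# threshold, then follow the brightest neighbour each step (lock-on).

def intensity(x, y, pupil=(7, 4)):
    """Fake reflected-IR reading: brighter nearer the pupil."""
    return max(0.0, 1.0 - 0.2 * (abs(x - pupil[0]) + abs(y - pupil[1])))

def search(grid=10, lock=0.6):
    """Coarse raster sweep until we see a strong return."""
    for y in range(grid):
        for x in range(grid):
            if intensity(x, y) >= lock:
                return (x, y)
    return None

def track(pos, steps=10):
    """Hill-climb to the brightest neighbouring cell each step."""
    for _ in range(steps):
        neighbours = [(pos[0] + dx, pos[1] + dy)
                      for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
        pos = max(neighbours, key=lambda p: intensity(*p))
    return pos

start = search()
print("first strong return at", start)   # (7, 2)
print("locked on at", track(start))      # (7, 4), the pupil
```

A real implementation would of course read noisy analogue returns and re-enter the search phase on blinks, but the two-phase structure is the same.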

Legal Note - The content and concepts conveyed in this email are
covered by the latest version of the Gnu GPL -
http://www.gnu.org/copyleft/gpl.html

P.S. If anyone builds one of these before I manage to, I would love to
help test the prototypes ;)

Re: Eyetracking VS BCI (eyetracking could make hi res HMD)

Les Howell
In reply to this post by Paul Sheldon-2
On Thu, 2008-02-21 at 09:50 -0800, PAUL SHELDON wrote:

> [snip]

Yet another option is shading to move the apparent position from one
pixel to the next.  Think of a black line running at some arbitrary
angle.  When the line is at precisely 45 degrees (or whatever is
appropriate for the ratio of pixel width to height), the line is drawn
directly.  If the line runs at some other angle, the pixels on either
side are shaded appropriately to the gamma factor so as to interpolate
the actual line.  Although the actual hardware resolution doesn't
change, the eye perceives a smoother, more "real life" line rather
than a pixelated image.

In 3d this is generally accomplished by calculating the radiance factor
given the relative angles of each facet contributing to the edge.  A
similar effect is used to smooth the compound curves for pixelated 3d
images.
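The edge-shading idea can be sketched for the shallow-line case: instead of snapping the line to whole pixels, split each column's intensity between the two pixels the true line passes between, in proportion to how close it is to each (essentially the idea behind Wu-style anti-aliased lines):

```python
# For each x column, find where the line really crosses and divide
# the intensity between the pixel above and below that crossing.
import math

def aa_line_column(x, x0, y0, slope):
    """Return (y_pixel, coverage_upper, coverage_lower) for column x."""
    y = y0 + slope * (x - x0)        # exact y where the line crosses column x
    yi = math.floor(y)
    frac = y - yi                    # how far into the lower pixel we are
    return yi, 1.0 - frac, frac      # split intensity between the two pixels

# A line at a non-45-degree angle: each column shades two pixels.
for x in range(4):
    yi, up, lo = aa_line_column(x, 0, 0.0, 0.3)
    print(f"x={x}: pixel y={yi} gets {up:.2f}, y={yi+1} gets {lo:.2f}")
```

The two coverage values always sum to 1, so total line brightness is preserved while the perceived edge position moves in sub-pixel steps.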

There is no reason this could not be used with coarser displays (reduced
pixel counts).  But today, with small screens on PDAs and other
devices, it won't be long before head-mounted displays can be megapixel,
full color and quite small.  

        One use is via a DMD IC (produced by TI) mounted in the frame of
eye-wear that will project an image onto the inside of the lens (first
surface).  This surface could be coated to reduce reflective
interference with the real world, and it would be possible to overlay VR
onto the real world.  Think about a home walk through where your
furniture could be featured in the rooms as you walk through.  Clothing
that could be shown on your body via a VR mirror, or other commercial
applications.

        A means of achieving polar coordinates appropriate to the room could be
a light bar on some wall, viewed by a camera in the headset, similar to
the Wii.
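The light-bar geometry can be sketched with two IR points seen by a headset camera: their image midpoint gives head yaw, and their separation gives an approximate range, much like the Wii sensor-bar arrangement. All constants below are illustrative assumptions:

```python
# Estimate head yaw and distance from the pixel positions of two
# light-bar blobs in the headset camera image.
import math

CAMERA_FOV_DEG = 45.0      # assumed horizontal field of view
IMAGE_WIDTH = 1024         # assumed sensor resolution in pixels
BAR_WIDTH_M = 0.2          # assumed physical spacing of the two lights

def head_pose(px_left, px_right):
    """Return (yaw in degrees, distance in metres) from blob x-positions."""
    deg_per_px = CAMERA_FOV_DEG / IMAGE_WIDTH
    mid = (px_left + px_right) / 2.0
    yaw = (mid - IMAGE_WIDTH / 2.0) * deg_per_px    # offset from image centre
    sep_deg = abs(px_right - px_left) * deg_per_px
    # The bar subtends sep_deg at distance d: d = w / (2 tan(sep/2)).
    dist = BAR_WIDTH_M / (2.0 * math.tan(math.radians(sep_deg) / 2.0))
    return yaw, dist

yaw, dist = head_pose(472, 552)   # blobs roughly centred in the image
print(f"yaw={yaw:.1f} deg, distance={dist:.2f} m")
```

Roll and pitch would need either a second axis of the same calculation or the gyro discussed earlier in the thread.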

        How's that for a good application?

Regards,
Les H


Re: Eyetracking VS BCI (eyetracking could make hi res HMD)

deadgenome -.,.-*`*-.,.-*`*-
Les, for your first point, what's the difference from the sub-pixel
shading technology that has been available in all major OSes for LCD
screens for a while now? Or is that what you are describing, just
extended to 3D environments as well as fonts?

On uses of AR, the augmented memory, navigation, education, design and
arts implications are more my bag, though the marketing/fashion
implications may make more money.

On 21/02/2008, Les <[hidden email]> wrote:

> [snip]

Re: Impressive ... brain control headset ... Johnny Lee wii head tracker

Paul Sheldon-2
In reply to this post by Howard Stearns-3
Thanks.

--- Howard Stearns <[hidden email]> wrote:

> Very nice low budget do-it-yourself head tracking by
> Johnny Lee at  
> CMU: http://youtube.com/watch?v=Jd3-eiid-Uw
>
I watched the YouTube video by searching Johnny Lee on my iPhone and
seeing it matched the video at your link from a computer.

He put together a Wii remote and a baseball cap, and illustrated
his POV with a video camera; impressive.

One would see the static images with a background of pixels
scanning through them as I moved my head.

I have emailed the Emotiv guys, who aren't clear I have the platform
for their free SDK, in an email worrying me about legalese should
I download. Someone said that another downloaded game SDK
crippled your chance to profit from making games by demanding a cut.
I speed-read their legalese and didn't see anything so scary, but
maybe I didn't read carefully enough.

I hope they bother to write back and reassure me.

Re: Impressive, soon-to-come brain control headset

Gert Gast
In reply to this post by Darius Clarke
Hi,

Just tuning in. I tried to install fix8, but it won't launch. Any ideas? (I've got port 8000 open, XP Service Pack 2, etc.) It registered online, but won't launch from the desktop icon.

Any ideas?

Cheers, Gert

On Thu, Feb 21, 2008 at 8:47 AM, Darius Clarke <[hidden email]> wrote:
I think fix8 might be better for now. Only needs a web cam.

http://fix8.com/



--
Dr Gert Gast
Senior Lecturer, SAE Institute Byron Bay

SAE Institute World Headquarters
373-391 Ewingsdale Road
Byron Bay NSW 2481 Australia

+61 (0)2 6639 6000 | Phone
+61 (0)2 6685 6133 | Fax

www.sae.edu | Web
[hidden email] | Mail


National Provider Code: 0273
CRICOS Provider Code: NSW 00312F


Re: Eyetracking VS BCI (eyetracking could make hi res HMD)

Ric Moore
In reply to this post by Les Howell
This is from the guy, Les, waiting in the background to give me a hand developing OAR. US Navy computer training since the '60s. So when you say that you have a guy, I say I have THIS guy, and several more just like him on tap, all wondering when Ric is finally gonna get rolling. Your territory manager does his job.

Now, with an opportunity to use Qwaq, I have some good choices of what to use. As soon as you can come up, I have a box of fairly new motherboards for you to take to Tim, to see if he can jam the best one into a decent case with WinXP installed. I already have plenty of keyboards and mice. We should get off -real- cheap. Less than $100.

If I'm forced to, I'll run the Multiverse client on Windows. <sigh> You know I'm bending over backwards here. But the server will still be a Linux machine, so we'll have no network licenses to pay for! Just the base cost of WinXP per machine. I gotta cover all the bases, especially with the Moo-Lah. :) Ric



On Thu, 2008-02-21 at 11:31 -0800, Les wrote:

> On Thu, 2008-02-21 at 09:50 -0800, PAUL SHELDON wrote:
> > --- "deadgenome -.,.-*`*-.,.-*`*-"
> > <[hidden email]> wrote:
> >
> > > you move the projection source to physically match
> > > the direction your
> > > eye is looking...
> > Yes that is what I meant to try to convey or instruct.
> > > off the top of my head I can think
> > > of several
> > > arrangements of optics that could accomplish this.
> > >
> > Fantastic. I think you understand the idea I was
> > trying to convey.
> > The eye would track the image behind the sampling screen,
> > and the pixels would interleave, precluding the need for a
> > high-resolution sampling screen, by time-multiplexing
> > interleaved pixels on the retina (if you want to think about
> > retinas) or on more difficult machinery (if you want to think
> > of vision systems in the brain, like Bell Labs did).
<This is Les below, re: 3D optic displays either directly in front of or
into the eyes and tracking the eye to either move or rotate to where you
look. HEAVY! I love it!>

> Yet another option is shading to move the perspective from one pixel to
> the next.  Think of a black line running at some arbitrary angle.  When
> the line is at precisely 45 degrees (or whatever is appropriate for the
> ratio of length to height of pixels), the line is directly drawn.  If
> the line runs at some other angle, the pixels on either side are shaded
> appropriately to the gamma factor, with shades appropriate to interpolate
> the actual line.  Although the actual hardware resolution doesn't
> change, the eye perceives a smoother and more "real life" line rather
> than a pixelated image.
>
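[Editor's note: the shading idea quoted above can be sketched in a few lines. This is a hedged illustration, not anything from the thread: the 8x8 grid, the endpoints, and the gentle-slope restriction are arbitrary assumptions, and gamma correction is deliberately omitted.]

```python
# Sketch of the shading idea above: when a line's ideal path falls
# between two pixel rows, split its intensity between them in
# proportion to how close the path runs to each row. (Gamma handling
# and steep slopes are omitted to keep the illustration short.)

def draw_aa_line(width, height, x0, y0, x1, y1):
    """Render a line with 0 <= slope <= 1 into a grid of 0.0-1.0 intensities."""
    grid = [[0.0] * width for _ in range(height)]
    slope = (y1 - y0) / (x1 - x0)
    for step in range(x1 - x0 + 1):
        x = x0 + step
        y = y0 + slope * step           # ideal (sub-pixel) row position
        row = int(y)                    # pixel row at or just above the path
        frac = y - row                  # how far the path sits toward the next row
        grid[row][x] += 1.0 - frac      # nearer row gets most of the intensity
        if frac > 0.0 and row + 1 < height:
            grid[row + 1][x] += frac    # farther row gets the remainder
    return grid

grid = draw_aa_line(8, 8, 0, 0, 7, 3)   # each column's total intensity is 1.0
```

The eye integrates the split intensities back into a single, smoother line, which is the perceptual effect described in the quoted paragraph.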
> In 3d this is generally accomplished by calculating the radiance factor
> given the relative angles of each facet contributing to the edge.  A
> similar effect is used to smooth the compound curves for pixelated 3d
> images.
>
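[Editor's note: the radiance calculation quoted above can be sketched with a simple Lambertian model. A hedged illustration: the normals, the light direction, and the choice of a Lambertian model are all assumptions made for the example, not details from the thread.]

```python
import math

def normalize(v):
    mag = math.sqrt(sum(c * c for c in v))
    return tuple(c / mag for c in v)

def lambert(normal, light_dir):
    """Radiance factor: cosine of the angle between facet normal and light."""
    n, l = normalize(normal), normalize(light_dir)
    return max(0.0, sum(a * b for a, b in zip(n, l)))

def edge_radiance(normal_a, normal_b, light_dir):
    """Smooth an edge by shading with the average of the adjoining facet normals."""
    avg = tuple((a + b) / 2.0 for a, b in zip(normal_a, normal_b))
    return lambert(avg, light_dir)

# Two facets tilted +/-45 degrees about a shared edge, lit head-on:
facet = lambert((1, 0, 1), (0, 0, 1))                    # ~0.707 on either facet
edge = edge_radiance((1, 0, 1), (-1, 0, 1), (0, 0, 1))   # 1.0 along the edge
```

Averaging the adjoining normals is what makes a compound curve read as smooth even though each individual facet is flat.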
> There is no reason this could not be used with coarser displays (reduced
> pixel counts).  But today, with small screens on PDAs and other
> devices, it won't be long before head-mounted displays can be megapixel,
> full color, and quite small.
>
> One use is via a DMD IC (produced by TI) mounted in the frame of
> eyewear that projects an image onto the inside of the lens (first
> surface).  This surface could be coated to reduce reflective
> interference with the real world, and it would be possible to overlay VR
> onto the real world.  Think about a home walk-through where your
> furniture could be featured in the rooms as you walk through, clothing
> that could be shown on your body via a VR mirror, or other commercial
> applications.
>
> A means of establishing polar coordinates appropriate to the room could be
> a light bar on some wall, viewed by a camera in the headset, similar to
> the Wii.
>
> How's that for a good application?
>
> Regards,
> Les H
>
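[Editor's note: the light-bar scheme a few paragraphs up can also be sketched. Every numeric value here is an assumption chosen for illustration (field of view, sensor resolution, LED spacing), and the range estimate uses a small-angle simplification.]

```python
import math

CAMERA_FOV_RAD = math.radians(45)   # assumed horizontal field of view
IMAGE_WIDTH_PX = 1024               # assumed camera resolution
LED_SEPARATION_M = 0.2              # assumed spacing of the light bar's LEDs

def range_and_bearing(x_left_px, x_right_px):
    """Estimate distance to the light bar and the bearing of its midpoint."""
    rad_per_px = CAMERA_FOV_RAD / IMAGE_WIDTH_PX
    angular_size = (x_right_px - x_left_px) * rad_per_px
    distance = LED_SEPARATION_M / angular_size        # small-angle approximation
    midpoint_px = (x_left_px + x_right_px) / 2.0
    bearing = (midpoint_px - IMAGE_WIDTH_PX / 2.0) * rad_per_px
    return distance, bearing

# LEDs seen 80 px apart, centered in the frame: about 3.26 m away, dead ahead.
distance, bearing = range_and_bearing(472, 552)
```

As the headset moves, the LEDs' pixel positions shift, and re-running this estimate each frame yields the room-relative coordinates Les describes, much as the Wii sensor bar works.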
--
================================================
My father, Victor Moore (Vic) used to say:
"There are two Great Sins in the world...
..the Sin of Ignorance, and the Sin of Stupidity.
Only the former may be overcome." R.I.P. Dad.
Linux user# 44256 Sign up at: http://counter.li.org/
http://www.sourceforge.net/projects/oar
http://www.wayward4now.net  <---down4now too
================================================