[News - Via metaversed] Xtreme reality's webcam-based human machine interface
Just wanted to post this news, which shows advances in motion tracking.
A question popped up in my mind: in your opinion, when these motion
tracking advances become widespread, will they work at the system level
(simulating key/button presses and (multi)pointer movement) or be
integrated directly into applications (via an API, requiring the
software to be modified)? A hybrid approach (system-level input
emulation plus application-specific mapping profiles) could be
interesting too...
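That hybrid idea could be sketched as follows: a system-level layer turns tracked gestures into synthetic input events, with optional per-application profiles overriding a system-wide default. Everything below is invented for illustration (the gesture names, profiles, and event strings are all hypothetical, and no real OS input-injection API is called):

```python
from typing import Optional

# System-wide default mapping: gesture -> synthetic input event.
DEFAULT_PROFILE = {
    "swipe_left": "key:Left",
    "swipe_right": "key:Right",
    "push": "button:LeftClick",
}

# Application-specific overrides, keyed by the foreground app's name.
APP_PROFILES = {
    "photo_viewer": {"swipe_left": "key:PageUp", "swipe_right": "key:PageDown"},
}

def map_gesture(gesture: str, app: str) -> Optional[str]:
    """Resolve a gesture to a synthetic input event, preferring the
    app-specific profile and falling back to the system-wide default."""
    profile = APP_PROFILES.get(app, {})
    return profile.get(gesture, DEFAULT_PROFILE.get(gesture))

print(map_gesture("swipe_left", "photo_viewer"))  # app override: key:PageUp
print(map_gesture("push", "photo_viewer"))        # default fallback: button:LeftClick
```

The appeal of this layering is that unmodified applications still get basic pointer/key emulation, while apps (or users) that want richer control can ship a profile without the tracker needing to know anything about the application's internals.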
"...you point a generic webcam at the place where you normally sit..."
This is astonishing because the motion capture lab I went to
had very expensive cameras all over the wall.
Yeah, they said two were enough, but the software they used relied on all of them.
Perhaps the one camera uses ranging information to get the 3D data,
rather than some sort of fast image-understanding and expert calibration
system like the ones I might imagine from the offline conversations I surfed through.
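For contrast with the single-camera case, multi-camera mocap rigs typically recover 3D by triangulation. A minimal rectified-stereo sketch of the standard depth-from-disparity relation, with all numbers invented for illustration:

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth z = f * B / d for a rectified stereo pair:
    focal length f in pixels, camera baseline B in meters,
    and horizontal disparity d of the point between the two views."""
    if disparity_px <= 0:
        raise ValueError("point must have positive disparity")
    return focal_px * baseline_m / disparity_px

# A point seen 40 px apart by two cameras 0.5 m apart, focal length 800 px:
print(depth_from_disparity(800.0, 0.5, 40.0))  # 10.0 (meters)
```

This also hints at why a single ordinary webcam can't do the same thing directly: with no baseline there is no disparity, so a one-camera system has to get depth some other way (active ranging, or model-based inference from the image).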
Sorry, last night I didn't dare search my personal database of Croquet material,
including private e-mails, because I was exhausted and needed sleep
(dental work, studying iPhone argentine tango movies, a grueling argentine
tango session, and implementing Penrose diagrams in Maple will do that to you---extreme fatigue).
I find negative (depressed) thinking comes with such "core fatigue",
so I let go of pursuing research then.