Watch this demo, by Sinisa Kolaric (PUC-Rio), of an early prototype of a human-computer interface for spatial manipulation of 3D objects using bare (i.e., unmarked and uninstrumented) hands.
Hand positions and gestures are recognized and tracked by a system built from an inexpensive stereo pair of overhead webcams and software based on the Viola-Jones detection method and KLT feature tracking.
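To give a flavor of the tracking half of that pipeline: the core of KLT is a Lucas-Kanade step that estimates how a window of pixels has translated between two frames by solving a small least-squares system built from image gradients. The sketch below is a hypothetical, simplified illustration of that single step on synthetic frames, not the authors' actual code; a real system would use something like OpenCV's `CascadeClassifier.detectMultiScale` for Viola-Jones detection and `calcOpticalFlowPyrLK` for pyramidal KLT.

```python
import numpy as np

def lk_translation(frame0, frame1):
    """Single-iteration Lucas-Kanade estimate of the (dx, dy) shift
    between two frames (a toy stand-in for one KLT tracking step)."""
    # Spatial gradients of the first frame (rows = y, cols = x).
    Iy, Ix = np.gradient(frame0)
    # Temporal difference between the frames.
    It = frame1 - frame0
    # First-order model: Ix*dx + Iy*dy = -It at every pixel.
    # Stack into an overdetermined system and solve by least squares.
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    (dx, dy), *_ = np.linalg.lstsq(A, b, rcond=None)
    return dx, dy

# Synthetic "frames": a smooth pattern and the same pattern shifted by
# a known sub-pixel offset, standing in for two consecutive webcam frames.
ys, xs = np.mgrid[0:64, 0:64].astype(float)
pattern = lambda x, y: np.sin(0.2 * x) * np.cos(0.15 * y)
frame0 = pattern(xs, ys)
frame1 = pattern(xs - 0.4, ys - 0.3)   # content moved by (0.4, 0.3) px

dx, dy = lk_translation(frame0, frame1)
```

In a full tracker this step is iterated, run per feature window rather than on the whole image, and wrapped in an image pyramid so larger motions can be handled.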