Imagine the ability to create an iPad on any wall or surface you come across, even on a piece of paper, or the chance to control computers and other physical machines with your brain waves. This is the future of human-computer interaction, according to innovative researchers and entrepreneurs who took the main stage at VMworld.
SixthSense, a "wearable gestural interface" created by Pranav Mistry of the MIT Media Lab's Fluid Interfaces Group, outfits humans with a small projector, mirror and camera worn around the neck (or on a helmet), and little colored markers worn on the fingers.
The prototype lets the user project a computer interface onto a wall and check email or browse the web much as on an Apple iPad, except that the gestures are made in the air rather than on a touchscreen.
Mistry's video demonstration of SixthSense at VMworld showed him using the technology to take pictures with his hands, project a phone dial pad onto his palm, augment newspapers with footage of President Obama speaking, play a racing game on a piece of paper, get digital updates on a flight projected onto a plane ticket and play Pong on the Boston subway using his feet as paddles.
You can even "copy and paste" text from paper books onto your personal computer screen, which can exist anywhere you want it to.
Mistry believes current devices are too limiting, and that people should be able to interact with the information normally locked inside computers and the Internet using the natural human gestures of daily life.
"We as humans are not interested in computers. Our interest is in information," he said. "There's no need for us to have two separate worlds" that separate the digital from the physical.
Image credit: Sam Ogden