The Full-Body Interface

You move, and an on-screen object moves with you. It's not a VR body suit, but user input via a video camera.

By bringing the video camera into the human-computer interface chain, a new PC technology lets a user's body movements trigger on-screen events the way a mouse or joystick does today.

At the Demo 98 industry conference Tuesday, Reality Fusion introduced "FreeAction," a technology that provides an alternative method of user input on the PC. With a video camera mounted on their PC, users interact with a computer application by moving their hands or body.

FreeAction tracks a user's movements for interaction with objects on-screen - whether an application menu, a game ball, or an enemy. By incorporating the technology into games and application interfaces, software developers could replace keyboard, joystick, or mouse input with the user's body motions. It will initially be targeted at PC game developers.
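In rough terms, substituting body motion for a pointing device means mapping a position tracked in the camera image onto screen coordinates. The sketch below is illustrative only - the function name, resolutions, and mirroring are assumptions, not Reality Fusion's implementation:

```python
def camera_to_screen(cam_x, cam_y, cam_w=320, cam_h=240,
                     screen_w=1024, screen_h=768):
    """Scale a tracked point from camera resolution to screen resolution.

    The camera image is mirrored horizontally so that moving a hand to
    the user's right moves the on-screen pointer right as well.
    """
    mirrored_x = cam_w - 1 - cam_x          # mirror for natural control
    sx = mirrored_x * screen_w // cam_w     # scale to screen width
    sy = cam_y * screen_h // cam_h          # scale to screen height
    return sx, sy
```

An application would feed the result to the same event handlers that today receive mouse coordinates.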

In handling the event path from the real world to the virtual one, a "Known-Object Reactor" recognizes and tracks known objects in the video image and assigns them special properties within the application.
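The idea can be pictured as a registry that maps recognized objects to the properties they carry on-screen. This is a hypothetical sketch of that pattern - the class, method, and property names are inventions for illustration, not the product's API:

```python
class KnownObjectReactor:
    """Hypothetical sketch: map recognized real-world objects to
    in-application properties and per-frame events."""

    def __init__(self):
        self._properties = {}   # object label -> assigned properties

    def register(self, label, **properties):
        """Declare a known object and the properties it carries on-screen."""
        self._properties[label] = properties

    def react(self, label, position):
        """Called each frame a tracked object is seen at `position`.

        Returns the event the application should process, or None if
        the object is not one the reactor knows about.
        """
        props = self._properties.get(label)
        if props is None:
            return None
        return {"object": label, "position": position, **props}


reactor = KnownObjectReactor()
reactor.register("sword", sprite="animated_sword", damage=10)
event = reactor.react("sword", position=(120, 80))
```

This matches the sword example below: the recognized object triggers an event carrying the computer-animated substitute to draw.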

FreeAction-enabled software would recognize a sword in a user's hand, for example, perhaps substitute it with a computer-animated one, and match the gamer's slashes and stabs to the action on the screen.

A separate component is responsible for interaction between the user's digital representation and elements of a computer interface, such as an on-screen slider control which the user could adjust with a swipe of a finger.
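A plausible model for that interaction - assumed here, since the source gives no detail - is a hit test: when the tracked fingertip enters the slider's on-screen rectangle, the slider value follows the finger's horizontal position. All names below are hypothetical:

```python
class Slider:
    """Minimal on-screen slider adjusted by a tracked fingertip."""

    def __init__(self, x, y, width, height, lo=0.0, hi=1.0):
        self.x, self.y = x, y
        self.width, self.height = width, height
        self.lo, self.hi = lo, hi
        self.value = lo

    def update(self, finger_x, finger_y):
        """Adjust the slider if the finger is inside its rectangle."""
        inside = (self.x <= finger_x < self.x + self.width and
                  self.y <= finger_y < self.y + self.height)
        if inside:
            frac = (finger_x - self.x) / (self.width - 1)
            self.value = self.lo + frac * (self.hi - self.lo)
        return self.value


volume = Slider(x=100, y=50, width=101, height=20)
volume.update(finger_x=150, finger_y=60)   # swipe lands mid-slider
```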

Another component of FreeAction separates foreground and background images in the incoming video signal, in part allowing a user's image to be immersed in a still picture or video clip.
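The article does not say which method FreeAction uses, but one common way to separate foreground from background is reference-frame subtraction: capture the empty scene once, then keep the pixels of each live frame that differ from it by more than a threshold. A minimal sketch on grayscale pixel grids:

```python
def foreground_mask(frame, background, threshold=30):
    """Return a per-pixel mask: True where the frame differs from the
    stored background by more than `threshold` (grayscale values 0-255)."""
    return [
        [abs(p - b) > threshold for p, b in zip(frow, brow)]
        for frow, brow in zip(frame, background)
    ]


background = [[10, 10, 10],
              [10, 10, 10]]
frame      = [[10, 200, 10],     # a bright object enters the middle
              [10, 200, 10]]
mask = foreground_mask(frame, background)
```

The True pixels - the user's image - can then be composited over a still picture or video clip.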

The company has provided a set of screensavers on its site to demonstrate the technology. For developers, FreeAction comes as a set of dynamic link libraries providing an API to the technology's proprietary algorithms.