LOS ANGELES -- A head-mounted device projects a high-resolution image directly onto the retina. A robot is controlled by the brain-stem cells of a fish. A computer emits odor and aroma. A laptop translates spoken words into another language.
Each of these newly developed "silicon senses" could dramatically change the way we perceive our worlds: for good, if they find their way into medicine, education or art; for bad, if they become crutches that reduce our ability to imagine or create.
Researchers presenting at Siggraph's "Sensapalooza" session last week acknowledged both possibilities.
"We don't want virtual reality to take the place of the real world," said Thomas Furness, director of the University of Washington's Human Interface Technology Laboratory and a creator of a "virtual retinal display" that will go on the market early next year. "We want people to experience things they never could before."
The device, licensed to Washington's Microvision, takes a radically different approach from other head-mounted displays, such as Sony's Glasstron. Because the light is focused on the retina, rather than a screen, it uses less electricity -- from one to three milliwatts, depending on the resolution of the image. This means the device could potentially be miniaturized in the years ahead to spectacle-size proportions.
Furness says the display is already being used in trials by neurosurgeons, who get detailed anatomical information projected directly into their field of vision during operations. And some largely blind individuals fitted with the devices have been able to see simple shapes and primary colors.
Henry Lustiger-Thaler of Aerome USA demonstrated what was surely the world's first multimedia presentation -- of the wine-making process in France -- to include the smells of crushed leaves, grapes and burning wood.
A PC was fitted with a small air jet that drew scent from a replaceable cartridge and wafted it toward the nostrils of the person pressing the computer's touch screen.
"We can do it in sync with music," Lustiger-Thaler said. "I can send a different scent every two seconds."
Besides its obvious applications for marketing, Lustiger-Thaler said the product has attracted the notice of pharmaceutical and cosmetics manufacturers.
Sandro Mussa-Ivaldi of Northwestern University showed videos of his experiments with lampreys, fish whose nervous systems have certain similarities to those of humans. By sending electrical impulses from a small robotic device to fish brain stems kept in a salt solution, he was able to crudely simulate how the brain controls motion.
Theoretically, such equipment could help paraplegics regain movement. But Mussa-Ivaldi warned that practical applications for his research aren't likely to come soon. "It will probably be beyond my lifetime," he said.
Alex Waibel of Carnegie Mellon University's Interactive Systems Laboratories, by contrast, is actively seeking partners for his software creation, which takes speech recognition to the next level.
Most existing systems can only be trained to recognize individual words for dictation purposes. Waibel has developed software that enables computers to understand the context of words by searching their sentences for keywords that provide clues to their meaning.
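The keyword-spotting idea can be sketched in a few lines of code. The intent labels and keyword lists below are invented for illustration and are not taken from Waibel's actual software; they simply show how matching a sentence's words against per-topic keyword sets can suggest what a speaker means.

```python
import re

# Hypothetical intent table: each intent maps to keywords that hint at it.
# These entries are invented for illustration, not drawn from Waibel's system.
INTENT_KEYWORDS = {
    "find_lodging": {"hotel", "room", "stay", "inn"},
    "buy_item": {"buy", "purchase", "shop"},
}

def guess_intent(sentence: str) -> str:
    """Score each intent by counting how many of its keywords appear."""
    words = set(re.findall(r"[a-z]+", sentence.lower()))
    scores = {intent: len(words & kws) for intent, kws in INTENT_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"

print(guess_intent("Where's a hotel?"))       # find_lodging
print(guess_intent("Where can I buy beer?"))  # buy_item
```

A real system would of course layer this on top of speech recognition and weigh keywords statistically rather than by simple counts, but the core move -- inferring meaning from a handful of telling words rather than parsing every word -- is the same.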
Waibel is particularly interested in travel applications. Tourists to Tokyo, he explains, might be able to use natural, unscripted speech to ask their computers important questions such as "Where's a hotel?" and "Where can I buy beer?" (He also holds a post at a German university.)
If the computer doesn't know the answer, the program will helpfully translate the question into a language understood by somebody who does.