Tan Le dreamed of operating machines using only thought. Now her £200 Epoc headset is redefining brain-computer interaction.

The piles of monitors in the front room at the offices of Emotiv Systems in downtown San Francisco resemble an electronics shop that's holding a going-out-of-business sale. A back room, which contains workstations, is more organised, but not exactly a hive of activity: even in the middle of the day, only a few staff are fiddling with hardware. It doesn't look like the headquarters of a company that sits on the verge of changing the way people interact with computers.
The conference room is where Emotiv's magnetic, intelligent CEO Tan Le holds court. Here, she demonstrates to visitors Emotiv's innovative product: the Epoc, a form-fitting, 14-sensor electroencephalography (EEG) headset that lets people control their computers without touching a key. Though it's not the only such product on the market, the Epoc has become the device that's garnering the most praise and attention at tech conferences and laboratories across the world. But, while demonstrating it, Tan, 32, is always sure to make the point that this isn't the final version. "I don't want this technology to be a fad and then go away," she says. "The idea of having a brain-computer [interface] isn't a new one. People have wanted to do it for a long time. We want to make sure that the experience you get, with whatever it is you're trying to do with it, is as good as possible. It's going to take time. It's not going to be quick."
For years now, numerous tech companies have been racing to bring to market portable devices that read brainwaves much as hospital EEG machines do, but at a far lower cost. Inexpensive gaming devices began appearing last Christmas: Mattel debuted its $80 Mindflex game to an enthusiastic response.
NeuroSky is helping to usher in the next generation of gaming technology using a single "dry sensor" placed on the player's forehead.
Meanwhile, the drive is on to create more efficient -- yet still complex -- EEG headsets. Design remains a challenge. One device, tested this year at the Federal Institute of Technology in Lausanne, Switzerland, for a mind-operated wheelchair, makes the user look like a character from a community-theatre production of Woody Allen's Sleeper.
Only Emotiv has decided to open-source its product. Last year, rather than make the headset available to consumers, the company made the business decision to market it to developers and researchers instead. That way, says Tan, when Emotiv finally does launch its public campaign, it will have a thousand applications available rather than just two or three. When asked how long that will take, she says: "Twelve to 18 months."
To date, Emotiv has shipped 10,000 Epoc headsets. A development team in Russia has been creating software that allows users to search online based on visual recall of images. The research arm of the US Defence Department is funding grants to test mind-controlled prosthetics and something it calls "brainwave binoculars". In addition to the mind-operated wheelchairs, there are robot arms and countless games in the works -- nearly as many applications of brain control as, well, anyone's mind can imagine.
For data-gatherers, Epoc offers new opportunities to study brain response -- whether for market research or to understand schizophrenia. Matthew Oyer, a tech hobbyist in Princeton, New Jersey, designed a special cap to measure his dog's brainwaves, inspired, he says, by the talking dog in the Pixar movie Up. In Australia, performance artist Karen Casey adopted the headset for a project that involved "realtime interactivity for EEG-generated video art". She'd already developed her own software that would allow performers to play a keyboard remotely, or, in one case, "examine the cyborg being through the adoption of a virtual-reality persona". "The Emotiv Epoc headsets were definitely the technology we were waiting for," she says.
Early feedback is extremely positive. Robert Oschler, an independent software developer in Florida, was one of the Epoc's early adopters, more out of curiosity than an actual belief that it would work. "I was sceptical," he says. He wrote a program to control a Rovio robot via a Skype connection and, much to his surprise, it worked beautifully. "The first time I set up my robots and they followed me, like it was telerobotics, it blew my mind. I had a strong emotional reaction. I realised that all the stuff that sounded like marketing was actually real." He was hooked. Oschler began working on an application that could gauge real-time emotions. Within a few weeks, he had, in his own words, "put the chocolate and the peanut butter together" and uploaded a YouTube demo.
While watching a trailer for a cartoon, Oschler tracks four basic emotions: happiness, sadness, fear and excitement. Then, when the video ends, he recalls one of those emotions, and the trailer automatically rewinds to the moment when he felt it most strongly. The Epoc, he says, "opens up a whole range of interactions with computers that just wouldn't be possible otherwise."
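The logic of Oschler's demo is simple enough to sketch. What follows is a minimal illustration of the idea, not his code: it assumes emotion scores arrive as timestamped samples from some headset feed (simulated here) and that the video player exposes a seek operation; both are stand-ins.

# Minimal sketch of the emotion-tagged playback idea described above. Emotion
# scores are simulated here; in the real demo they would come from the headset.
import random

EMOTIONS = ("happiness", "sadness", "fear", "excitement")

def simulated_samples(duration_s, hz=4):
    """Stand-in for a live feed of per-emotion scores in [0, 1]."""
    for i in range(duration_s * hz):
        yield i / hz, {e: random.random() for e in EMOTIONS}

def record_session(samples):
    """Store every (timestamp, scores) pair seen while the trailer plays."""
    return list(samples)

def peak_moment(log, emotion):
    """Return the timestamp at which the chosen emotion scored highest."""
    timestamp, _ = max(log, key=lambda pair: pair[1][emotion])
    return timestamp

if __name__ == "__main__":
    log = record_session(simulated_samples(duration_s=90))
    t = peak_moment(log, "excitement")
    print(f"seek video to {t:.2f}s")  # placeholder for an actual player seek call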
In another research example, a series of four videos shows a young woman, "Cora" (not her real name), staring at a monitor. A car accident had left her paralysed, with no control over her limbs, or even her neck. Only her facial muscles work. She wears the Epoc on her head. In the videos, Cora works with a therapist while playing a computer game called Spirit Mountain, which comes with the device. In a quasi-mystical dojo-type setting, the player is supposed to work with a "master" who will "train" her brain to interact with the computer. After training, if the user thinks of an action such as "lift", the program is supposed to respond. But as the video progresses and Cora trains, she seems disengaged at best, essentially bored. The video ends with the viewer just as bored, wondering what the fuss is about.
The second video, made shortly afterwards, shows Cora fully engaged in the game, but clearly struggling to work the Epoc in conjunction with her mind. When she finally gets the headset to transmit an intention, her face lights up with sheer delight. A third video shows her interacting fully, holding commands for 30 seconds or more and playing the game expertly. By the fourth and final video, she's not only controlling the game but also holding her head upright for the first time in a decade. When the team at Emotiv saw this, they knew that they were on to something. "I think the world is going to get to a point where everything is run on remote control based on biosignals from an individual," Tan says, with the confident ambition that's become her public hallmark. "It won't be something peripheral, outside your body that you have to tell what to do. That's not adequate, because our world is exploding in terms of information and content, and the information and content are changing radically."
Other researchers have been more critical of the device. A team at the University of Massachusetts Dartmouth used the Epoc to help to develop an early version of the "NeuroPhone", which can, among other things, dial a number based on the user merely seeing a picture of the person they want to call. It functioned adequately, says Andrew Campbell, a professor who worked on it. But, he warns, the Epoc is still in its infancy and best confined to the laboratory. "When people use Epoc headsets, they're sitting down in front of the computer," he says. "It works beautifully in idealised conditions. But if you get it out in the world, it's more problematic."
Tan knows full well that the Epoc hasn't yet reached its full potential. All day and often deep into the night, she fields calls, keeps an eye on her production facilities in the Philippines and reviews sketches from videogame designers. Her daily schedule includes taking calls from major corporations, the US military and small software developers with obscure technical questions. She tests the Epoc constantly, fully aware that her product is being used in ways that she could have never imagined. "Emotiv cannot usher in a revolution by itself," she says. "But we can certainly create a platform that can allow it to happen."
One morning late in 1981, a British oil tanker rescued an American tugboat that was in danger of sinking off the coast of Malaysia. On board were 160 Vietnamese refugees, a four-year-old girl named Tan Le among them.
Her father's family had money, but her mother's family had criticised the government. In Vietnam at that time the consequence of speaking out was arrest and confiscation of one's property.
After losing their home, Tan's family, except for her father, fled. In December 1981, Australia accepted her family as refugees.
She moved to Melbourne with her grandmother, aunt, uncle, mother and sister. She didn't see her father for 18 years. By then, she was travelling to Asia as an Australian goodwill ambassador, and her father had read about her in the newspapers. Deprived of money and status, Tan's family had arrived with nothing. Her mother worked on a farm and then got a job at a factory, learning English from notes posted around the assembly area where she worked.
Eventually, her mother earned two university degrees, started a business and was elected mayor of Maribyrnong, the town near Melbourne where the family settled. Tan inherited her drive.
A little over a decade after arriving in Australia, Tan, by then a secondary-school pupil, joined an organisation that helped immigrant families get jobs, but was appalled that, among other things, assistance was offered only in English. She became its chair and launched a series of initiatives, including the introduction of more languages. In 1998, the government recognised her work, naming her Young Australian of the Year. That's where the story of Epoc begins.
The Young Australian Prize is a prestigious honour whose other recipients have included tennis player Lleyton Hewitt and the concert pianist Rebecca Chambers. By the time Tan won it, she was already, at 20, a university graduate with degrees in both law and commerce. She soon became a national celebrity and was in demand to speak at conferences around the country.
At one of these appearances, the Murdoch-sponsored Australia Unlimited Roundtable, she made a contact that would change the direction of her life. The enthusiastic, working-class Asian girl from Melbourne didn't feel like she fitted in with the other delegates, who included captains of industry and the prime minister.
Yet she wasn't the only incongruous guest. Also there was an American physicist named Allan Snyder, who'd won the Australia Prize (the senior prize to the one awarded to Tan) in 1997 for creativity and scientific invention. He had also recently won the coveted Marconi Prize for his work on fibre optics. Snyder attended the conference wearing a baseball cap tilted to the side. And he and Tan, two oddballs in a sea of grey-suited high rollers, found each other. It was the first of two meetings that would eventually lead to the development of the Epoc. "We bonded," Tan says drily of her first meeting with Snyder.
Specifically, they bonded over the physicist's ideas about human genius. He'd recently become the head of the Centre For The Mind at the University of Sydney -- which, according to its mission statement, "focuses on scientific ways to enhance creativity and to instil the 'champion' mindset" -- and was starting to put his theories into practice by working with autistic children. "His theory," Tan says, "is that [genius] is inside all our brains. Our education makes our cognitive processes overshadow these innate abilities." Snyder found an enthusiastic and intelligent student in Tan. By the time the conference ended, he'd invited her to become an advisory member.
Soon after graduating from university, Tan had taken a job at a high-powered Melbourne law firm. Yet her Young Australian of the Year travels had, she says, "opened my eyes to a new world of entrepreneurship, technology, science, innovation, sports, arts and academia -- but what resonated most was the realisation that science and technology were driving the change in our world." She felt that a career in law, while it could be interesting and rewarding, would also leave her on the sidelines. "If I stayed there," she says, "I would be only a facilitator and not an innovator and creator. I felt I could do more."
When speaking at Melbourne University in 1999, she had the second key meeting that would lead to the birth of Epoc. It was with a young technology whizz named Nam Do, who, even as a nine-year-old in Vietnam, had been recognised as gifted. He went on to win a scholarship to the Royal Melbourne Institute of Technology in 1986, aged 19. Nam pegged Tan instantly as a professional soul mate. The digital revolution was in full flower, he told her. He said that, if she didn't join him in business, she'd essentially be missing the future. He was so persuasive that Tan quit her job. "What I liked about Nam was his vision, enthusiasm and ambition," she says. "He was also plugged into a world that was unfamiliar to me -- the world of mathematicians, physicists, scientists, engineers, technologists. I was more familiar with the world of business and systems. It was very complementary."
Nam's friends, Tan says, "were all whizzbang super-smart engineers from Vietnam. If you wanted to create something really crazy, these guys could do it." Over two years, she and Nam formed a team and started a company making telecoms software. The business was profitable -- they lived comfortably and were able to help out their extended families -- but they couldn't exactly say they'd realised their life's ambitions. Nam was prone to bursts of inspiration and Tan was systematic, methodical, almost a natural CEO. They knew they could do more. By the summer of 2003, they'd sold the firm. Tan was 26, wealthy and looking around for her next project.
Around the same time, Allan Snyder had been experiencing breakthroughs in his research into the workings of the brain. He'd developed a "thinking cap" that used transcranial magnetic stimulation to temporarily shut down the left hemisphere of the brain, where speech and short-term memory are supported.
People who exhibit savant-like abilities can show damage in this region of the brain -- Snyder's goal was to simulate reversible "damage" in this area. Despite enormous scepticism among some neuroscientists, Snyder's experiment seemed to work. Tan saw him on CNN and sent him a congratulatory email, saying they should get together.
One night Tan, Nam and Snyder met for dinner. The two entrepreneurs started talking about their company. Snyder talked about the human brain. Gradually, the two topics began to merge.
The three discussed the differences in the way people communicate with machines and with other humans. Communication with machines is very specific and directed, Snyder said. We're always telling machines what to do.
Whereas when we communicate with each other, it's a lot more complex. It involves facial expressions, body language and other intangible, non-conscious interactions. Some people would run from a table occupied by a frizzy-haired scientist who claims that it's possible for human beings to connect emotionally with computers, but Tan and Nam were fascinated by what Snyder had to say. "Our idea over dinner was: how can we evolve the next generation of human-computer interaction so it becomes far smarter, so that it actually understands not just what you tell it to do, but also how you're feeling, how you're responding to things, so AI becomes more intelligent?" Tan says.
Digital-signal processors (DSPs) -- fast, powerful microprocessors that work in real time -- are the workhorses of computing. Wouldn't it be amazing, Tan and the others thought, if they could create a device built around a chip that did the same for emotional signals? It would be embedded in every computing platform so computers could "sense" human emotions.
The eventual goal, Snyder said, would be to breathe a new dimension into computers so they could even read unconscious thought. "It was a very long dinner," Tan says. Eight and a half hours later, at 3.30am, the group rose from the table completely invigorated. "You have a lot of great ideas and epiphanies, and then they die," Tan says. "This was different. When we spoke that night, we got beyond the ideas. We started to talk about how it would be implemented. It evolved into something more real."
The more Tan and Nam thought about the idea, the more excited they became. They did some research and concluded that developing something like this actually might be plausible -- or at least worth trying. If they succeeded, Tan says, "it would be world-shattering." Two months later, they met Snyder again. He, too, had been thinking about that night, and had decided that these two unknown tech entrepreneurs, with no experience whatsoever in brain science, would be the right people for the work. "They were enormously enthusiastic about pouring their entire beings into making my idea a reality," Snyder says. "They had what I call the 'champion mindset'."
Next, Tan and Nam approached Neil Weste, a microchip developer who'd founded a company called Radiata and then sold it to Cisco Systems for nearly A$500 million (£300 million). Weste was interested, particularly when he heard that Snyder was on board. A few weeks later, Weste emailed Tan and Nam, and asked them to meet him in Sydney. "He said: 'I bought my house and my farm, my daughters have their own houses, I bought my boats and blah blah... and now I'm bored,'" Tan says. "'Can I throw some money in to help you get started?' I said, 'Yeah -- of course.'" The group was complete.
In December 2003, the four officially became partners in Emotiv Systems. A sci-fi fantasia dreamt up at the dinner table had become tangible. But there was no celebration. They had lunch in Snyder's office, and quickly after that the planning began. "It was so natural, everything was perfect," Tan says. "But then we started the hard grind."
That grind had to begin with data, and Tan and Nam needed to pick a method. fMRI technology, which measures patterns of brain activity via the magnetic properties of oxygenated blood, provided great readings, but the average machine cost $1 million and required a trained operator -- hardly practical for home users.
Positron emission tomography, which involves repeated injections of radioactive glucose into the bloodstream, was also rejected, for obvious reasons. The best bet, Tan and Nam decided, was EEG: recording, at the scalp, the electrical activity produced by the firing of neurons in the brain.
But EEG data presented a large problem: it can be used to measure only surface brainwave activity. Humans have evolved in such a way that we have significantly more mental capacity than other life forms, but don't necessarily have larger brains.
Instead, our brain has folds that increase its available surface area. This folding, though, has an annoying side effect: by the time electrical signals from deep inside the brain have travelled to the surface -- and on to the scalp -- they appear random. "We thought: chaotic system, huge signal-to-noise ratio problem, well, that's just great," Tan says. They had to create an algorithm that could identify individual cognitive and emotional states from brain data, in real time. They met many EEG experts who told them that what they were attempting couldn't be done. So they hired people who told them it could.
EEG read-outs are, even to specialists, fairly opaque pieces of data, and in the end Tan and Nam had to hire a surprisingly diverse team -- including financial statisticians, cancer researchers, evolutionary biologists, mathematicians, engineers, cognitive psychologists and dozens of graduate students willing to work on the cheap -- to meet the challenge of creating an algorithm that would, in effect, unfold the human cortex. "Nothing existed for us," Tan says. "Everything had to be created from the ground up."
Most of their team moved up to Sydney from Melbourne. They rented a flat, and the work started there -- in the living room. To begin with, they decided to try to detect excitement in the brain. Using hundreds of volunteers, they measured EEG levels alongside pulse rates, skin conductance and a number of other factors. The team then had to process the data. "It used to take up the whole weekend with a computer bank to crunch ten seconds' worth of data," Tan says. "The team would write an algorithm, then we'd go to the bank of computers. We'd input the data, wait all weekend and come back Monday morning."
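Emotiv has never published the algorithms themselves, but the general shape of the problem -- turning raw EEG channels into features and fitting a model against labelled trials -- can be sketched. The snippet below is an illustrative assumption, not the company's method: it uses band-power features and an off-the-shelf logistic-regression classifier on simulated 14-channel data.

# Illustrative sketch only: band-power features plus an off-the-shelf classifier
# on simulated 14-channel EEG trials. This is not Emotiv's algorithm.
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression

FS = 128  # the Epoc samples at 128 Hz
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(trial):
    """trial: (channels, samples) array -> flat vector of per-band, per-channel power."""
    freqs, psd = welch(trial, fs=FS, nperseg=FS)
    features = []
    for low, high in BANDS.values():
        mask = (freqs >= low) & (freqs < high)
        features.append(psd[:, mask].mean(axis=1))
    return np.concatenate(features)

def train(trials, labels):
    """Fit a simple model: labels are 1 for 'excited' trials, 0 for 'calm' ones."""
    X = np.array([band_powers(t) for t in trials])
    return LogisticRegression(max_iter=1000).fit(X, labels)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    trials = [rng.standard_normal((14, FS * 5)) for _ in range(40)]  # 14 channels, 5 s each
    labels = rng.integers(0, 2, size=40)  # fake ground truth for the sketch
    model = train(trials, labels)
    print(model.predict(np.array([band_powers(trials[0])])))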
After their crew had logged more than 25,000 research hours, Tan and Nam saw something in the data that looked promising. One night in late 2004, they informed the team that they'd all be pulling an all-nighter to build a real-time model. They chose an intern called Anang as the guinea pig. He was a strict Brahmin who practised meditation every day and kept a rigidly vegetarian diet. "We figured he could modulate his emotions more easily than most other people," Tan says. They applied the EEG electrodes and began to ask him embarrassing questions. "We saw peaks and troughs, and we thought, straightaway, 'Oh my God, this is working!'" Tan says. "It was really, really happening." They let Anang calm down, and then tried it again, several times over. Each time, they got nearly identical readings. The algorithm could actually tell what a person was experiencing and feeling in real time. They had a system. "Gosh," Tan says, "that day was so nice." They celebrated in the usual way: a breakfast of coffee and muffins, and then back to work. Emotiv wasn't known for its partying.
After the breakthrough, Tan and Nam undertook a financing round.
They took the company global and Tan moved to San Francisco so she could be closer to the financial levers of the tech industry. They hired an Australian laser physicist called Geoff Mackellar to head the Sydney research team, opened offices in the Australian Technology Park so they could have contact with other early-stage tech companies, and set up a production line for the headsets in the Philippines. Nam returned to Hanoi to head a team of software developers. Most of the work now involved discovering what Tan calls "other detections".
They developed three distinct algorithmic "suites". The first, which the company came to call Expressiv, read facial expressions so that an onscreen avatar could recognise a smile, or gritted teeth, and react accordingly. "[Expressiv] allows your expressions in a virtual environment to become far more intuitive, far more natural and personal, and ultimately more lifelike," says Tan. The second, Affectiv, stemmed from the original research and could measure "emotional experience", such as happiness, sadness and fear. Finally, Cognitiv, the third suite, was directed at "mankind's longest and oldest fantasies" and allowed subjects to control objects just by thinking about them.
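A rough way to picture the split, purely as an illustration: an application subscribing to these suites would treat each as a separate stream of detections and route it to a different part of the program. The event type and handler names below are invented for this sketch and are not the actual Emotiv SDK.

# Hypothetical routing of the three detection suites; the Detection type and
# handlers below are invented for illustration and are not the Emotiv SDK.
from dataclasses import dataclass

@dataclass
class Detection:
    suite: str    # "expressiv" | "affectiv" | "cognitiv"
    name: str     # e.g. "smile", "excitement", "lift"
    value: float  # strength in [0, 1]

def handle(event):
    """Send each kind of detection to the part of the application that uses it."""
    if event.suite == "expressiv":
        print(f"avatar: mirror '{event.name}' at {event.value:.2f}")
    elif event.suite == "affectiv":
        print(f"telemetry: log {event.name}={event.value:.2f}")
    elif event.suite == "cognitiv":
        print(f"game: apply command '{event.name}' with strength {event.value:.2f}")

if __name__ == "__main__":
    for event in (Detection("expressiv", "smile", 0.8),
                  Detection("affectiv", "excitement", 0.6),
                  Detection("cognitiv", "lift", 0.4)):
        handle(event)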
Within two years of its initial breakthrough, Emotiv had logged algorithms for about 30 "detections" and had produced an early version of the Epoc headset -- derived from "epoch", the scientific name for a standard interval of time in data collection. Tan and Nam's mentor was pleased. "Nam and Tan performed beyond my wildest expectations," says Allan Snyder. "Dreams always outreach realities if they have any potency whatsoever."
Now they were ready to begin marketing the product. The Epoc headset began appearing at game-developer conferences and trade shows. Users seemed amazed by the strange new technology and often had unreasonable expectations of the headset. At one conference, someone asked Tan if the Epoc would allow its user to move objects in real space, like a chair across the room. She had to explain that it didn't confer telekinetic powers. It couldn't even read the users' thoughts. Instead, as a Harvard Business School study of the Emotiv system said, "it can detect consistencies in brain activity as a person thinks about the same thing time after time."
The technology garnered a lot of corporate interest, but Tan and Nam had trouble finding the right fit. Nintendo wanted the headset just as a complement to its Wii system. Microsoft, Nam says, didn't feel the concept was far enough along: "They want others to do proof of concepts for them. If any concept pans out, then they can come in and do it bigger and better; after all, they are Microsoft."
The most logical partner, Sony, seemed to want a simpler version of the technology. After long negotiations, Tan and Nam rejected the partnership. "Dumbing down our technology just to get the Sony name is risky," Tan says. "This is our first product ever, our coming out to the world. If we launch something that is perceived to be too simple or not exciting, it could taint the perception of the entire category of brain-computer interfaces as capable of delivering an awesome experience. Imagine if Apple had launched with the Shuffle, and not with the iPod."
Emotiv shipped the first Epoc headsets on December 22, 2009. Nam and Tan made the decision not to market it as a consumer device, which might have led it to be seen simply as a gaming system. The bundled sample "game", Spirit Mountain, worked well but was really more of a training module. Another package, Cortex Arcade, had been designed in the early days and had gameplay limitations. As the sci-fi novelist Rick Dakan wrote in a review on gaming site Joystiq: "Playing a slow-motion game of Pong that's harder and more frustrating than Ninja Gaiden on its toughest setting is not a happy experience."
Instead, Emotiv sold the Epoc as a kind of open-ended research device. For $299, a consumer could get the headset with its basic games and training modules. But for $500, they could license the headset with the purpose of developing software for it.
Applications would then be sold on the Emotiv website, in the Apple App Store model, with the sales split 70/30 between the developer and the company. A $750 "research" package came with more raw EEG data for the potential developer. For $2,500, corporations could get the headset, the data and the right to bundle the software independently with other products they were developing.
Within a few months, Emotiv had 10,000 clients, ranging from home tinkerers to Boeing, car companies and a billion-dollar perfumery concern. Its potential applications seemed limited only by what users' brains could imagine. "They're not all going to pan out," Tan says. "Some people will do the research, and it might be too early for them in terms of what the technology can do today.
Others will say we want to use it immediately and have implementation now. You have to nail it with the developers and researchers. Even if we were 100 people, we couldn't develop the plethora of applications that a broad-reaching research community could. It's much better for application developers to develop in their field."
Tan is now a globe-travelling tech visionary, speaking at the TEDGlobal conference and charming audiences with her drive and good humour. "It takes an incredible amount of dedication to get a product to market, so you have to love what you do," she says. "The idea of having a brain-computer interface that's compelling and will let you do so many things will become real."
Her partner Nam Do puts it more prosaically. "Everybody knows that one day the world will be like this," he says. "We just want to get there faster."
This article was originally published by WIRED UK