If Seth Lloyd's right, someday we'll have "quantum computers" 100 million times more powerful than today's Pentium-based PC.
Seth Lloyd is eating a grilled-cheese sandwich in the Santa Monica apartment where he lives with his wife, his cat, an assortment of musical instruments, and a lot of books.
He picks up the salt shaker. "You know," he says in an offhand way, as if he's going to make a comment about politics or baseball, "one grain of salt probably has about a billion billion atoms in it."
He leans forward, and his expression becomes more intent. "Suppose we can find a way for each atom to store one bit of information. In that case, a single grain of salt could contain as much information as all the RAM in all the computers in the world."
Is that right? Lloyd sets down the salt shaker, reaches for a yellow legal pad, and starts writing numbers. "Let's say there are 500 million computers in the world, from laptops to mainframes, with an average of 10 megs of RAM - yes." He smiles with satisfaction. "Yes, that's right. A billion billion bits of memory."
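The sum is easy to check. Here it is in a few lines of Python, using Lloyd's own round figures (treat everything as an order-of-magnitude estimate):

```python
computers = 500e6            # ~500 million computers in the world
bits_each = 10e6 * 8         # 10 megabytes of RAM apiece, in bits
world_ram_bits = computers * bits_each

salt_grain_atoms = 1e18      # "a billion billion" atoms, one bit per atom

print(f"All the world's RAM: ~{world_ram_bits:.0e} bits")    # ~4e+16 bits
print(f"One grain of salt:   ~{salt_grain_atoms:.0e} bits")  # 1e+18 bits
# Even with generous rounding, the grain of salt comes out comfortably ahead.
```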
He goes back to his cheese sandwich.
Seth Lloyd has a Boston accent and a slightly pedantic way of talking, but he's friendly and informal, a slim, long-haired guy of 34 with an easy smile and a ready sense of humor. During the past four years, first at the Santa Fe Institute and subsequently at MIT, he's made crucial advances showing how the bizarre, fledgling science of quantum computation may be implemented in the real world. (See "Why 'Quantum'?" at the end of this article.) Even skeptics admit that Lloyd's work has brought us a step closer to the limits of size and speed in computers.
His apartment, a mile or so from Venice Beach, is slightly bohemian but very civilized, a relaxing place to be. It's fun to sit here and debate the number of bytes that can dance on the head of a pin. But beyond this playfulness lies the awesome challenge of designing the smallest, fastest data-processing devices that the laws of physics will permit. If Lloyd's hypothetical model can be built, the world will have computers that could be 100 million times as powerful as a Pentium-based PC.
Lloyd has an easygoing manner, but he's had some uneasy experiences in large, conventional academic institutions. He started out studying high-energy physics at Harvard, where he recalls working on three separate experiments that won Nobel prizes. But, he says dismissively, "I was just making coffee, sweeping the floor. Mostly I remember doing silly things like seeing who could hold his hand the longest in a Dewar of liquid nitrogen. Another thing we used to do: if you have a particle accelerator with an attenuated beam, you can stick your head in it and see blue flashes caused by Cherenkov radiation. The particles are moving faster than the normal speed of light in the eye, so they make a kind of visual sonic boom." He sighs. "The need to do these things shows how dull science can be."
Shortly after Harvard, he joined a massive effort at CERN, the European Particle Physics Laboratory in Geneva, to discover a minuscule subatomic particle. "There were 200 physicists and 500 technicians," he recalls. "It made me want to go off on my own, to pan in streams for nuggets."
At Cambridge University in England, where he got his master's in mathematics and philosophy of science, he says, "I enjoyed the work, the conversation, and the beer, but found the inward-looking, hierarchical, high-table society unbearably stuffy."
He found a much more comfortable niche at the Santa Fe Institute, where he worked in the nanotechnology program in the early 1990s, developing concepts for micromachines. He recalls: "We had a grant to make nanobots that would crawl around inside you and repair damage. But let me tell you, if nanobots are ever built, I will not be the first person to volunteer my intestines as their home. They can create a lot more damage than they repair."
He now has an assistant professorship at MIT in the Department of Mechanical Engineering, although today he's taking a break in Santa Monica (his wife teaches Japanese studies at the University of Southern California). He claims to enjoy his leisure time - when he hangs out in coffee houses, plays the flute, or takes long rides on a mountain bike - yet he doesn't seem so laid-back when he starts talking about his work. Quantum computation has become fiercely competitive. When Lloyd first delved into it in 1990, no more than six other theorists in the world were actively involved. Today, he thinks there may be more than a hundred, all of them lured by its incredible potential.
So far, however, quantum computation has not been tested in the laboratory. Lloyd has no way of knowing whether he's on a trail that leads to ultimate computing power, or a dead end.
What must happen to make it work? Matter is made of molecules, and molecules are made of atoms. Each atom has a nucleus at its center, with electrons buzzing around it. In a conventional silicon-based computer, swarms of electrons go skipping down highways of atoms, and the system computes by diverting or containing the flow.
In a quantum computer, there would be no flow: electrons would orbit their home atoms, and each bit of data would be registered by changing the energy level of a single electron.
A bit would be shifted by copying the energy level from one atom to its neighbor, for example, by physically pressing the two atoms together. When two atoms are forced into very close proximity, one can acquire the energy level of the other. David DiVincenzo, at IBM's T. J. Watson Research Center, has proposed using an atomic-force microscope to do this - manipulating single atoms at a rate of maybe 1,000 per second via a sharp-tipped probe. An upper limit might be about 100,000 operations per second: that sounds fast, but it pales when compared with modern CPUs that run at 100 million operations per second.
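To see why that ceiling matters, just run the numbers (the figures are the ones quoted above, so the comparison is only order-of-magnitude):

```python
afm_ceiling = 100_000   # optimistic upper limit for the probe, operations per second
cpu_rate = 100e6        # a modern CPU, operations per second
print(f"The CPU outruns the probe by ~{cpu_rate / afm_ceiling:,.0f}x")  # ~1,000x
```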
Seth Lloyd prefers a model in which photons from a laser bombard electrons, flipping them from one state to another. Unfortunately, there's no way to hit just one specific electron, so this would be a shotgun approach: photons would spray indiscriminately over an array of electrons.
How can this model be used for data processing? One way would be to use a long molecule composed of two different kinds of atoms in alternating sequence. The different atoms would have electrons that respond to different frequencies of light. Now add a third type of atom to the end of the chain. Data could be fed in from this entry point, and subsequent blasts of laser light would move the data along the chain in much the same way that food is propelled through the intestines by peristaltic action.
By carefully sequencing light pulses of different frequencies, we can conduct useful data processing. And if the atomic chain consists of a single organometallic-polymer molecule containing as many as a billion atoms, we'd end up with a central processor that could manipulate more data than you'd find in the entire memory of a PC.
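A purely classical sketch gives the flavor of the scheme. Here each "pulse" is abstracted as a swap applied at once to every atom of one type and its neighbor; the real proposal uses resonant laser pulses and conditional quantum logic, so take this only as an illustration of the peristaltic idea:

```python
def pulse(chain, atom_type):
    """One laser pulse, tuned to one atom type (0 = A atoms, 1 = B atoms):
    every atom of that type exchanges its bit with its right-hand neighbor."""
    for i in range(atom_type, len(chain) - 1, 2):
        chain[i], chain[i + 1] = chain[i + 1], chain[i]

chain = [1, 0, 0, 0, 0, 0, 0, 0]   # one data bit injected at the entry atom
for cycle in range(3):
    pulse(chain, 0)                 # blast tuned to the A-atom frequency
    pulse(chain, 1)                 # blast tuned to the B-atom frequency
    print(chain)                    # the bit marches two atoms down the chain
```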
So far, so good. But there are big practical problems.
Organometallic polymers can exist only at ultralow temperatures, which means heavy-duty refrigeration equipment would be necessary. To enable the computer user to read the data, the electron states would be sensed by magnetic resonance imaging - the same technique used by hospitals for brain scans - which also requires large, expensive equipment. Worst of all, even under controlled conditions, electrons are liable to change their energy levels unpredictably, causing a quantum computer to corrupt its own data on a random basis. As a result, perhaps 999 cycles out of 1,000 would have to be spent correcting errors.
Lloyd minimizes this problem: "Imagine a whole bunch of bits that are all supposed to be 1. Some of them have deviated, so you survey them and then restore the minority to the value set by the majority."
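In classical terms, what Lloyd is describing is a simple repetition code. A minimal sketch (purely illustrative; a real quantum computer cannot freely read and copy its own states this way):

```python
import random

def majority_restore(bits):
    """Survey the bits, then restore the minority to the majority's value."""
    majority = 1 if sum(bits) * 2 > len(bits) else 0
    return [majority] * len(bits)

bits = [1] * 1000                    # a register that should read all 1s
for i in range(len(bits)):
    if random.random() < 0.01:       # ~1% of electrons drift to the wrong level
        bits[i] ^= 1

print(f"{sum(bits)} of 1000 correct before the vote")
bits = majority_restore(bits)
print(f"{sum(bits)} of 1000 correct after the vote")
```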
Not everyone is happy with this scenario. Rolf Landauer, a veteran of microelectronic R&D who was made an IBM fellow in 1969 and still works at the IBM research center in Yorktown Heights, has published half a dozen papers questioning the viability of quantum computation. He is the most notorious skeptic in the field.
"The fact of the matter," he says, "is that if you can build the machinery, and if it is totally undisturbed and works perfectly the way you want it, then you can do what these people would like to do. But machinery is not perfect, and it does not quite do what you want it to do. As for the error correction - the most obvious schemes will introduce quantum-mechanical incoherence. Also, if a computer spends 99.9 percent of its time correcting errors, you'd better be sure that the error-correcting machinery itself is perfect. Why would it be that much easier to make perfect than the rest of the machinery?"
Landauer also points out that the tiniest undetected defect in a crystal polymer can make reliable calculation impossible. And he doesn't see how the system can ever be insulated properly from heat and vibration. "The chance of getting a reliable result," he says, "will diminish exponentially with the length of the computation."
Is he just an elder statesman refusing to listen to the young radicals? Or are enthusiasts like Lloyd so hooked on their dream they're refusing to listen to Landauer's voice of reason?
Lloyd says that when he first started looking for grant money, no one would believe that data could be stored securely on an atomic scale. "But people hadn't really bothered to investigate the error-correcting issues," he says. "I've done a lot of work looking at the early days of computing, when error correction was much more important because computers were built from vacuum tubes. Yes, an atom is less reliable than a transistor, but it's a lot more reliable than a vacuum tube."
Even if his computer has to spend 99.9 percent of its time correcting its own errors, Lloyd believes it will still be vastly more powerful than current systems. Laser light can flip electron states about 10,000 times as fast as a Pentium chip can switch its microtransistors. Since each pulse of light in a quantum computer can flip maybe a billion bits at a time, the end result (allowing for error correction) would be a system capable of running 100 million times as fast as a Pentium. (For comparison, today's PCs have only about 80 times the processing power of the original IBM PC.)
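The rough multiplication, with every factor taken as an assumption from the paragraph above:

```python
flip_speedup   = 10_000    # laser flips electron states ~10,000x faster
bits_per_pulse = 1e9       # one pulse can flip about a billion bits at once
useful_cycles  = 1 / 1000  # 999 of every 1,000 cycles spent correcting errors

raw_factor = flip_speedup * bits_per_pulse * useful_cycles
print(f"Raw speedup factor: ~{raw_factor:.0e}")   # ~1e+10
# The 100-million (1e8) figure quoted above is the more conservative estimate,
# leaving headroom for overheads beyond error correction alone.
```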
There are other potential advantages. Quantum computers would be massively parallel, far more powerful than single-processor systems when dealing with heavy-duty calculations. A quantum computer might also be able to crack public key encryption schemes almost instantly - although this is only a quantum-theory prediction, it has never been attempted in practice, and probably won't happen for at least another 20 years.
And perhaps most important, when the energy level of an electron is altered, no waste heat is generated.
This gets around the limiting factor which, until recently, seemed to prohibit computing devices from ever becoming much smaller and faster than they are today. All conventional methods of switching electricity create waste heat, and the smaller a unit is, the more intensely concentrated that heat becomes. Today, microfans are being installed on CPUs to stop them from melting down. Quantum computing would break the "heat barrier" - although error correction would still be a heat source.
Looking ahead, if quantum computation becomes at all viable, it could certainly be used in tomorrow's supercomputers to handle massive tasks such as code breaking or weather forecasting. But let's be bold for a moment and suppose that a smaller, cheaper, simpler way is found to read data out of a molecular array, and the array can be made from a substance stable at room temperature. At this point, the consequences become truly mind-boggling.
It has been calculated that the human brain stores about 10,000 billion bits of information in the cerebral cortex. If this is so, Seth Lloyd's grain of salt could theoretically hold all of a person's memories with room to spare.
Alternatively, you could store the complete texts of a billion books. Online access to reference sources would become irrelevant; each of us could own the Library of Congress, every piece of music ever recorded, plus immaculate digital reproductions of art from every museum in the world. Meanwhile, every domestic device, from a sound system to a hair brush, could possess artificial intelligence at a human level or beyond.
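Again, the numbers hold up on a napkin (the book-size figure below is my own round assumption):

```python
atoms = 1e18                 # bits in the salt-grain memory, one per atom
brain = 10_000e9             # ~10,000 billion bits in the cerebral cortex
book  = 600_000 * 8          # a long book: ~600,000 characters, 8 bits each

print(f"Brains per grain: ~{atoms / brain:,.0f}")   # ~100,000 lifetimes of memory
print(f"Books per grain:  ~{atoms / book:.0e}")     # ~2e+11 - a billion fits easily
```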
When Lloyd talks about his subject, he seems genuinely enthralled by it. His manner is erudite, but his voice contains real passion. This raises a more fundamental question: Why does he care so much about computation? Why should number-crunching seem transcendentally important?
"I'm not into quantum computation just because I want to build very fast computers of the next generation," he says. "I'm doing it because I have a general interest in what happens to information at very small scales. For instance, suppose you have a bunch of bacteria that you subject to succesively greater degrees of heat. Some of the bacteria will be rendered incapable of reproducing, but some won't. The net effect is that you breed heat-resistant bacteria.
"You can think of this natural selection as a form of computation," he goes on. "The bacteria are testing different genetic combinations. Some combinations are better. Suppose you have a billion bacteria reproducing every 10,000 seconds with a mutation rate of 10 percent - and the genome contains about 10 billion bits."
It's time again for the yellow legal pad. We're still sitting at the dining table. The cheese sandwich has long since been eaten, the blue sky outside is starting to fade, and the cat stands up and yawns. But Seth Lloyd is wholly preoccupied. He's on another plane, contemplating the sort of mathematics behind natural selection.
"Suppose about 100 bits describe where the mutation takes place and what it consists of. You can view the bacteria as processing 100,000 bits of information per second. And this is just one example. You can think of all the different parts of the world as doing information processing in this way."
So, from Lloyd's perspective, the whole universe is running like a vast network of huge and tiny computers.
I ask how it feels to be so deeply immersed in a continuum of pure numbers. Is it emotionally satisfying?
"Well." He seems a little reluctant to suspend his academic detachment and discuss his feelings. "The work that I do can be amazingly frustrating, because I am often trying to piece together different structures, and it's like trying to assemble pieces from different jigsaw puzzles. It can take days on end. But the feeling when the pieces fit together is truly fine. It's - orgasmic!" He gives a surprised laugh, embarrassed by his own candor. "You know, I often find myself unable to think for days afterward. It's a truly visceral pleasure, discovering something that no one knows." Then he shakes his head ruefully, reigning back his enthusiasm with a bit of caution. "Of course, most of the time, you end up discovering things that people already know. Or, your discovery turns out to be of limited practical use."
What are the chances of this happening with quantum computing? Will it turn out to be of little import, after all? Or will the science behind it become so cost-effective that each of us ends up owning vast parts of the world's total store of information?
Lloyd gestures to the dozens of scattered pen tracks that have accumulated on the yellow pad during our conversation: numbers, symbols, drawings, dashes, and little pictures of twirling electron orbits. "The physics of what we're doing works out fine," he says, speaking slowly, with the caution of a scientist who wants to be sure every step he takes is fully considered. "But when you try to take something out of a laboratory and into mass production, the vast majority of prototechnologies turn out not to work." He shrugs. "Personally, I refuse to promise anything. But I do know that this will be an interesting adventure."
Why "Quantum"?
Consider the capricious behavior of atomic particles. According to Heisenberg's uncertainty principle, below a certain level, you can never know precisely where an electron is because it behaves as if it's in many places at once. But you can detect and change the amount of energy that an electron possesses.
Imagine yourself holding one end of a rope, with the other end anchored to a wall. You start shaking your arm to make waves in the rope. If you move your arm slowly, the rope contains only a single wave. If you put in more energy by shaking the rope faster, two waves appear, oscillating around a center point. Faster still, and the rope divides itself into three, four, or more vibrating waves.
The elusive nature of electrons means that they behave, in a way, like waves. Think of an electron "vibrating" around the nucleus of an atom. If you bombard it with photons (particles of light), you add energy, so it vibrates faster. This is not the kind of smooth transition that occurs when you gradually warm a room with a heater. The electron jumps from one energy state to the next with no fractional levels in between, just as a smoothly vibrating rope can contain one wave or two, but not a fraction of a wave.
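The rope analogy can be made exact with a standard textbook result (the "particle in a box" model, not something Lloyd invokes here): an electron confined to a region of width L can take only the energies

E_n = n^2 h^2 / (8 m L^2),   n = 1, 2, 3, ...

where h is Planck's constant and m is the electron's mass. The whole number n counts the waves, exactly as on the rope; since there is no n = 1.5, there is no energy level between the first and the second.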
The energy states of an electron are called "quantum states" because on the atomic scale, energy exists in whole units known as "quanta." Similarly, on the most fundamental level, digital computers use zeroes and ones with no fractional states in between. Therefore, it seems ideal to use a low electron energy state to represent the numeral 0 and a higher energy state to represent the numeral 1.
Unfortunately, an electron isn't a stable place to store data. Its energy state may be affected by heat, vibration, and other outside interference; or the electron may spontaneously reduce its energy state by emitting a photon.
These problems may be overcome, but it will take another two or three years to test the basic concepts with laboratory experiments. And even if the experiments are a success, we could easily wait two decades before seeing quantum computers for sale to the general consumer.