Geek Page
Reversible Logic
Back in the 1960s, an IBM researcher named Rolf Landauer discovered that the process of erasing a bit of information dissipates a small amount of energy as heat. According to what came to be called Landauer's principle, processing data doesn't waste energy - erasing data wastes energy. And whether you mean to or not, your computer erases a lot of bits, limiting the amount of computation you can perform. Why? Because its logical operations cannot be run in reverse.
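How small is the cost? The principle's standard statement (not quoted in this article, but well established) puts the minimum at kT ln 2 joules per erased bit, where k is Boltzmann's constant and T is the temperature. A quick back-of-the-envelope check in Python:

    # Landauer's limit: minimum energy dissipated per erased bit.
    # Assumes the standard kT*ln(2) statement of the principle.
    import math

    k = 1.380649e-23   # Boltzmann's constant, joules per kelvin
    T = 300.0          # room temperature, kelvin

    energy_per_bit = k * T * math.log(2)
    print(f"Minimum energy to erase one bit at {T:.0f} K: {energy_per_bit:.2e} J")
    # Prints roughly 2.87e-21 J - tiny per bit, but it adds up across
    # millions of gates switching millions of times per second.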
But Landauer also postulated that this inefficiency in circuit design resulted more from coincidence than from any physical law: computers were built with irreversible logic because it was easier to design a circuit that trashed unwanted data than it was to design a closed system that would reuse it. Landauer's work eventually led to the development of reversible computing - a system that recycles leftover bits and their energies instead of dissipating them as heat.
Chip logic
A computer processor is a complex digital circuit composed of millions of logic gates. These gates manipulate electrical voltages, allowing the computer to direct them in different ways and perform computations. Logic gates are made of transistors that convert inputs to outputs according to a fixed function, usually a Boolean logic operation (e.g., and, or, not).
Whatever voltages remain in the transistors must be cleared after the inputs are pushed through the gate, to reset it for the next operation. This leftover voltage is purged through a transistor's ground and released as heat.
Take the "and" gate, for example. Each of these gates has two inputs and one output. Each input can be represented as either a 1 or a 0 (voltage or no voltage). If both inputs are 1, then the output is also 1. For all other input pairs - (1,0), (0,1), and (0,0) - the output is zero.
Because of this architecture, three input pairs - (1,1), (1,0), (0,1) - leave voltage in the circuit. These voltages must be grounded to clear the gate's transistors for the next computation, so the information is erased - and the system is logically irreversible. According to Landauer's principle, the erasure of information (voltages as bits) dissipates heat. Every irreversible computer works this way, and each one wastes energy.
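A quick way to see the irreversibility is to tabulate the gate and try to run it backward. The sketch below (my illustration, not the article's) groups inputs by output: three different input pairs collapse onto the same output, so no circuit could recover them.

    # The "and" gate maps two input bits to one output bit.
    AND_GATE = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}

    # Group inputs by output to test whether the gate could be inverted.
    preimages = {}
    for inputs, output in AND_GATE.items():
        preimages.setdefault(output, []).append(inputs)

    print(preimages[1])  # [(1, 1)] - this output determines its inputs
    print(preimages[0])  # [(0, 0), (0, 1), (1, 0)] - three input pairs
                         # share one output, so the inputs are unrecoverable:
                         # a bit of information has been erased.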
Thinking backward
Today's integrated circuit designers are rapidly increasing the number of transistors on a chip while simultaneously decreasing their size: Intel's 120-MHz Pentium, for instance, houses 3.3 million transistors. There are physical limits, however: packed much closer together, the transistors become so delicate that the heat from any electrical current will melt the circuits. One answer is to build reversible circuits. By recycling the voltages in the wires, these systems erase no information and generate no heat.
While at Caltech in 1974, Ed Fredkin, a researcher in the physics of computation, designed a reversible gate that can perform any logical operation. The device has three inputs and three outputs. One input acts as a control and is left unaffected by the gate. The other two inputs either pass straight through or swap places, depending on the value of the control - a controlled swap. Since the gate has the same number of inputs as outputs, it never needs to purge a transistor.
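To make the mechanics concrete, here is a minimal Python sketch of that controlled swap - my illustration, not Fredkin's circuitry - showing both that the gate undoes itself and that, with a constant 0 on one wire, it computes "and":

    def fredkin(c, x, y):
        # Controlled swap: pass c through; swap x and y only when c is 1.
        return (c, y, x) if c == 1 else (c, x, y)

    # Reversible: applying the gate twice restores the original inputs.
    for bits in [(0, 0, 1), (1, 0, 1), (1, 1, 0), (0, 1, 1)]:
        assert fredkin(*fredkin(*bits)) == bits

    # Universal: with a constant 0 on the third wire, the third output
    # equals a AND b, so ordinary logic can be built from reversible parts.
    for a in (0, 1):
        for b in (0, 1):
            assert fredkin(a, b, 0)[2] == (a and b)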
But how does a reversible gate recycle information and energy? The trick is to build a circuit that can work its way backward. After a computation, the output data is saved, and the computation is then run backward through a second circuit that mirrors the first. Given the density of today's microprocessors, computational leftovers pile up extremely quickly. Repeating a computation backward effectively removes all these leftover voltages by absorbing them into the reverse computation: all the voltages that would normally be purged and released as heat are recycled, leaving only the set of original inputs in their original positions.
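The article doesn't spell out the bookkeeping, but the idea - known in the literature as uncomputing - can be sketched with toy reversible steps. Each step below happens to be its own inverse, so running the same steps in reverse order undoes the computation (the names here are my own):

    def run_forward(steps, state):
        # Apply each reversible step in order.
        for step in steps:
            state = step(state)
        return state

    def run_backward(steps, state):
        # Undo the computation: inverses applied in reverse order.
        # (These toy steps are self-inverse, so each step is its own undo.)
        for step in reversed(steps):
            state = step(state)
        return state

    swap = lambda s: (s[1], s[0], s[2])      # exchange the first two bits
    flip = lambda s: (s[0], s[1], 1 - s[2])  # invert the third bit

    steps = [swap, flip, swap]
    start = (1, 0, 0)

    result = run_forward(steps, start)   # (1, 0, 1)
    answer = result[2]                   # save the output bit...
    restored = run_backward(steps, result)
    assert restored == start             # ...then uncompute: every leftover
                                         # value returns to its starting state.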
A reversible circuit can be thought of as a gigantic piece of plumbing in which, after a computational cycle, the "dirty" water is reclaimed instead of flushed into a river. Recycling the computation might require twice as many gates, but this inconvenience is offset by the benefits of reduced heat.
Looking forward
Irreversible logic operations may be the only aspect of integrated circuits that wastes energy by design, but there are other flaws in modern chips - noise and inconsistencies in output voltages, for instance - that generate amounts of heat millions of times greater than those due to irreversibility. So although it is already possible to craft logically reversible circuits, the inefficiencies in the processor's physical layer must be sewn up first.
Such is the thinking behind the research into adiabatic switching by the ACMOS group at the University of Southern California's Information Sciences Institute (www.isi.edu/acmos/intro.html). Adiabatic switching turns transistors on and off gradually, so that charge moves across small voltage differences rather than large drops, diminishing the heat dissipated. It's a hack on classical chip design that brings the physical components up to peak efficiency, making the energy savings garnered by reversible logic significant.
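The payoff shows up in the standard first-order model of gate charging (a textbook approximation, not a figure from the ACMOS group): switching a gate capacitance C abruptly to voltage V dissipates CV²/2 every time, while ramping the supply over a time T much longer than the circuit's RC constant dissipates only roughly (RC/T)·CV².

    # First-order comparison of conventional vs. adiabatic switching losses.
    # Component values are illustrative assumptions, not measured figures.
    C = 1e-13   # gate capacitance, farads
    R = 1e3     # channel resistance, ohms
    V = 3.3     # supply voltage, volts
    T = 1e-6    # adiabatic ramp time, seconds (much longer than R*C)

    conventional = 0.5 * C * V**2         # dissipated on every abrupt switch
    adiabatic = (R * C / T) * C * V**2    # shrinks as the ramp time T grows

    print(f"conventional: {conventional:.2e} J per switch")   # ~5.4e-13 J
    print(f"adiabatic:    {adiabatic:.2e} J per switch")      # ~1.1e-16 J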
While the research being done to minimize heat dissipation should pay off big in today's computers, it also points to next-generation computing architectures. Unlike irreversible computing, which interacts with its environment by releasing heat, a reversible system is closed. Any quantum computer - which relies on quantum mechanical operations to factor large numbers faster than conventional computers - requires a closed system and thus has to be reversible.
Regardless of the specific technology, the demand for more powerful computers will lead to reversible computing. As Fredkin opines, "At the beginning of the 22nd century, a US$1 million computer will weigh 5 pounds and will be able to do the equivalent of 10³⁰ (one million trillion trillion) operations per second. Such a computer would have to be reversible, otherwise it would heat up at more than a billion degrees Fahrenheit per second." And that's hot.
_________________
Jesse Freund (freund@wired.com) is Wired's intelligent agent.