Overheating is a big issue with today's computers, but those of the future may be able to keep cool by avoiding a canonical information processing limit.
Computers, in case you hadn't noticed, are quite hot—literally. A laptop can generate thigh-boiling heat, and data centers consume an estimated 200 terawatt-hours of electricity each year, comparable to the energy consumption of several medium-sized countries. The overall carbon footprint of information and communication technology is comparable to that of aircraft fuel use. And as computer circuitry gets smaller and more tightly packed, the energy it dissipates as heat makes it ever more prone to melting.
Now physicist James Crutchfield of the University of California, Davis, and his doctoral student Kyle Ray have proposed a new approach to computing that would dissipate only a small fraction of the heat produced by conventional circuits. In fact, their method, recently described in a preprint paper, could push heat dissipation even below the theoretical minimum that physics imposes on today's computers. That could greatly reduce the energy needed to perform computations and to keep circuitry cool, and the researchers say it may all be achievable with existing microelectronic components.
In 1961 physicist Rolf Landauer of IBM's Thomas J. Watson Research Center in Yorktown Heights, N.Y., demonstrated that conventional computing incurs an unavoidable cost in energy dissipation—in essence, the generation of heat and entropy. That is because a conventional computer must sometimes erase bits of information in its memory circuits to make room for more. Each time a single bit (with the value 1 or 0) is reset, a certain minimum amount of energy is dissipated, which Ray and Crutchfield have dubbed "the Landauer." Its value depends on temperature: in your living room, one Landauer would be roughly 10⁻²¹ joule. (For comparison, a lit candle emits about 10 joules of energy per second.)
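That figure is easy to check: the Landauer bound works out to Boltzmann's constant times the absolute temperature times the natural log of 2 per erased bit. The short Python sketch below is a back-of-the-envelope illustration, not code from the paper:

```python
import math

# Landauer bound: erasing one bit at absolute temperature T dissipates
# at least E = k_B * T * ln(2) of heat.
k_B = 1.380649e-23   # Boltzmann constant, in joules per kelvin
T = 300.0            # a living room, roughly, in kelvin

one_landauer = k_B * T * math.log(2)
print(f"One Landauer at {T:.0f} K: {one_landauer:.2e} J")  # ~2.9e-21 J

# A lit candle emits about 10 joules per second, which equals the
# erasure cost of several sextillion bits at this temperature.
print(f"Bit erasures per candle-second: {10.0 / one_landauer:.1e}")
```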
As computer scientists have long recognized, Landauer's limit on how little heat a computation produces can be circumvented by never destroying any information. Because nothing is lost, every step can be retraced: a computation done this way is fully reversible. It might seem that this approach would quickly exhaust a computer's memory. In the 1970s, however, Charles Bennett, also at T. J. Watson, showed that rather than discarding intermediate results, a reversible computer could "decompute" those it no longer needed, undoing the logical steps that produced them and returning the machine to its original state.
The hitch is that, to avoid transferring any heat—what physicists call an adiabatic process—the computation's sequence of logical operations must normally be carried out infinitely slowly. In a sense, this strategy avoids any "frictional heating" in the calculation, but only at the cost of taking an infinitely long time to finish.
So that doesn't appear to be a viable option. "For a long time, the common thinking has been that with reversible computing, energy dissipation is proportional to speed," says computer scientist Michael Frank of Sandia National Laboratories in Albuquerque, N.M.
## TO THE END OF THE LINE—AND THEN SOME
Silicon-based computing is nowhere near the Landauer limit: it currently generates around a few thousand Landauers of heat per logical operation, and it is hard to see how even the most energy-efficient silicon chip of the future could get below about 100. But Ray and Crutchfield say they can do better by encoding information in electric currents in a different way: not as pulses of charge but in the momentum of the moving particles. This, they claim, would allow reversible computing without sacrificing speed.
Last year the two physicists and their colleagues laid out the core concept of momentum computing. The key idea is that a bit-encoding particle's momentum can serve as a kind of "free" memory, because it carries information about the particle's past and future motion, not just its current position. "In the past, information was stored in a positional sense: 'Where is the particle?'" Crutchfield explains. Is an electron in this channel or that one, for example? "Momentum computing makes use of position and velocity information," he says.
This additional information can then be harnessed for reversible computing. For that to happen, the logical operations must occur much faster than the time the bit takes to reach thermal equilibrium with its surroundings; otherwise the bit's motion gets randomized and the information is scrambled. In other words, momentum computing requires a high-speed device, Crutchfield says: "You must compute fast"—that is, nonadiabatically—for it to work.
The researchers considered how to apply the idea to a logical operation called a bit swap, in which two bits simultaneously exchange their values: 1 becomes 0, and vice versa. No information is discarded here; it is merely reconfigured, which means that in principle the operation carries no erasure cost.
Yet if the information is encoded only in a particle's position, a bit swap—say, exchanging particles between the left and right channels—scrambles their identities, making it impossible to distinguish the "before" and "after" states. If the particles have opposite momenta, however, they remain distinct, and the swap becomes a genuine, reversible change.
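The logic of that argument can be captured in a toy sketch, shown below in Python. This illustrates only the bookkeeping, not the authors' physical model, and the channel and momentum labels are hypothetical, chosen for clarity:

```python
def bit_swap(a, b):
    """Exchange two bit values: (a, b) -> (b, a). Nothing is erased."""
    return b, a

# The swap is its own inverse, so the operation is logically reversible:
state = (1, 0)
assert bit_swap(*bit_swap(*state)) == state

# Encode two particles with a position (channel) plus a momentum.
# These labels are hypothetical, purely for illustration.
p1 = {"channel": "left",  "momentum": +1}
p2 = {"channel": "right", "momentum": -1}

# Swap the channels, as in the bit-swap operation described above:
p1["channel"], p2["channel"] = p2["channel"], p1["channel"]

# With position alone, the "before" and "after" configurations would be
# indistinguishable. The opposite momenta still identify each particle,
# so the exchange remains a genuine, traceable (reversible) change:
assert p1["momentum"] != p2["momentum"]
```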
## A USEFUL APPARATUS
Ray and Crutchfield have described how this idea could be realized in a practical device—specifically, in superconducting flux qubits, the kind of bits used in many of today's quantum computers. "The quantum computing community is parasitic on us!" Crutchfield acknowledges with a grin. These devices consist of superconducting loops interrupted by Josephson junctions (JJs), in which a thin layer of nonsuperconducting material is sandwiched between two superconductors.
JJ circuits typically encode information in the direction of circulation of their so-called supercurrent, which can be switched using microwave radiation. But because supercurrents carry momentum, they can also be used for momentum computing. Ray and Crutchfield's calculations suggest that, under certain conditions, JJ circuits should be able to support their approach: cooled to liquid-helium temperatures, the circuitry could perform a single bit-swap operation in less than 15 nanoseconds.
"Our concept is considerably more generic than that," Crutchfield says, "but it is founded in a specific substrate to be as concrete as possible and to precisely predict the required energies." It should theoretically function with regular (but cryogenically cooled) electronic circuits or even microscopic, well-insulated mechanical devices that can carry momentum (and so do computation) in their moving parts. Crutchfield believes that a method based on superconducting bits would be particularly well suited since "it's familiar microtechnology that is known to scale up quite effectively."
Crutchfield should know: working with Michael Roukes and his co-workers at the California Institute of Technology, he has already measured the cost of erasing one bit in a JJ device and shown that it comes close to the Landauer limit. Crutchfield and Roukes even advised IBM on its attempt to build a reversible JJ computer in the 1980s, a project that was eventually abandoned because the fabrication requirements were too demanding for the time.
## FOLLOW THE BALL AS IT BOUNCES
Using a particle's velocity to compute is not an entirely new idea. Momentum computing resembles a reversible-computing concept called ballistic computing, proposed in the 1980s: in it, information is encoded in objects or particles that move freely through the circuits under their own inertia, carrying a signal that is used repeatedly to enact many logical operations. A particle interacts with others only through elastic collisions, in which it loses no energy.
In such a device, once the ballistic bits have been "launched," they supply all the energy the computation needs. As long as the bits keep bouncing along their trajectories, the computation is reversible; only when their states are read out is information erased and energy dissipated.
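Why an elastic collision costs nothing is easiest to see in one dimension. In the toy Python example below, again an illustration rather than a model from the papers, two equal-mass particles collide head-on: they simply exchange velocities, and the total kinetic energy is exactly conserved.

```python
def elastic_collision(v1, v2, m1=1.0, m2=1.0):
    """Final velocities of two masses after a 1-D elastic collision.
    Both momentum and kinetic energy are conserved."""
    u1 = ((m1 - m2) * v1 + 2.0 * m2 * v2) / (m1 + m2)
    u2 = ((m2 - m1) * v2 + 2.0 * m1 * v1) / (m1 + m2)
    return u1, u2

v1, v2 = +1.0, -1.0                # two "ballistic bits" launched head-on
u1, u2 = elastic_collision(v1, v2)
assert (u1, u2) == (-1.0, +1.0)    # equal masses simply swap velocities

# No kinetic energy is lost, so the collision dissipates nothing:
ke_before = 0.5 * v1**2 + 0.5 * v2**2
ke_after = 0.5 * u1**2 + 0.5 * u2**2
assert abs(ke_before - ke_after) < 1e-12
```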
In ballistic computing, a particle's velocity simply moves it through the device, ferrying information from input to output. In momentum computing, Crutchfield says, a particle's velocity and position together allow it to represent a unique and unambiguous sequence of states during a computation. That last property, he adds, is crucial for reversibility, and thus for minimal dissipation, because it records exactly where each particle has been.
Researchers such as Frank have been working on ballistic reversible computing for decades. One issue is that, in its original form, ballistic computing is dynamically unstable: particle collisions, for example, can be chaotic and therefore exquisitely sensitive to even the tiniest random fluctuations, which makes them impossible to reverse. But researchers have made headway on these problems. In a recent preprint paper, Kevin Osborn and Waltraut Wustmann, both at the University of Maryland, proposed that JJ circuits could implement a reversible ballistic logic circuit known as a shift register, in which the output of one logic gate becomes the input of the next in a series of "flip-flop" operations.
"Superconducting circuits are a useful platform for evaluating reversible circuits," Osborn says. His JJ circuits, he adds, look quite similar to those proposed by Ray and Crutchfield and so might be the best candidates for putting their theory to the test.
"All of our organizations have been working on the assumption that these methods can provide a better trade-off between efficiency and speed than standard approaches to reversible computing," Frank explains. "At the level of theory and simulation of particular devices," Ray and Crutchfield "have probably done the most complete job so far of establishing this." Nonetheless, Frank cautions that all of the different methods of ballistic and momentum computing are "still a long way from becoming a realistic technology."
Crutchfield, on the other hand, is more upbeat. "It really hinges on convincing people to support ramping up," he says. He believes compact, low-dissipation momentum-computing JJ circuits could be available within a few years, with full microprocessors following by the end of the decade. Ultimately, he anticipates consumer-grade momentum computing delivering energy-efficiency gains of a factor of 1,000 or more over today's technologies. "Imagine if your Google server farm, which is situated in a massive warehouse and consumes 1,000 kilowatts for computation and cooling, was lowered to just one kilowatt—equivalent to numerous incandescent light bulbs," he says.
But the benefits of the new approach, Crutchfield says, could go beyond a tangible reduction in energy costs. "Momentum computing will cause a paradigm shift in how we think about information processing in the world," he argues, including how information is processed in biological systems.