Adventurous Probabilistic Hardware To Pave the Way for Faster Computers with Energy Savings

Georgia Tech Professor's Probabilistic Bits Hold Great Promise for Industry

Imagine using a cell phone for hundreds of hours without recharging it. In the hope of making this dream a reality, researchers at the Georgia Institute of Technology have made a surprising discovery that may dramatically reduce the power consumption of semiconductors while simultaneously increasing speed. The discovery of probabilistic bits, or PBITS, shows great promise for making a major impact on the semiconductor industry, which constantly seeks new-generation designs and materials to increase processing speeds, reduce power consumption, and sustain Moore's Law: the doubling of transistors every couple of years. The Defense Advanced Research Projects Agency (DARPA), the central research arm of the U.S. Department of Defense, funded this research effort under its Power Aware Computing and Communications (PAC/C) program.

A PBIT is like a conventional bit in that it takes on a value of 0 or 1, except that one is certain of its value only with a probability p. Current hardware, built from conventional bits, expends large amounts of energy computing with absolute certainty, even when running software based on probabilistic algorithms, which are widely used in cryptography. Dr. Krishna Palem, inspired by the lectures of celebrated physicist Richard Feynman, shows that, in return for living with the uncertainty of PBITS, a computing element can calculate a value with less energy. He also shows that the higher the desired probability p, the greater the energy needed to produce the corresponding PBIT.
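The tradeoff described above can be sketched in a few lines of Python. This is purely an illustration, not the team's implementation: `pbit` models a bit that takes its intended value only with probability p, and `min_switching_energy` uses an energy bound of the form kT·ln(2p), which has been associated with this line of work; treat the exact formula and the function names as assumptions made for the sketch.

```python
import math
import random

BOLTZMANN_K = 1.380649e-23  # Boltzmann constant, joules per kelvin


def pbit(p, rng=random):
    """Model a probabilistic bit: return the intended value 1
    with probability p, and the wrong value 0 otherwise."""
    return 1 if rng.random() < p else 0


def min_switching_energy(p, temperature=300.0):
    """Illustrative minimum switching energy, kT * ln(2p), for a bit
    that is correct with probability p (valid for 0.5 < p <= 1).
    At p = 1 this reduces to the familiar kT * ln(2); lowering p
    lowers the energy floor, which is the PBIT tradeoff."""
    return BOLTZMANN_K * temperature * math.log(2 * p)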

"The initial 'probabilistic algorithm' work focused on software and on the time required to complete a computation. With computer technology now starting to brush up against its physical limits, other, more physical considerations have become increasingly important. Palem has carried over the 'probabilistic algorithm' idea to this new setting, on the basis of two key ideas: One, 'probabilistic algorithms' can run directly on probabilistically reliable hardware (or hardware that takes chances) rather than on regular hardware, which goes to all lengths to guarantee the absolute correctness of its computation. Two, the 'noise' (i.e. physical 'static') inherent in hardware can be used as a 'no cost' source of the randomization needed to make algorithms reliably probabilistic," says Jack Schwartz, professor of Mathematics and Computer Science, Courant Institute, New York University and member of both the National Academy of Sciences and the National Academy of Engineering.

Palem, who is a joint professor in the Georgia Tech College of Computing and the School of Electrical and Computer Engineering and director of the Center for Research in Embedded Systems & Technology, applies the concept of probability, already well established in software applications, to the hardware side of embedded chips. These are tiny microprocessors without keyboards that regulate many appliances and have been central to the increased efficiencies and miniaturization of a wide range of devices such as automobiles, cell phones, and Personal Digital Assistants (PDAs).

Though it may seem counterintuitive, adding probability to computations has long been known to increase application speed. Palem brings this concept to the hardware level for the first time, possibly resulting in a new type of semiconductor device. Recent results from simulations using Palem's PBITS framework are even better than expected.
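A well-known software example of trading certainty for speed is the Fermat primality test: it answers "probably prime" after a handful of cheap modular exponentiations, where a deterministic check would be far slower. The sketch below is a standard textbook illustration of this tradeoff, not part of Palem's work.

```python
import random


def fermat_is_probably_prime(n, trials=20, rng=random):
    """Fermat primality test: probabilistic but fast. A 'True' answer
    can be wrong (rare pseudoprimes, e.g. Carmichael numbers, can
    fool it), but each extra trial shrinks the error probability."""
    if n < 4:
        return n in (2, 3)
    for _ in range(trials):
        a = rng.randrange(2, n - 1)
        # Fermat's little theorem: for prime n, a^(n-1) = 1 (mod n).
        if pow(a, n - 1, n) != 1:
            return False
    return True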

"We ran simulations using a 'spoken alphabet' voice recognition application as seen in many cell phones applications," says Palem. "I would have been happy with energy improvements by a factor of 30 or 40. In some cases, I was astounded when we compressed power usage by a factor of 1,000," says Palem.

In January, the research team demonstrated simulations of this framework at a DARPA PAC/C principal investigators meeting at Hilton Head Island, and these simulations are readily accessible on the Web for researchers to verify. In April, Palem presented even newer, stronger results to industry representatives at the Cool Chips conference in Japan. Palem's team has applied for a patent on this work.

It turns out that the more complex the application, the greater the power savings. The research team ran simulations on simple deterministic algorithms, genetic algorithms, and the most complex cognitive or case-based reasoning applications. Since devices and applications are becoming more complex each year, this is great news for industry.

Next Steps
This summer, the team will test the proof-of-concept device when it returns from the fabrication house. The team will then work on implementing it in a computing tile, a building block of computing architecture, to enable more ambitious applications, including those with a cognitive flavor, such as the ones commonly used in financial analysis and general risk analysis.

"Dr. Palem's findings are remarkable at this stage. The next phase, namely demonstrating the feasibility of building basic elements using the principles he has developed, will pave the way towards starting to build units and then systems that exploit these capabilities. There are many challenges in the way, of course, but the potential rewards from the efforts are immense," says Jaime H. Moreno, senior manager, Computer Architecture Department, IBM T.J. Watson Research Center.

Later this year, Palem will publish an article on his work in the upcoming special issue on "Energy Efficient Computations" of the prestigious journal IEEE Transactions on Computers. Palem continues to share his findings with other researchers to gain further support for this promising new approach to chip design.

"Palem's recent work on probabilistic computations is likely to have a significant impact on research in computing at large. The theory establishes a deep link between classical thermodynamics and the switching energy of circuits, through models of randomized computing. How to best turn energy and unreliable components into reliable computing is an old question of Von Neumann. Krishna's theory is a major step towards answers, achieved by connecting more parts of the puzzle than ever before," says Jean Vuillemin, professor of Computer Science at École Normale Supérieure de Paris. "Yet, the most fun is yet to come when Krishna's theory finds practical applications. And the time is ripe. Energy considerations already dominate electronic design. Soon, manufactured transistors must be unreliable. Will we yet find efficient ways to use them?"