New Processor Design Approach Trims Chips, Boosts Speed
March 18, 2011
Green computing used to be an idea that many CIOs and IT leaders paid little attention to, especially when computing power was cheap and energy supplies were abundant. Things have changed dramatically over the last few years, thanks to skyrocketing fuel and energy costs and ever-increasing demand for compute resources and storage. Green IT is no longer a "nice to have" consideration, especially when IT budgets are under more pressure than ever before.
So news that a team of microprocessor experts has developed a method to double the efficiency of processors simply by cutting out rarely used portions could potentially lead to big changes in the tech industry. Rice University broke the news earlier this week, announcing that a team of chip experts from Singapore, Switzerland, and the U.S. had found that "while these 'pruned' microchips make a few calculation errors, tests show that cleverly managing the errors can yield chips that are two times faster, consume about half the energy and take up about half the space of traditional microchips."
The basic idea behind the study was to reduce chip power consumption by allowing a higher percentage of calculation errors, then develop smart chip management functions that lower the probability of errors and limit which calculations produce them. The resulting chips consume much less power, occupy less physical space, and run faster than traditional processors.
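To make the trade-off a bit more concrete, here's a rough software analogy (my own illustration, not the researchers' actual design, which prunes circuitry in hardware): dropping the low-order bits of an addition captures the flavor of accepting a small, bounded error in exchange for doing less work per calculation.

```python
# Illustrative sketch only: the real "pruning" happens in the chip's adder
# logic, not in software. This toy example mimics the idea by ignoring the
# lowest bits of an integer addition and measuring the resulting error.

def pruned_add(a: int, b: int, dropped_bits: int = 4) -> int:
    """Add two non-negative integers while ignoring their lowest bits,
    standing in for a 'pruned' adder that omits rarely needed circuitry."""
    mask = ~((1 << dropped_bits) - 1)
    return (a & mask) + (b & mask)

def error_magnitude(exact: int, approx: int) -> float:
    """Relative error of the approximate result versus the exact one."""
    return abs(exact - approx) / exact if exact else 0.0

if __name__ == "__main__":
    a, b = 12_345, 6_789
    exact = a + b
    approx = pruned_add(a, b, dropped_bits=6)
    print(f"exact={exact}, approx={approx}, "
          f"error={error_magnitude(exact, approx):.2%}")
```

The error stays small relative to the size of the result, which is exactly why the researchers target workloads, like audio and video, that can shrug off a few percent of inaccuracy.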
The experts behind the research argue that this new chip design approach could work well for certain applications that can tolerate higher error percentages, such as chips used to process audio and video. "The cost for these gains is an 8 percent error magnitude, and to put that into context, we know that many perceptive types of tasks found in vision or hearing applications can easily tolerate error magnitudes of up to 10 percent," said Christian Enz, a participant and co-author of the study.
The researchers hope to have a complete prototype for application in a hearing aid by the end of 2011, and future applications could follow. "Based on what we already know, we believe probabilistic computing can produce application-specific integrated circuits for hearing aids that can run four to five times longer on a set of batteries than current hearing aids," said principal investigator Krishna Palem, the Ken and Audrey Kennedy Professor of Computing at Rice University in Houston.
Does the thought of more error-prone (but more efficient) processors give you pause, or do you think the idea has some potential? Let me know by adding a comment to this post or continuing the discussion on Twitter.
Follow Jeff James on Twitter at @jeffjames3
Follow Business Technology on Twitter at @BizTechMag