Make mistakes on silicon, save energy? by Doug Mohney
Rice University is touting its "inexact" computer chip. The design improves power and resource efficiency by allowing for occasional errors. Prototypes shown off at the ACM International Conference on Computing Frontiers are at least 15 times more efficient than today's technology -- the sort of thing that makes data center buyers sit up and take notice, assuming they don't need high levels of precision.
Researchers started looking at slashing power usage by allowing processing components, like the hardware for adding and multiplying numbers, to make a few mistakes. By managing the probability of errors and limiting which calculations can produce them, designers found they could simultaneously cut energy demands and boost performance.
Performance gains come from trimming away some of the rarely used portions of digital circuits. In initial 2011 simulation runs, "pruned" chips were twice as fast, used half as much energy and were half the size of a traditional chip.
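To make the trade-off concrete, here's a software sketch of one well-known inexact-adder design from the literature, a "lower-part OR adder" -- my illustrative choice, not necessarily the circuit the Rice team pruned. It replaces the carry chain on the low-order bits with a cheap, carry-free bitwise OR:

```python
import random

def lower_or_adder(a, b, k):
    """Approximate the k low bits with a carry-free bitwise OR;
    only the high bits get a full (exact) addition."""
    mask = (1 << k) - 1
    low = (a & mask) | (b & mask)      # cheap: no carry logic at all
    high = ((a >> k) + (b >> k)) << k  # exact addition on the high bits
    return high | low

# Measure the average relative error over random 16-bit operands
trials = [(random.getrandbits(16), random.getrandbits(16)) for _ in range(10000)]
for k in (0, 4, 8):
    errs = [abs((a + b) - lower_or_adder(a, b, k)) / (a + b)
            for a, b in trials if a + b]
    print(f"k={k}: mean relative error {sum(errs) / len(errs):.4%}")
```

Dropping more of the low-order carry logic (a larger k) shrinks the circuit and its switching energy, but raises the average deviation from the correct sum -- the same knob the Rice results are turning.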
The Rice team decided to go further and fabricate real-world silicon, finding that they could cut energy demands 3.5 times by allowing deviations from the correct value averaging 2.5 percent. Factoring in size (smaller) and speed gains (faster), the chips were 7.5 times more efficient than regular chips. Chips allowing a larger deviation of about 8 percent were up to 15 times more efficient.
Researchers say particular types of applications can tolerate quite a bit of error. The human eye has a built-in mechanism for error correction, so relative errors up to 0.54 percent on images processed by inexact chips were almost indiscernible -- you had to go looking for them.
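A quick back-of-the-envelope check shows why 0.54 percent is invisible: on an 8-bit grayscale image it works out to about one intensity level out of 255. The perturbation model below is my own illustrative assumption, not the Rice error model:

```python
import random

def perturb_pixel(p, rel_err=0.0054):
    """Apply a random relative error of up to ±0.54% to an 8-bit pixel value."""
    noisy = p * (1 + random.uniform(-rel_err, rel_err))
    return min(255, max(0, round(noisy)))  # clamp back to the 0..255 range

# Worst-case deviation across the full 8-bit range
worst = max(abs(perturb_pixel(p) - p) for p in range(256) for _ in range(100))
print(f"worst-case pixel deviation: {worst} level(s) out of 255")
```

Since 255 × 0.0054 is about 1.4, no pixel ever moves more than a single intensity level after rounding -- well below what the eye can pick out without a side-by-side comparison.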
Initial applications for inexact chips are expected to be application-specific processors, such as chips used in hearing aids, cameras, and other portable electronic devices. The hardware is also getting rolled into India's low-cost I-slate education tablet, with inexact chips expected to appear in I-slates and prototype hearing aids by 2013.
How this rolls into green data center operations is an interesting discussion. For HVAC-style "smart" controls, being off by 0.54 percent of a degree is nothing -- but the savings on an overall energy management system would be considerable. Server farms processing voice/audio, video, and images are likely to be among the first beneficiaries, since there's already a certain amount of "fudge" when you start moving from raw data into compressed formats for faster delivery and to conserve bandwidth.
I'm willing to bet the ARM crowd -- already a bunch of roll-your-own types -- will be having discussions on how best to incorporate energy-efficient DSP processing on chip as a first step, followed by a more in-depth examination of the baseline ARM design to see if it can be "pruned" to deliver better performance-per-watt in exchange for the loss of a decimal place or two in overall accuracy.