Boffins take the brain out of the computer

Scientists have confirmed that one of the best ways for a computer system to save cash on energy is to move to approximate computing.

Purdue University has been conducting a study into "approximate computing", which aims to perform calculations that are merely good enough for tasks that don't require perfect accuracy. Allowing results to be rounded up or down can roughly double efficiency and reduce energy consumption.
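As a rough illustration of the trade-off (a sketch of the general idea, not Purdue's actual hardware), discarding low-order bits before arithmetic lets much smaller circuits do the work, at a small cost in accuracy:

    # Illustrative sketch only -- not Purdue's implementation.
    # Discarding low-order bits before a multiply means a much narrower
    # (cheaper, lower-energy) multiplier could do the work in hardware.
    def approx_multiply(a: int, b: int, dropped_bits: int = 8) -> int:
        """Multiply two integers after discarding their low-order bits."""
        a_short = a >> dropped_bits
        b_short = b >> dropped_bits
        # Shift back to restore the scale; the dropped precision is gone.
        return (a_short * b_short) << (2 * dropped_bits)

    exact = 123456 * 654321
    approx = approx_multiply(123456, 654321)
    print(f"exact={exact}  approx={approx}  error={abs(exact - approx) / exact:.3%}")

Here the answer comes out less than 0.1 per cent off, while the multiplier that produced it would be a fraction of the size.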

Anand Raghunathan, a professor of electrical and computer engineering at Purdue, said computers were first designed as precise calculators, solving problems where they were expected to produce an exact numerical answer.

Mobile and embedded devices now need to process richer media and are getting smarter: more context-aware, with more natural user interfaces.

Most of this software is designed to tolerate "noisy" real-world inputs and uses statistical or probabilistic computations.

Most of these computations do not need a precise answer. In some cases there is no single golden answer; in others you are simply trying to produce results of acceptable quality, so you don't need to be perfect.

Yet today's computers are designed to compute precise results even when they are not necessary.

An example given is that if you were asked to divide 500 by 21 and then asked whether the answer was greater than one, you could say yes without doing any of the detailed maths.
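In code, that shortcut amounts to skipping the division entirely. A minimal sketch (our example, not the researchers'):

    # Answer "is 500/21 greater than one?" without dividing.
    # For positive numbers, a/b > 1 exactly when a > b, so a single
    # cheap comparison replaces a full division.
    def quotient_greater_than_one(a: float, b: float) -> bool:
        assert a > 0 and b > 0
        return a > b

    print(quotient_greater_than_one(500, 21))  # True -- no division done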

Purdue researchers have developed a range of hardware techniques to demonstrate approximate computing, showing the potential for significant improvements in energy efficiency.

The researchers have shown how to apply approximate computing to programmable processors. They have also changed the "instruction set," which is the interface between software and hardware.

The trick is to use "quality fields" to tell the hardware the level of accuracy needed for a given task. They have created a prototype programmable processor called Quora based on this approach.
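The encoding itself isn't published in this article, so the sketch below invents a quality field of its own purely for illustration: the field tells an add operation how many low-order bits it may ignore, which in silicon would mean a shorter carry chain and less energy.

    # Hypothetical "quality field" sketch -- the field values and behaviour
    # are invented here; Quora's real instruction set is not shown in this
    # article.
    def quality_add(a: int, b: int, quality: int) -> int:
        """Add two integers under a 0-3 quality field.

        quality=3 is an exact add; each lower level lets the hardware
        ignore four more low-order bits of each operand.
        """
        ignored_bits = (3 - quality) * 4       # 0, 4, 8 or 12 bits dropped
        mask = ~((1 << ignored_bits) - 1)      # clear the ignored bits
        return (a & mask) + (b & mask)

    for q in range(3, -1, -1):
        print(f"quality={q}: 100000 + 234567 ~= {quality_add(100000, 234567, q)}")

The same instruction can then run exactly when the software demands it and sloppily, but cheaply, when it doesn't.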

In other recent work, the Purdue team came up with an approximate "accelerator" for recognition and data mining.

Apparently the technology is nearly there, more or less.