MIT chip gives 'sort-of-wrong' answers

A computer chip that performs imprecise calculations can process some types of data thousands of times more efficiently than existing chips.

The discovery arose from an MIT research project into human speech acquisition, which has involved combing through more than 100,000 hours of video for, say, every instance in which a child or one of its caregivers says 'ball', together with all interactions with actual balls.

MIT Media Lab researcher Deb Roy and visiting professor Joseph Bates reasoned that, as algorithms for processing visual data are often error-prone, increasing the margin of error very slightly probably wouldn’t compromise performance too badly.

And, they reckoned, the tradeoff would be the ability to do thousands of computations in parallel.

The team began by evaluating an algorithm used in object-recognition systems to distinguish foreground and background elements in frames of video.

They rewrote it so that every result it produced was raised or lowered by a randomly generated factor of between 0 and 1 percent, then compared its performance with that of the standard algorithm.
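In rough terms the modification looks something like the sketch below (illustrative Python, not the team's actual code; the function name and the use of NumPy are assumptions):

```python
import numpy as np

def perturb(values, max_error=0.01, rng=None):
    """Raise or lower each value by a random factor of up to
    max_error (1 percent by default), mimicking the imprecise
    arithmetic of the proposed chip. Illustrative sketch only."""
    if rng is None:
        rng = np.random.default_rng()
    # Scale factors drawn uniformly from [1 - max_error, 1 + max_error]
    factors = rng.uniform(1 - max_error, 1 + max_error, size=np.shape(values))
    return np.asarray(values) * factors
```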

"The difference between the low-precision and the standard arithmetic was trivial," says graduate student George Shaw. "It was about 14 pixels out of a million, averaged over many, many frames of video. No human could see any of that."

The chip has a thousand cores, each much smaller than a conventional core because it doesn't have to produce perfectly precise results. Each core can communicate only with its immediate neighbors, which saves the cost of shuttling data across the whole chip and makes the design much more efficient.
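A grid of cores limited to nearest-neighbor communication maps naturally onto stencil-style image operations, where each output depends only on nearby inputs. Here is a toy simulation of that communication pattern (the 32-by-32 grid and the averaging step are assumptions, not the chip's actual behavior):

```python
import numpy as np

# Model the chip as a 32 x 32 grid of cores (about a thousand), each
# holding one value. In one round a core may read only from its four
# immediate neighbors -- there is no global communication.
cores = np.random.default_rng(1).uniform(size=(32, 32))

def local_step(grid):
    """One communication round: every core averages its value with its
    north/south/east/west neighbors (edges wrap around). A toy stand-in
    for whatever the real cores would compute."""
    north = np.roll(grid, -1, axis=0)
    south = np.roll(grid, 1, axis=0)
    east = np.roll(grid, 1, axis=1)
    west = np.roll(grid, -1, axis=1)
    return (grid + north + south + east + west) / 5.0

for _ in range(10):  # ten rounds of purely local exchange
    cores = local_step(cores)
```

Keeping every exchange local is what lets the cores shrink and multiply without the wiring overhead of a global bus.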

The new chip design looks particularly well suited to image and video processing. But consultant Bob Colwell - formerly chief architect on several of Intel's Pentium processors - says the most promising application could be in human-computer interaction.

"There’s a lot of places where the machine does a lot of work on your behalf just to get information in and out of the machine suitable for a human being," he says.

"If you put your hand on a mouse, and you move it a little bit, it really doesn’t matter where exactly the mouse is, because you’re in the loop. If you don’t like where the cursor goes, you’ll move it a little more. Real accuracy in the input is really not necessary."

Bates says his chip would work alongside a standard processor, taking on targeted, labour-intensive tasks. But Colwell says that, depending on how it's constructed, there could be some difficulty integrating it with existing technologies - as well as psychological barriers.

"There’s going to be a fair amount of people out in the world that as soon as you tell them I’ve got a facility in my new chip that gives sort-of wrong answers, that’s what they’re going to hear no matter how you describe it," he says. "That’s kind of a non-technical barrier, but it’s real nonetheless."