Boffins have managed to prove that the energy efficiency of computers doubles roughly every 18 months.
Jonathan Koomey, consulting professor of civil and environmental engineering at Stanford University, has more data supporting his law than you can poke a stick at.
Koomey's Law mirrors Moore's Law, which in its popular form states that computer processing power doubles about every 18 months. However, Koomey thinks his law is a little more important as battery-powered devices start to spread.
Koomey said the idea is that, at a fixed computing load, the amount of battery you need falls by a factor of two every year and a half. More mobile computing and sensing applications become possible as energy efficiency continues its steady improvement.
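The arithmetic of that doubling can be sketched in a few lines of Python. This is an illustrative back-of-the-envelope model, not anything from Koomey's study; the 18-month period is the only figure taken from the article.

```python
# Sketch of Koomey's Law: computations per unit of energy doubles
# roughly every 18 months, so the battery needed for a fixed
# computing load halves on the same schedule.

DOUBLING_PERIOD_MONTHS = 18

def efficiency_gain(months: float) -> float:
    """Factor by which energy efficiency improves after `months`."""
    return 2 ** (months / DOUBLING_PERIOD_MONTHS)

def battery_needed(initial_capacity: float, months: float) -> float:
    """Battery capacity needed for a fixed computing load after `months`."""
    return initial_capacity / efficiency_gain(months)

# After 18 months, efficiency has doubled and battery need has halved;
# after a decade (120 months) efficiency is up by 2**(120/18), roughly 100x.
print(efficiency_gain(18))        # 2.0
print(battery_needed(100, 36))    # 25.0
```

Run forward a decade, the compounding is what makes the mobile story interesting: a hundredfold efficiency gain at constant workload.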
His research, which was conducted in collaboration with Intel and Microsoft, looked at the peak power consumption of electronic computing devices since the construction of the Electronic Numerical Integrator and Computer (ENIAC) in 1946.
ENIAC was used to calculate artillery firing tables for the U.S. Army, and it could perform a few hundred calculations per second. It took up 1,800 square feet, and consumed 150 kilowatts of power.
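Those figures put ENIAC's efficiency in perspective. The quick calculation below assumes 300 calculations per second as a midpoint for "a few hundred"; that number is an assumption, not a measured figure.

```python
# Rough arithmetic on the ENIAC figures above.
CALCS_PER_SECOND = 300   # assumed midpoint of "a few hundred"
POWER_KW = 150           # ENIAC's stated power draw

SECONDS_PER_HOUR = 3600
calcs_per_kwh = CALCS_PER_SECOND * SECONDS_PER_HOUR / POWER_KW
print(f"{calcs_per_kwh:,.0f} calculations per kilowatt-hour")  # 7,200
```

By that estimate, a kilowatt-hour bought ENIAC a few thousand calculations; a modern battery-powered device gets many orders of magnitude more from the same energy.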
Even before transistors, Koomey says, energy efficiency doubled every 18 months. He thinks that his law is a characteristic of information technology that uses electrons for switching.
Engineering considerations that improve computer performance, such as reducing component size, capacitance, and the communication time between components, also improve energy efficiency, Koomey says.