There have been various observations about the history and trendlines of certain characteristics of computing devices. The most famous is Moore’s Law, which starts off this list of a select few such “laws” (usually named after a specific person):
- Moore’s Law: The number of transistors placed on an integrated circuit that minimizes the cost per transistor doubles about every 2 years.
- Koomey’s Law: The number of computations per joule of energy dissipated in computing systems doubles every 1.57 years.
- Bell’s Law of Computer Classes: Every decade or so, a new class of computing devices emerges based on lower-cost components, which disrupts the existing computing paradigm.
- Kryder’s Law: The areal storage density of magnetic disks, used in hard drives, doubles every year.
- Grosch’s Law: Computer performance increases as the square of the cost.
- Haitz’s Law: Every decade, the cost per lumen of LEDs falls by a factor of 10.
- Rock’s Law: The cost to construct and outfit a semiconductor fabrication plant doubles every 4 years.
- Nielsen’s Law: The network connection speed for high-end home users doubles every 21 months.
- Evan’s Law: Due to automation, biotech costs fall by 3× every year.
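Most of these laws describe exponential growth with a stated doubling time. As a quick illustration (the doubling times are taken from the list above; the code itself is only a sketch), a doubling time can be converted into an equivalent annual growth factor and a per-decade multiplier:

```python
# Illustrative sketch: convert each law's doubling time (in years)
# into an annual growth factor and a 10-year multiplier.
# Doubling times come from the list above.

doubling_times = {
    "Moore (transistors)": 2.0,
    "Koomey (computations/joule)": 1.57,
    "Kryder (areal density)": 1.0,
    "Nielsen (connection speed)": 21 / 12,  # 21 months, in years
}

def annual_growth(doubling_years):
    """Growth factor per year for a quantity that doubles every `doubling_years` years."""
    return 2 ** (1 / doubling_years)

for name, t in doubling_times.items():
    g = annual_growth(t)
    print(f"{name}: x{g:.2f} per year, x{g ** 10:.0f} per decade")
```

For example, a 2-year doubling time (Moore’s Law) works out to roughly 41% growth per year, or a 32× increase per decade, while Kryder’s 1-year doubling compounds to 1024× per decade.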
One site I occasionally check out is Trends in Computing, created and maintained by Rik Blok of UBC, where you can track the exponential progress of computing technology in near real time (on the scale of months).
Edit (March 1, 2015): Added Evan’s Law