Takeaway lessons
Transistor: A nano-scale switch for electrons, in which electrons flow from the transistor's source to its drain under the control of its gate voltage.
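The switching behavior can be sketched with the textbook long-channel (square-law) NMOS model; the model, threshold voltage, and transconductance constant below are illustrative assumptions, not taken from the text:

```python
def nmos_drain_current(v_gs, v_ds, v_th=0.5, k=2e-4):
    """Drain current (A) of a long-channel NMOS under the square-law model.

    v_gs, v_ds: gate-source and drain-source voltages (V).
    v_th, k: illustrative threshold voltage (V) and transconductance (A/V^2).
    """
    if v_gs <= v_th:
        return 0.0                        # cutoff: the switch is "off"
    v_ov = v_gs - v_th                    # overdrive voltage
    if v_ds < v_ov:                       # triode (linear) region
        return k * (v_ov * v_ds - v_ds**2 / 2)
    return 0.5 * k * v_ov**2              # saturation region

print(nmos_drain_current(0.3, 1.0))       # below threshold: no current
print(nmos_drain_current(1.0, 1.0))       # above threshold: conducts
```

Raising the gate voltage above the threshold turns the channel on, which is what makes the transistor usable as a digital switch.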
Moore's Law: Anticipating exponential growth in computing power, with performance roughly doubling every two years.
The demise of Moore's Law. Consider computing performance measured year-to-year from 1978 to 2018, relative to the VAX-11/780 computer, using standard numerical computation benchmarks. From 1986 to 2002, computing power grew exponentially at an annual rate of 52%. Since 2003, single-processor performance has slowed, growing at a rate of only 25% a year. From 2011 to 2015, the annual improvement was 12%, and since 2015, with the demise of Moore's Law, the recorded improvement rate slowed to 3.5%. Overall, from doubling performance every two years during the 1980s, we now double performance only every 20 years.
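The quoted annual rates translate directly into doubling times via T = ln(2) / ln(1 + r); a short sketch (the era boundaries follow the figures above):

```python
import math

def doubling_time_years(annual_rate):
    """Years needed to double performance at a compound annual growth rate."""
    return math.log(2) / math.log(1 + annual_rate)

# Annual improvement rates quoted in the text, by era.
eras = [("1986-2002", 0.52), ("2003-2010", 0.25),
        ("2011-2015", 0.12), ("since 2015", 0.035)]
for era, rate in eras:
    print(f"{era}: doubling every ~{doubling_time_years(rate):.1f} years")
```

At 52% per year performance doubles in under two years, while at 3.5% per year it takes roughly 20 years, matching the closing observation above.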
Clock rate and power density. Traditional computing architectures took the fast path to improvement by scaling up existing designs, resulting in ever-increasing power densities and clock frequencies. The brain, optimized over many years of evolution, operates at a much slower rate and a far lower power density while offering comparable computing performance.
Error probability and signal energy tradeoff. While a digital computer invests 220 kT in each signal it generates, the brain invests only 1.72 kT. This results in an error probability of 10⁻²⁴ for the computer and 0.65 for the brain. When signal energy is less than 3 kT, the signal carries less than 1 bit of information. Here kT denotes the thermal energy scale (Boltzmann's constant times absolute temperature), which sets the noise floor.
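The two error probabilities quoted above are consistent with a simple thermal-noise model in which p_err ≈ exp(−E/4kT); this formula is an assumption used here for illustration, not one the text states:

```python
import math

def error_probability(signal_energy_kt):
    """Bit-error probability for a signal of the given energy, in units of kT,
    under the assumed thermal-noise model p ~ exp(-E / 4kT)."""
    return math.exp(-signal_energy_kt / 4.0)

print(f"computer (220 kT):  {error_probability(220):.1e}")   # ~1.3e-24
print(f"brain   (1.72 kT):  {error_probability(1.72):.2f}")  # ~0.65
```

Under this model, investing 220 kT buys essentially error-free signaling, while the brain's 1.72 kT per signal tolerates a 65% per-signal error rate and relies on redundancy instead.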
Energy efficiency and signal energy tradeoff. While a digital computer invests 220 kT in each signal it generates, the brain invests only 1.72 kT. This results in an energy efficiency of 0.26 for the brain and 0.004 for the computer; the brain is therefore 59× more efficient than the computer. Here kT denotes the thermal energy scale (Boltzmann's constant times absolute temperature).
Energy and precision. While energy consumption for analog signals scales quadratically with precision, it scales logarithmically for digital signals. A neuromorphic hybrid analog-digital computer is argued to achieve linear scaling, offering superior performance across five orders of magnitude of precision.
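The crossover between the two scaling regimes can be sketched numerically; the constants below are illustrative assumptions chosen only to show the quadratic-versus-logarithmic shape, not measured values from the text:

```python
import math

def analog_energy(precision):
    """Relative energy per operation for an analog signal: quadratic in
    precision, since higher SNR demands quadratically more signal power."""
    return precision ** 2

def digital_energy(precision):
    """Relative energy per operation for a digital signal: proportional to
    the number of bits, i.e. logarithmic in precision. The per-bit cost
    (50.0) is an assumed constant reflecting digital overhead."""
    return 50.0 * math.log2(precision)

for p in [2, 16, 256, 4096]:
    print(f"precision {p:5}: analog {analog_energy(p):10.0f}  "
          f"digital {digital_energy(p):6.0f}")
```

At low precision the analog signal is cheaper; at high precision the digital one wins. A hybrid design aims to track the lower envelope of the two curves across the whole precision range.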
von Neumann architecture: A prominent computer architecture in which the processor is separated from memory, creating the "memory wall": performance is limited by data transfer between the two.
In-memory computing: A computing architecture in which memory, computation, and communication are tightly integrated and distributed over a communication fabric.