Neuromorphic Engineering Book
  • Welcome
  • Preliminaries
    • About the author
    • Preface
    • A tale about passion and fear
    • Before we begin
  • I. Introduction
    • 1. Introducing the perspective of the scientist
      • From the neuron doctrine to emergent behavior
      • Brain modeling
      • Take away lessons
    • 2. Introducing the perspective of the computer architect
      • Limits of integrated circuits
      • Emerging computing paradigms
      • Brain-inspired hardware
      • Take away lessons
      • Errata
    • 3. Introducing the perspective of the algorithm designer
      • From artificial to spiking neural networks
      • Neuromorphic software development
      • Take home lessons
  • II. Scientist perspective
    • 4. Biological description of neuronal dynamics
      • Potentials, spikes and power estimation
      • Take away lessons
      • Errata
    • 5. Models of point neuronal dynamics
      • Tutorial - models of point neuronal processes
        • The leaky integrate and fire model
        • The Izhikevich neuron model
        • The Hodgkin-Huxley neuron model
      • Synapse modeling and point neurons
      • Case study: an SNN for perceptual filling-in
      • Take away lessons
    • 6. Models of morphologically detailed neurons
      • Morphologically detailed modeling
      • The cable equation
      • The compartmental model
      • Case study: direction-selective SAC
      • Take away lessons
    • 7. Models of network dynamics and learning
      • Circuit taxonomy, reconstruction, and simulation
      • Case study: SACs' lateral inhibition in direction selectivity
      • Neuromorphic and biological learning
      • Take away lessons
      • Errata
  • III. Architect perspective
    • 8. Neuromorphic Hardware
      • Transistors and micro-power circuitry
      • The silicon neuron
      • Case study: hardware - software co-synthesis
      • Take away lessons
    • 9. Communication and hybrid circuit design
      • Neural architectures
      • Take away lessons
    • 10. In-memory computing with memristors
      • Memristive computing
      • Take away lessons
      • Errata
  • IV. Algorithm designer perspective
    • 11. Introduction to neuromorphic programming
      • Theory and neuromorphic programming
      • Take away lessons
    • 12. The neural engineering framework
      • NEF: Representation
      • NEF: Transformation
      • NEF: Dynamics
      • Case study: motion detection using oscillation interference
      • Take away lessons
      • Errata
    • 13. Learning spiking neural networks
      • Learning with SNN
      • Take away lessons

Take away lessons

Transistor: a nano-scale electronic switch in which the current flowing between the transistor's drain and source terminals is controlled by the voltage on its gate.
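
To make the gate's control over the drain-source current concrete, the sketch below evaluates a standard subthreshold (weak-inversion) MOSFET model, the operating regime favored in micro-power neuromorphic circuits; the parameter values are illustrative and not taken from the book.

```python
import math

# Standard weak-inversion (subthreshold) MOSFET model: the drain current grows
# exponentially with the gate-source voltage. All parameter values are
# illustrative examples, not figures from the book.
I0 = 1e-12      # pre-exponential leakage-scale current (A), illustrative
n = 1.5         # subthreshold slope factor, illustrative
U_T = 0.026     # thermal voltage kT/q at room temperature (V)

def drain_current(v_gs):
    """Subthreshold drain current (A) for a gate-source voltage v_gs (V)."""
    return I0 * math.exp(v_gs / (n * U_T))

for v_gs in (0.0, 0.1, 0.2, 0.3):
    print(f"V_gs = {v_gs:.1f} V  ->  I_d ≈ {drain_current(v_gs):.2e} A")
```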

Moore's Law: the anticipation of exponential growth in computing power at an annual rate of roughly 50%, i.e., performance doubling about every two years.

The demise of Moore's Law. Computing performance, measured year by year from 1978 to 2018 relative to the VAX 11/780 using standard numerical-computation benchmarks, tells the story. From 1986 to 2002 computing power grew exponentially at an annual rate of 52%. Since 2003, single-processor performance has slowed, growing at only 25% a year. From 2011 to 2015 the annual improvement was 12%, and since 2015, with the demise of Moore's Law, the recorded improvement rate has dropped to 3.5%. Overall, from doubling performance every two years during the 1980s, we are currently doubling performance only every 20 years.
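
To relate these annual growth rates to doubling times, a short back-of-the-envelope calculation (rates taken from the paragraph above):

```python
import math

def doubling_time_years(annual_rate):
    """Years needed to double performance at a given annual growth rate."""
    return math.log(2) / math.log(1 + annual_rate)

# Annual growth rates quoted above for the successive eras.
for era, rate in [("1986-2002", 0.52), ("2003-2010", 0.25),
                  ("2011-2015", 0.12), ("since 2015", 0.035)]:
    print(f"{era}: {rate:.1%}/year -> doubling every "
          f"{doubling_time_years(rate):.1f} years")
```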

Clock rate and power density. Traditional computing architectures took the fastest route to improvement by scaling up existing designs, driving clock frequencies and power densities ever higher. The brain, optimized over many years of evolution, operates at a far lower rate and power density while offering comparable computing performance.
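
The pressure on power density can be seen from the standard expression for dynamic switching power in CMOS, P ≈ αCV²f: at fixed supply voltage and switched capacitance, power grows linearly with clock frequency. The numbers below are illustrative only.

```python
# Dynamic (switching) power of CMOS logic: P ≈ alpha * C * V^2 * f.
# All parameter values are illustrative, not figures from the book.
def dynamic_power(alpha, c_switched, v_dd, f_clk):
    """Switching power (W) from activity factor, switched capacitance (F),
    supply voltage (V), and clock frequency (Hz)."""
    return alpha * c_switched * v_dd ** 2 * f_clk

p_1ghz = dynamic_power(alpha=0.1, c_switched=1e-9, v_dd=1.0, f_clk=1e9)
p_4ghz = dynamic_power(alpha=0.1, c_switched=1e-9, v_dd=1.0, f_clk=4e9)
print(f"1 GHz: {p_1ghz:.2f} W   4 GHz: {p_4ghz:.2f} W")
```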

Error probability and signal energy tradeoff. While a digital computer invests about 220 kT in each signal it generates, the brain invests only about 1.72 kT. The result is an error probability of roughly 10⁻²⁴ for the computer versus 0.65 for the brain. When the signal energy drops below about 3 kT, a signal carries less than 1 bit of information. Here kT denotes the thermal energy (Boltzmann's constant times absolute temperature), which sets the thermal-noise floor.
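
The two quoted error probabilities are consistent with an error rate that falls off exponentially with signal energy, p ≈ exp(−E_s / 4kT). This relation is an assumption used here only to reproduce the quoted figures, not a formula stated in this summary.

```python
import math

def error_probability(signal_energy_in_kT):
    """Assumed relation p ≈ exp(-E_s / 4kT), chosen to match the quoted figures."""
    return math.exp(-signal_energy_in_kT / 4.0)

print(f"computer (220 kT):  p ≈ {error_probability(220):.1e}")   # ~1e-24
print(f"brain    (1.72 kT): p ≈ {error_probability(1.72):.2f}")  # ~0.65
```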

Energy efficiency and signal energy tradeoff. With the same energy investments (220 kT per signal for the digital computer, 1.72 kT for the brain), the resulting energy efficiency is 0.26 for the brain and 0.004 for the computer, making the brain roughly 59× more efficient than the computer.

Energy and precision. While the energy consumption of analog signaling scales quadratically with precision, it scales only logarithmically for digital signaling. A neuromorphic hybrid analog-digital computer is argued to achieve linear scaling, giving it superior performance over five scales of precision.
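
The three scaling regimes can be illustrated by expressing precision as a signal-to-noise ratio (SNR ≈ 2^bits) and comparing the energy trends; the proportionality constants below are arbitrary, and reading "precision" as SNR is an interpretation, not something stated in this summary.

```python
# Illustrative energy-versus-precision trends, with precision expressed as a
# signal-to-noise ratio (SNR ~ 2**bits). Proportionality constants are arbitrary.
def analog_energy(bits):    # quadratic in SNR
    return (2 ** bits) ** 2

def digital_energy(bits):   # logarithmic in SNR, i.e. linear in bits
    return bits

def hybrid_energy(bits):    # argued to scale linearly with SNR
    return 2 ** bits

for bits in (2, 4, 8, 12):
    print(f"{bits:2d} bits: analog ∝ {analog_energy(bits):.0e}, "
          f"digital ∝ {digital_energy(bits)}, hybrid ∝ {hybrid_energy(bits):.0e}")
```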

von Neumann architecture: A prominent computer architecture in which the processor is separated from memory, creating a memory wall.
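
A back-of-the-envelope illustration of why this separation becomes a bottleneck: if every operand must cross the processor-memory boundary, throughput is limited by memory bandwidth rather than by raw compute. All figures below are made up for illustration.

```python
# Memory-wall estimate with illustrative (made-up) numbers.
peak_ops_per_s = 1e12      # raw compute capability: 1 Tera-op/s
mem_bandwidth = 100e9      # processor-memory bandwidth: 100 GB/s
bytes_per_value = 8        # double-precision operands

# An operation that reads two operands from memory and writes one result back:
bytes_moved_per_op = 3 * bytes_per_value
memory_limited_rate = mem_bandwidth / bytes_moved_per_op

print(f"peak compute rate:    {peak_ops_per_s:.1e} ops/s")
print(f"memory-limited rate:  {memory_limited_rate:.1e} ops/s "
      f"({memory_limited_rate / peak_ops_per_s:.1%} of peak)")
```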

In-memory computing: A computing architecture in which memory, computation, and communication are tightly integrated and distributed over a communication fabric.
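
One concrete instance of this idea, developed further in the memristor chapter, is a resistive crossbar that performs a matrix-vector multiplication in place: input voltages are applied along one set of lines and, by Ohm's and Kirchhoff's laws, the currents summed along the other set equal the product of the stored conductance matrix with the voltage vector. A minimal sketch with made-up conductance values:

```python
import numpy as np

# Minimal model of an in-memory (crossbar) matrix-vector multiplication.
# Conductances and voltages are illustrative values only.
G = np.array([[1.0, 0.5, 0.2],     # conductance (S) stored at each crosspoint
              [0.3, 0.8, 0.1]])
V = np.array([0.2, 0.1, 0.4])      # voltages applied to the input lines

# Ohm's law at each device plus Kirchhoff's current law on each output line:
# the vector of summed output currents is exactly G @ V.
I = G @ V
print("output currents (A):", I)
```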