Neuromorphic Engineering Book
  • Welcome
  • Preliminaries
    • About the author
    • Preface
    • A tale about passion and fear
    • Before we begin
  • I. Introduction
    • 1. Introducing the perspective of the scientist
      • From the neuron doctrine to emergent behavior
      • Brain modeling
      • Take away lessons
    • 2. Introducing the perspective of the computer architect
      • Limits of integrated circuits
      • Emerging computing paradigms
      • Brain-inspired hardware
      • Take away lessons
      • Errata
    • 3. Introducing the perspective of the algorithm designer
      • From artificial to spiking neural networks
      • Neuromorphic software development
      • Take home lessons
  • II. Scientist perspective
    • 4. Biological description of neuronal dynamics
      • Potentials, spikes and power estimation
      • Take away lessons
      • Errata
    • 5. Models of point neuronal dynamics
      • Tutorial - models of point neuronal processes
        • The leaky integrate and fire model
        • The Izhikevich neuron model
        • The Hodgkin-Huxley neuron model
      • Synapse modeling and point neurons
      • Case study: a SNN for perceptual filling-in
      • Take away lessons
    • 6. Models of morphologically detailed neurons
      • Morphologically detailed modeling
      • The cable equation
      • The compartmental model
      • Case study: direction-selective SAC
      • Take away lessons
    • 7. Models of network dynamics and learning
      • Circuit taxonomy, reconstruction, and simulation
      • Case study: SACs' lateral inhibition in direction selectivity
      • Neuromorphic and biological learning
      • Take away lessons
      • Errata
  • III. Architect perspective
    • 8. Neuromorphic Hardware
      • Transistors and micro-power circuitry
      • The silicon neuron
      • Case study: hardware - software co-synthesis
      • Take away lessons
    • 9. Communication and hybrid circuit design
      • Neural architectures
      • Take away lessons
    • 10. In-memory computing with memristors
      • Memristive computing
      • Take away lessons
      • Errata
  • IV. Algorithm designer perspective
    • 11. Introduction to neuromorphic programming
      • Theory and neuromorphic programming
      • Take away lessons
    • 12. The neural engineering framework
      • NEF: Representation
      • NEF: Transformation
      • NEF: Dynamics
      • Case study: motion detection using oscillation interference
      • Take away lessons
      • Errata
    • 13. Learning spiking neural networks
      • Learning with SNN
      • Take away lessons
Take home lessons


Artificial neural network: An interconnected group of nodes, inspired by a mathematically abstracted model of a biological neural network.

Deep neural network: An artificial neural network wherein groups of neurons are organized in interconnected layers.

Convolutional neural network: A deep neural network that incorporates convolutional and pooling layers. Mostly used for learning over matrix-structured inputs (e.g., image frames) in visual processing.

Recurrent neural network: A deep neural network that features feedback (or recursive) connections between neurons. Mostly used for training over temporal sequences.

Biological neural network: A neural network comprising biological neurons connected via electrical or chemical synapses.

Spiking neural network: In contrast to ANNs, which communicate continuous, differentiable values, SNNs propagate discrete spikes. A spiking neuron produces spikes in response to spikes arriving from other neurons. Incoming spikes are filtered, weighted, and summed, and the result is evaluated according to the neuron's dynamics, which dictate the threshold for initiating output spikes.
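The weight-sum-threshold mechanism described above can be sketched as a leaky integrate-and-fire update (one of the point-neuron models from Chapter 5). This is a minimal illustration, not code from the book; the function name and all parameter values are illustrative assumptions.

```python
import numpy as np

def lif_step(v, spikes_in, weights, v_rest=0.0, v_thresh=1.0, tau=20.0, dt=1.0):
    """One Euler step of a leaky integrate-and-fire neuron.

    v         -- current membrane potential
    spikes_in -- binary (0/1) spike vector from presynaptic neurons
    weights   -- synaptic weight per presynaptic neuron
    Illustrative parameters: rest at 0, threshold at 1, time constant 20 ms.
    """
    # Weighted sum of incoming spike events forms the input drive.
    i_in = np.dot(weights, spikes_in)
    # Leaky integration: decay toward rest, plus the weighted input.
    v = v + (dt / tau) * (v_rest - v) + i_in
    # Crossing the threshold emits an output spike and resets the membrane.
    spiked = bool(v >= v_thresh)
    if spiked:
        v = v_rest
    return v, spiked
```

Driving the neuron with the same two input spikes on consecutive steps shows the potential accumulating (with slight leak) until the threshold is crossed and the membrane resets.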

Spiking neurons can be organized in layers to construct deep neural networks, wherein each neuron in one layer is connected to each neuron in the successive layer. Each connection has an adjustable weight.
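The layered organization above can be sketched as spike vectors propagated through fully connected weight matrices. As a deliberate simplification (assumed here, not taken from the book), each neuron is memoryless: it fires if its weighted input crosses a fixed threshold. Layer sizes, weights, and the threshold are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def threshold_layer(spikes_in, weights, v_thresh=0.5):
    """Propagate binary spikes through one fully connected layer:
    each output neuron sums its weighted inputs and fires (1.0)
    if the sum reaches a fixed threshold."""
    drive = weights @ spikes_in
    return (drive >= v_thresh).astype(float)

# Hypothetical sizes: 4 input -> 3 hidden -> 2 output neurons.
# Each connection between successive layers has an adjustable weight.
w1 = rng.uniform(0.0, 1.0, size=(3, 4))
w2 = rng.uniform(0.0, 1.0, size=(2, 3))

x = np.array([1.0, 0.0, 1.0, 0.0])  # input spike pattern
hidden = threshold_layer(x, w1)
out = threshold_layer(hidden, w2)
```

In a full SNN simulation the threshold units would be replaced by stateful spiking neurons, and the weight matrices would be adjusted by a learning rule such as those discussed in Chapter 13.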