Neuromorphic Engineering Book
  • Welcome
  • Preliminaries
    • About the author
    • Preface
    • A tale about passion and fear
    • Before we begin
  • I. Introduction
    • 1. Introducing the perspective of the scientist
      • From the neuron doctrine to emergent behavior
      • Brain modeling
      • Take away lessons
    • 2. Introducing the perspective of the computer architect
      • Limits of integrated circuits
      • Emerging computing paradigms
      • Brain-inspired hardware
      • Take away lessons
      • Errata
    • 3. Introducing the perspective of the algorithm designer
      • From artificial to spiking neural networks
      • Neuromorphic software development
      • Take away lessons
  • II. Scientist perspective
    • 4. Biological description of neuronal dynamics
      • Potentials, spikes and power estimation
      • Take away lessons
      • Errata
    • 5. Models of point neuronal dynamics
      • Tutorial - models of point neuronal processes
        • The leaky integrate and fire model
        • The Izhikevich neuron model
        • The Hodgkin-Huxley neuron model
      • Synapse modeling and point neurons
      • Case study: a SNN for perceptual filling-in
      • Take away lessons
    • 6. Models of morphologically detailed neurons
      • Morphologically detailed modeling
      • The cable equation
      • The compartmental model
      • Case study: direction-selective SAC
      • Take away lessons
    • 7. Models of network dynamics and learning
      • Circuit taxonomy, reconstruction, and simulation
      • Case study: SACs' lateral inhibition in direction selectivity
      • Neuromorphic and biological learning
      • Take away lessons
      • Errata
  • III. Architect perspective
    • 8. Neuromorphic Hardware
      • Transistors and micro-power circuitry
      • The silicon neuron
      • Case study: hardware - software co-synthesis
      • Take away lessons
    • 9. Communication and hybrid circuit design
      • Neural architectures
      • Take away lessons
    • 10. In-memory computing with memristors
      • Memristive computing
      • Take away lessons
      • Errata
  • IV. Algorithm designer perspective
    • 11. Introduction to neuromorphic programming
      • Theory and neuromorphic programming
      • Take away lessons
    • 12. The neural engineering framework
      • NEF: Representation
      • NEF: Transformation
      • NEF: Dynamics
      • Case study: motion detection using oscillation interference
      • Take away lessons
      • Errata
    • 13. Learning spiking neural networks
      • Learning with SNN
      • Take away lessons

7. Models of network dynamics and learning

Neurons communicate with each other through vastly distributed synaptic connections; a single neuron may be connected to roughly 1,000 other neurons, giving rise to extraordinarily complex dynamics. Chapter 1.1 described one of the most important paradigm shifts in neuroscience: the advancement from the neuron doctrine to neural networks. In neural networks, function emerges from the joint activation patterns of interconnected neurons. These neuron ensembles can generate emergent functional states that cannot be observed by studying the single entities they comprise. Importantly, as described in Section 1.2.2, higher-order structures imposed by the morphological diversity within neuronal types shape emergent network activity. This chapter demonstrates how detailed neural networks are described and simulated, and how networks are utilized for learning.
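As a first intuition for what emergent network dynamics mean in practice, the sketch below simulates a small, randomly connected population of leaky integrate-and-fire neurons (the LIF model from Chapter 5) and reports its mean population firing rate. This is a minimal illustration rather than code from the book: the network size, connection probability, weight scale, and other parameters are illustrative assumptions, and only NumPy is used.

```python
# Minimal sketch: a randomly connected network of LIF neurons whose joint
# spiking activity exhibits dynamics that no single neuron shows in isolation.
# All parameter values below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_neurons = 200          # network size
p_connect = 0.1          # connection probability -> each neuron receives ~20 inputs
dt = 1e-3                # simulation time step [s]
t_sim = 0.5              # total simulated time [s]
tau_m = 20e-3            # membrane time constant [s]
v_thresh = 1.0           # spike threshold (normalized units)
v_reset = 0.0            # reset potential
w_scale = 0.12           # synaptic weight scale (mix of excitation and inhibition)

# Random sparse weight matrix: W[i, j] is the synaptic weight from neuron j to neuron i.
W = (rng.random((n_neurons, n_neurons)) < p_connect) * \
    rng.normal(loc=w_scale, scale=w_scale, size=(n_neurons, n_neurons))
np.fill_diagonal(W, 0.0)          # no self-connections

v = rng.random(n_neurons) * v_thresh   # random initial membrane potentials
i_ext = 1.2                            # constant suprathreshold external drive

spike_counts = []
for step in range(int(t_sim / dt)):
    spiked = v >= v_thresh             # neurons crossing threshold this step
    v[spiked] = v_reset
    # Recurrent input: each neuron sums the weighted spikes of its presynaptic partners.
    i_rec = W @ spiked.astype(float)
    # Leaky integration of the external plus recurrent current.
    v += dt / tau_m * (-v + i_ext + i_rec)
    spike_counts.append(spiked.sum())

print("mean population rate:", np.mean(spike_counts) / (n_neurons * dt), "Hz")
```

Even in this toy setting, changing the recurrent weight scale or connection probability qualitatively changes the population activity, which is the kind of network-level behavior this chapter examines with biologically detailed models and dedicated simulators.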
