Gradient expectations : structure, origins, and synthesis of predictive neural networks / Keith L. Downing.

An insightful investigation into the mechanisms underlying the predictive functions of neural networks--and their ability to chart a new path for AI. Prediction is a cognitive advantage like few others, inherently linked to our ability to survive and thrive. Our brains are awash in signals that embo...


Bibliographic Details
Place / Publishing House: Cambridge, MA : The MIT Press, 2023
Year of Publication: 2023
Edition: 1st ed.
Language: English
Series: The MIT Press
Physical Description: 1 online resource (280 pages).
Table of Contents:
  • Intro
  • Title Page
  • Copyright Page
  • Dedication
  • Table of Contents
  • Preface
  • Acknowledgments
  • 1. Introduction
  • 1.1. Data from Predictions
  • 1.2. Movement and Prediction
  • 1.3. Adaptation and Emergence
  • 1.3.1. Gradients and Emergence in Neural Networks
  • 1.4. Overflowing Expectations
  • 2. Conceptual Foundations of Prediction
  • 2.1. Compare and Err
  • 2.2. Guesses and Goals
  • 2.3. Gradients
  • 2.3.1. Gradients Rising
  • 2.4. Sequences
  • 2.5. Abstracting by Averaging
  • 2.6. Control and Prediction
  • 2.7. Predictive Coding
  • 2.8. Tracking Marr's Tiers
  • 3. Biological Foundations of Prediction
  • 3.1. Gradient-Following Bacteria
  • 3.2. Neural Motifs for Gradient Calculation
  • 3.3. Birth of a PID Controller
  • 3.3.1. Adaptive Control in the Cerebellum
  • 3.4. Detectors and Generators
  • 3.4.1. The Hippocampus
  • 3.4.2. Conceptual Embedding in the Hippocampus
  • 3.5. Gradients of Predictions in the Basal Ganglia
  • 3.6. Procedural versus Declarative Prediction
  • 3.7. Rampant Expectations
  • 4. Neural Energy Networks
  • 4.1. Energetic Basis of Learning and Prediction
  • 4.2. Energy Landscapes and Gradients
  • 4.3. The Boltzmann Machine
  • 4.4. The Restricted Boltzmann Machine (RBM)
  • 4.5. Free Energy
  • 4.5.1. Variational Free Energy
  • 4.6. The Helmholtz Machine
  • 4.7. The Free Energy Principle
  • 4.8. Getting a Grip
  • 5. Predictive Coding
  • 5.1. Information Theory and Perception
  • 5.2. Predictive Coding on High
  • 5.2.1. Learning Proper Predictions
  • 5.3. Predictive Coding for Machine Learning
  • 5.3.1. The Backpropagation Algorithm
  • 5.3.2. Backpropagation via Predictive Coding
  • 5.4. In Theory
  • 6. Emergence of Predictive Networks
  • 6.1. Facilitated Variation
  • 6.2. Origins of Sensorimotor Activity
  • 6.2.1. Origins of Oscillations
  • 6.2.2. Activity Regulation in the Brain
  • 6.2.3. Competition and Cooperation in Brain Development
  • 6.2.4. Layers and Modules
  • 6.2.5. Running through the Woods on an Icy Evening
  • 6.2.6. Oscillations and Learning
  • 6.3. A Brief Evolutionary History of the Predictive Brain
  • 7. Evolving Artificial Predictive Networks
  • 7.1. I'm a Doctor, Not a Connectionist
  • 7.2. Evolving Artificial Neural Networks (EANNs)
  • 7.2.1. Reconciling EANNs with Deep Learning
  • 7.3. Evolving Predictive Coding Networks
  • 7.3.1. Preserving Backpropagation in a Local Form
  • 7.3.2. Phylogenetic, Ontogenetic, and Epigenetic (POE)
  • 7.4. Continuous Time Recurrent Neural Networks (CTRNNs)
  • 7.4.1. Evolving Minimally Cognitive Agents
  • 7.4.2. Cognitive Robots Using Predictive Coding
  • 7.4.3. Toward More Emergent CTRNNs
  • 7.5. Predictive POE Networks
  • 7.5.1. Simulating Neural Selectionism and Constructivism
  • 7.5.2. Predictive Constructivism
  • 7.5.3. The D'Arcy Model
  • 7.5.4. Neurites to Neurons in D'Arcy
  • 7.5.5. Peripherals in D'Arcy
  • 7.5.6. Neuromodulators in D'Arcy
  • 7.5.7. Predictively Unpredictable
  • 7.6. Most Useful and Excellent Designs
  • 8. Conclusion
  • 8.1. Schrödinger's Frozen Duck
  • 8.2. Expectations Great and Small
  • 8.3. As Expected
  • 8.4. Gradient Expectations
  • 8.5. Expecting the Unexpected
  • References
  • Index.