9.641J | Spring 2005 | Graduate

Introduction to Neural Networks

Syllabus

Course Meeting Times

Lectures: 2 sessions / week, 1.5 hours / session

Course Philosophy

The subject will focus on basic mathematical concepts for understanding nonlinearity and feedback in neural networks, with examples drawn from both neurobiology and computer science. Most of the subject is devoted to recurrent networks, because recurrent feedback loops dominate the synaptic connectivity of the brain. There will be some discussion of statistical pattern recognition, but less than in the past, because this perspective is now covered in Machine Learning and Neural Networks. Instead, the connections to dynamical systems theory will be emphasized.

Modern research in theoretical neuroscience can be divided into three categories: cellular biophysics, network dynamics, and statistical analysis of neurobiological data. This subject is about the dynamics of networks; it excludes the biophysics of single neurons, which is taught in 9.29J, Introduction to Computational Neuroscience.

Prerequisites

  • Permission of the instructor
  • Familiarity with linear algebra, multivariate calculus, and probability theory
  • Knowledge of a programming language (MATLAB® recommended)

Course Requirements

  • Problem sets
  • Midterm exam
  • Final exam

Textbook

The following text is recommended:

Hertz, John, Anders Krogh, and Richard G. Palmer. Introduction to the Theory of Neural Computation. Redwood City, CA: Addison-Wesley Pub. Co., 1991. ISBN: 9780201515602.
