Introduction to Neural Networks

Psy 5038, Daniel Kersten

Psychology Department, University of Minnesota

Spring 1999

Introduction to large-scale parallel distributed processing models in neural and cognitive science. Topics include linear models, Hebbian rules, self-organization, non-linear models, information optimization, and the representation of neural information. Applications to sensory processing, perception, learning, and memory.

4 cr; The formal prerequisites are: Math 3261, Psy 3031 or 5061, or #.

The most important prerequisite is some background in linear algebra (vectors and matrices) and calculus. It is OK if your linear algebra is rusty, because we will review the basic facts that we need.

The class will meet in the Eddy Hall Annex computer lab, Room 62, rather than in the originally scheduled classroom.

This year's material (Spring 1999):

Syllabus 1999
Lecture notes Spring 1999
Programming assignments 1999

Last year's material (1997-98):

Lecture notes Spring 1998
Programming assignments 1998

Some links:

Mathematica: Wolfram Research Home page (Mathematica) | Calculus & Mathematica
Neural network sites: David MacKay's home page | Gatsby Computational Neuroscience Unit
Neural networks frequently asked questions: NNs FAQ site
Vision-related sites: Illusion Works homepage | Dictionary of Vision terms: Visionary


Questions?
Contact:
Prof. Daniel Kersten, 211 Elliott Hall
Phone: 625-2589
Email: kersten@eye.psych.umn.edu


Kersten Lab | Vision Lab | Psychology Department | University of Minnesota

© 1998 Computational Vision Lab, University of Minnesota, Department of Psychology.