September, 2003

Introduction to Neural Networks
(Lecture Notes)

Psy 5038W, Fall 2003, 3 credits
Psychology Department, University of Minnesota

Place: 121 Elliott Hall (Computer Lab)
Time: 12:00-1:30 TTh


Course home page:

courses.kersten.org

Instructor: Daniel Kersten Office: 212 Elliott Hall Phone: 625-2589 email: kersten@umn.edu
Office hours: Thursday 1:30-2:30 or by appointment.

TA: Bruce Hartung Office: N13 Elliott (must call 625-1337 for access) email: hartung@cs.umn.edu
Office hours: Tuesday 1:15-2:15 or by appointment.

Course description. Introduction to large-scale parallel distributed processing models in neural and cognitive science. Topics include: linear models, statistical pattern theory, Hebbian rules, self-organization, non-linear models, information optimization, and representation of neural information. Applications to sensory processing, perception, learning, and memory.

Readings

Grade Requirements

There will be a mid-term examination, a final examination, programming assignments, and a final project. Grade weights for the examinations and the final project components are noted in the schedule below.

The programming assignments will use the Mathematica programming environment. No prior experience with Mathematica is necessary. See the List of Computer Labs at the University of Minnesota.
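To give a rough sense of the kind of code the problem sets involve, here is a minimal, purely illustrative Mathematica sketch (not taken from any assignment; the names squash, w, and x are made up for the example) of a generic neuron: the dot product of a weight vector with an input vector, passed through a sigmoidal squashing function.

    (* A sigmoidal "squashing" function maps the summed input to a bounded response *)
    squash[s_] := 1/(1 + Exp[-s])

    (* Example weight and input vectors *)
    w = {0.5, -0.3, 0.8};
    x = {1.0, 0.2, 0.7};

    (* The neuron's response: squash the dot product of weights and inputs *)
    output = squash[w . x]    (* evaluates to about 0.73 *)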

Assignments are due BEFORE class start time (12:00 pm) on the day they are due. You can use the downloaded Mathematica notebook for the assignment as your template, add your answers, and email your finished assignment to the TA. You can copy and paste any code bits you need from the Lecture notebooks, but of course you cannot copy and paste code or any other answer materials from someone else.


Outline & Lecture Notes

(Revised lecture material will be posted on the day given; if you want a preview, check out the lectures from 2001.)


All lecture notes are in Mathematica Notebook and pdf format. You can download the Mathematica notebook files below to view with Mathematica or MathReader 4 (which is free).

 

(Each entry gives the lecture number, date, and lecture notes, followed by readings & supplementary material and any assignment due that day.)

I.

1. Sep 2: Introduction (pdf file) | Mathematica notebook
   Readings: Mathematica intro.nb; Neuroscience tutorial (Clinical, Wash. U.); Anderson: Intro. & Chapters 1, 2

2. Sep 4: The neuron (pdf file) | Mathematica notebook
   Readings: Koch & Segev, 2000 (pdf); Meunier & Segev, 2002 (pdf)

3. Sep 9: Neural Models, McCulloch-Pitts (pdf file) | Mathematica notebook
   Readings: Jordan, M. I., & Bishop, C. (1996). Neural networks. MIT Artificial Intelligence Lab Memo 1562; Anderson: Chapters 3, 4

4. Sep 11: Generic neuron model (pdf file) | Mathematica notebook

II.

5. Sep 16: Lateral inhibition (pdf file) | Mathematica notebook
   Readings: Anderson: Chapters 5, 6 & 7
   Due: PS 1. Introduction to Mathematica, vectors, cross-correlation (pdf file)

6. Sep 18: Matrices (pdf file) | Mathematica notebook

7. Sep 23: Learning & Memory (pdf file) | Mathematica notebook
   Readings: Anderson: Chapter 8

III.

8. Sep 25: Linear Associator (pdf file) | Mathematica notebook
   Supplementary material: einstein32x32.jpg, gw_bush32x32.jpg, shannon32x32.jpg, einstein32x32missing.jpg

9. Sep 30: Sampling, Summed vector memory (pdf file) | Mathematica notebook (see the first part of Lecture 23 for a review of probability and statistics)
   Due: PS 2. Lateral inhibition (pdf file)

10. Oct 2: Non-linear networks, Perceptron (pdf file) | Mathematica notebook

11. Oct 7: Regression, Widrow-Hoff (pdf file) | Mathematica notebook
    Readings: Anderson: Chapter 9

12. Oct 9: Multilayer feedforward nets, Backpropagation (pdf file) | Mathematica notebook
    Supplementary material: Backpropagation.m

IV.

13. Oct 14: Science writing (pdf) | Mathematica notebook
    Readings: Gopen & Swan, 1990; Hopfield (1982)
    Due: PS 3. Perceptron (pdf file)

14. Oct 16: MID-TERM
    Supplementary material: MID-TERM STUDY GUIDE
    Due: MID-TERM (16%)

15. Oct 21: Networks and Visual Representation (pdf file) | Mathematica notebook
    Readings: Anderson: Chapters 10, 11

16. Oct 23: Neural Representation and coding (pdf file) | Mathematica notebook

17. Oct 28: Self-organization, Principal Components Analysis and NNs (pdf file) | Mathematica notebook
    Readings: Anderson: Chapter 12 (Supplement: ContingentAdaptation.nb)

18. Oct 30: Discrete Hopfield network (pdf file) | Mathematica notebook

19. Nov 4: Graded response Hopfield network (pdf file) | Mathematica notebook

20. Nov 6: Boltzmann machine (pdf file) | Mathematica notebook
    Due: PS 4. Backprop, Hopfield network (pdf file)

21. Nov 11: Sculpting the energy function, interpolation (pdf file) | Mathematica notebook
    Readings: Anderson: Chapters 13, 14
    Due: Final project title & paragraph outline (2%)

22. Nov 13: Adaptive maps (pdf file) | Mathematica notebook
    Readings: Anderson: Chapters 15, 16
    Supplementary material: smallRetinaCortexMap.nb, GraylefteyeDan.jpg

V.

23. Nov 18: Probability (pdf file) | Mathematica notebook

24. Nov 20: Generative models, Bayes nets and inference (pdf file) | Mathematica notebook

25. Nov 25: Belief Propagation (pdf) | Mathematica notebook

    Nov 27: THANKSGIVING (no class)

26. Dec 2: EM (pdf) | Mathematica notebook
    Due: Complete Draft of Final Project (5%)

27. Dec 4: Bias/Variance (pdf) | Mathematica notebook
    Readings: Bias/Variance notes (pdf)

28. Dec 9: Wrap-up & Review (pdf) | Mathematica notebook
    (Drafts returned)

    Dec 11: FINAL EXAM
    Supplementary material: FINAL STUDY GUIDE
    Due: FINAL EXAM (16%)

    Dec 16: Final Revised Draft of Project (33%)

 


Final Project Assignment.

This course teaches you how to understand cognitive and perceptual aspects of brain processing in terms of computation. Writing a computer program encourages you to think clearly about the assumptions underlying a given theory. Getting a program to work, however, tests just one level of clear thinking. By writing about your work, you will learn to think through the broader implications of your final project, and to effectively communicate the rationale and results in words.

Your final project will involve: 1) a computer simulation; and 2) a 2000-3000 word final paper describing your simulation. For your computer project, you will do one of the following: 1) devise a novel application for a neural network model studied in the course; 2) write a program to simulate a model from the neural network literature; or 3) design and program a method for solving some problem in perception, cognition, or motor control. The results of your final project should be written up in the form of a short scientific paper describing the motivation, methods, results, and interpretation. Your paper will be critiqued and returned for you to revise and resubmit in final form. You should write for an audience consisting of your class peers. You may elect to have your final paper published in the course's web-based electronic journal.

Completing the final paper involves 3 steps:

  1. Outline. You will submit a working title and paragraph outline by the deadline noted in the syllabus. These outlines will be critiqued in order to help you find an appropriate focus for your paper (2% of grade). Consult with the instructor or TA for ideas well ahead of time.
  2. Complete draft. You will then submit a complete draft of your paper (2000-3000 words). Papers must include the following sections: Abstract, Introduction, Methods, Results, Discussion, and Bibliography. Use citations to motivate your problem and to justify your claims. Figures should be numbered and have figure captions. Cite authors by name and date, e.g. (Marr & Poggio, 1979), using a standard citation format such as APA. Papers must be typed, with a page number on each page. Each paper will be reviewed with specific recommendations for improvement. (5% of grade)
  3. Final draft. You will submit a final revision for grading. (33% of grade). The final draft must be turned in by the date noted on the syllabus. Students who wish to submit their final papers to be published in the class electronic journal should turn in both paper and electronic copies of their reports.

If you choose to write your program in Mathematica, your paper and program can be combined and formatted as a single Mathematica notebook. See: Books and Tutorials on Notebooks.


© 1998, 1999, 2001, 2003 Computational Vision Lab, University of Minnesota, Department of Psychology.