Introduction to Neural Networks
U. of Minnesota
Mid-term Study Guide, Fall 2001

The mid-term will cover material from Lectures 1-13
and the Anderson book, mainly Chapters 1-9.
You are encouraged to consult any chapters of the text or lecture notes
that may provide insight into the concepts and questions below.

Sample short answer questions
Define and describe the relation of the following key words or phrases to neural networks. Provide examples where appropriate.
(8 items drawn from the set below; 3 points each).

eigenvector, linear associator, autoassociator, synaptic modification,
Hebbian, heteroassociation, coarse coding, spontaneous activity,
leaky integrator, EPSP/IPSP, projective field, summed vector memory,
dendrite, classical conditioning, receptive field, lateral inhibition,
spike, linear independence, grandmother cell, perceptron,
McCulloch-Pitts, distinctive features, cross-correlation, supervised learning,
recurrent inhibition, pseudoinverse, compartmental model, symmetric matrix,
WTA, least mean squares, linear system, orthogonality,
Widrow-Hoff error correction, diameter-limited, linear discriminant, generative model,
outer product learning, perceptron learning rule, topic vs. stress positions, generic neural network neuron

Sample essay questions
(choice of 2 essays drawn from a subset of those listed below; 12 points each).

Describe the anatomy of the generic neuron, and the slow-potential model.

Discuss a linear model of either auto- or hetero-associative learning. Give one example of its application.
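
For concreteness, a minimal sketch of the outer-product (Hebbian) construction for a linear heteroassociator, assuming NumPy is available; the stimulus and response vectors are invented for illustration (for autoassociation, the same recipe applies with g_i = f_i):

    import numpy as np

    # Invented orthonormal stimulus vectors f1, f2 and desired responses g1, g2.
    f1 = np.array([1.0, 0.0, 0.0, 0.0])
    f2 = np.array([0.0, 1.0, 0.0, 0.0])
    g1 = np.array([1.0, -1.0, 1.0])
    g2 = np.array([0.0, 2.0, 1.0])

    # Outer-product (Hebbian) learning: W is the sum of outer products g_i f_i^T.
    W = np.outer(g1, f1) + np.outer(g2, f2)

    # Recall: because f1 and f2 are orthonormal, W @ f1 reproduces g1 exactly
    # and W @ f2 reproduces g2; non-orthogonal stimuli would give cross-talk.
    print(W @ f1)   # [ 1. -1.  1.]
    print(W @ f2)   # [ 0.  2.  1.]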

Describe a neural network model for the lateral eye of the Limulus. Discuss the relationship between the performance of feedforward and feedback models of lateral inhibition.
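
For orientation, a small sketch contrasting feedforward and feedback (recurrent) lateral inhibition on a one-dimensional array of units, assuming NumPy; the input pattern and inhibitory strength are invented for illustration:

    import numpy as np

    # A luminance "edge" presented to 8 receptor units.
    n = 8
    e = np.array([1, 1, 1, 1, 5, 5, 5, 5], dtype=float)

    # Each unit inhibits its two nearest neighbours with strength k.
    k = 0.3
    W = np.zeros((n, n))
    for i in range(n):
        for j in (i - 1, i + 1):
            if 0 <= j < n:
                W[i, j] = k

    # Feedforward lateral inhibition: inhibition driven by the raw inputs.
    r_ff = e - W @ e

    # Feedback (recurrent) lateral inhibition: r = e - W r, i.e. r = (I + W)^-1 e.
    r_fb = np.linalg.solve(np.eye(n) + W, e)

    print("feedforward:", r_ff.round(2))   # both show edge (Mach-band) enhancement
    print("feedback:   ", r_fb.round(2))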

Discuss the pros and cons of distributed vs. localized representations with examples from theoretical considerations and neurophysiology.

Contrast "connectionist" computational schemes with traditional serial computing.

Describe physiological results supporting the biological basis of Hebbian synaptic modification.


Problem
(One problem; 3 to 6 points).

You should be able to compute inner and outer products, multiply matrices and vectors, take transposes, find eigenvectors and eigenvalues, measure the "similarity" between vectors, and find the inverse of small (e.g., 2x2) matrices.
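
For practice, a minimal sketch of these operations, assuming NumPy; the particular vectors and matrix are invented for illustration:

    import numpy as np

    a = np.array([1.0, 2.0, 3.0])
    b = np.array([0.0, 1.0, -1.0])

    inner = a @ b                     # inner (dot) product: a scalar
    outer = np.outer(a, b)            # outer product: a 3x3 matrix

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])
    x = np.array([1.0, -1.0])

    Ax = A @ x                        # matrix-vector product
    At = A.T                          # transpose
    Ainv = np.linalg.inv(A)           # inverse of a 2x2 matrix
    evals, evecs = np.linalg.eig(A)   # eigenvalues and eigenvectors (columns of evecs)

    # "Similarity" between two vectors as the cosine of the angle between them.
    cos_sim = (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))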