Introduction to Neural Networks
U. of Minnesota, Final Study Guide
Psy 5038

Fall, 2003

There may be some material from the first half of the quarter, but the final exam
will be heavily weighted towards the second half of the quarter.

Sample short answer questions
Define and describe the relation of the following key words or phrases to neural networks. Provide examples where appropriate. (Answer 8 of 12 items drawn from the set below; 3 points each.)

"Energy", attractor, pseudoinverse, bias/variance dilemma
autoassociator, topographic representation, grandmother cell, asynchronous update
content-addressable memory, Oja's rule, principal components analysis, sparse distributed representation
constraint satisfaction, nearest-neighbor classifier, "explaining away", correspondence problem
gradient descent, Lyapunov function, encoder network, topology-preserving map (Kohonen)
simulated annealing, cortical maps, generalized delta rule, Bayes net & probability factorization
XOR, Hopfield's continuous response model, Gibbs G measure (Kullback-Leibler distance), anti-Hebbian
spontaneous activity, projective field, receptive field, coarse coding
marginalization & conditioning, radial basis function, prototype/exemplar, local minimum



Sample essay questions
(choice of 2 essays drawn from a subset of those listed below; 12 points each).

What is the Perceptron model? Discuss its successes and failures, and its impact on the field of neural network research.
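As a concrete anchor for this question, here is a minimal sketch (Python/NumPy, not from the course materials) of the perceptron learning rule succeeding on a linearly separable problem; the dataset (logical AND) and the number of passes are illustrative choices.

```python
import numpy as np

# Perceptron learning rule on a linearly separable problem (logical AND).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1, -1, -1, 1])  # AND, in +/-1 form

w = np.zeros(2)
b = 0.0
for _ in range(20):  # a few passes suffice for a separable problem
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) <= 0:  # misclassified: nudge the boundary
            w += yi * xi
            b += yi

pred = np.sign(X @ w + b)
print(pred)  # matches y for AND; no such w, b exists for XOR
```

The convergence theorem guarantees this loop halts on separable data; the famous failure case is that no single-layer weight vector exists for XOR.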

Discuss the pros and cons of distributed vs. localized representations with examples from theoretical considerations and neurophysiology.

Give an account of Hopfield's 1984 graded response neural network model. How can it be used for memory? How does it relate to the discrete stochastic model of 1982?

Describe how Hopfield's 1982 neural network can be set up to solve a constraint satisfaction problem. Use an example, such as Marr and Poggio's 1976 formulation of the stereo problem.
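A minimal sketch of a discrete 1982-style Hopfield net used as an associative memory may help organize an answer (Python/NumPy; the network size and corruption level are illustrative assumptions, and a constraint-satisfaction use such as Marr–Poggio stereo would substitute hand-designed weights for the Hebbian ones). Each asynchronous update lowers (or leaves unchanged) the network's energy, so the state falls into an attractor.

```python
import numpy as np

rng = np.random.default_rng(0)

# Store two random bipolar (+1/-1) patterns with the Hebbian outer-product rule.
n = 64
patterns = rng.choice([-1, 1], size=(2, n))
W = (patterns.T @ patterns) / n
np.fill_diagonal(W, 0)  # no self-connections

def recall(state, sweeps=5):
    """Asynchronous updates: sweep the units in random order until settled."""
    s = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(n):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Corrupt 10 bits of the first pattern and let the network settle.
probe = patterns[0].copy()
probe[rng.choice(n, size=10, replace=False)] *= -1
settled = recall(probe)
print((settled == patterns[0]).mean())  # fraction of bits recovered
```

This illustrates content-addressable memory: a partial or noisy cue retrieves the whole stored pattern.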

How does the error back-propagation model work (you don't need to derive the learning rule)? What are the pros and cons of this learning algorithm?
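The flavor of the algorithm can be conveyed by a small sketch (Python/NumPy; the hidden-layer size, learning rate, and iteration count are arbitrary illustrative choices, not values from the course): a one-hidden-layer sigmoid net trained by the generalized delta rule on XOR, the problem a single-layer perceptron cannot solve.

```python
import numpy as np

rng = np.random.default_rng(1)

# XOR inputs and targets.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 8 sigmoid units.
W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)

lr = 1.0
for _ in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: propagate the output error back through the layers.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent weight updates.
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel(), 2))  # should approach [0, 1, 1, 0]
```

Points worth raising as cons: gradient descent can stall in local minima, needs many iterations, and the learning rule is not obviously biologically plausible.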

Describe the Boltzmann machine algorithm, both for recall using annealing and for learning (you need not derive the learning rule). What are the pros and cons of this learning algorithm?
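The recall-by-annealing half can be sketched with stochastic binary units and a falling temperature (Python/NumPy; the network, cooling schedule, and sizes are illustrative, and the learning half with its clamped/unclamped phases is omitted).

```python
import numpy as np

rng = np.random.default_rng(4)

# Small symmetric network whose ground states are a stored pattern and its negation.
n = 16
target = rng.choice([-1, 1], size=n)
W = np.outer(target, target) / n
np.fill_diagonal(W, 0)

def energy(s):
    return -0.5 * s @ W @ s

s = rng.choice([-1, 1], size=n)  # random initial state
for T in np.geomspace(4.0, 0.05, 40):  # geometric cooling schedule
    for i in rng.permutation(n):
        # Stochastic update: high temperature allows uphill (energy-raising) flips,
        # which lets the state escape local minima before the net freezes.
        p_on = 1.0 / (1.0 + np.exp(-2.0 * (W[i] @ s) / T))
        s[i] = 1 if rng.random() < p_on else -1

print(abs(s @ target) / n)  # overlap with the stored pattern (1.0 = perfect)
```

Slow cooling is what distinguishes this from deterministic Hopfield dynamics, and its cost (many sweeps per anneal) is one of the standard cons to discuss.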

Give an account of just one of the following approaches to self-organization: Kohonen, 1982; or principal components sub-space extraction.
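For the principal-components option, Oja's rule can be sketched in a few lines (Python/NumPy; the data and learning rate are illustrative assumptions): a Hebbian growth term plus a normalizing decay drives the weight vector toward the first principal component.

```python
import numpy as np

rng = np.random.default_rng(2)

# Correlated 2-D data whose first principal component lies along [1, 1].
common = rng.normal(size=1000)
X = np.column_stack([common + 0.1 * rng.normal(size=1000),
                     common + 0.1 * rng.normal(size=1000)])
X -= X.mean(axis=0)

# Oja's rule: Hebbian term y*x, minus a decay y^2*w that keeps |w| near 1.
w = rng.normal(size=2)
eta = 0.01
for x in X:
    yv = w @ x
    w += eta * yv * (x - yv * w)

print(np.round(w / np.linalg.norm(w), 2))  # close to +/-[0.71, 0.71]
```

Without the decay term, plain Hebbian learning would grow |w| without bound; Oja's modification is what makes the rule converge to a unit-length principal eigenvector.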

Discuss the problem of data representation in biological systems using as an example a known sensory or motor map (e.g. a tonotopic or topographic map).
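A toy topology-preserving map can make the representation point concrete (Python/NumPy; all sizes and schedules are made up for illustration): ten units with scalar weights self-organize into an ordered tiling of [0, 1], a crude analogue of a tonotopic map.

```python
import numpy as np

rng = np.random.default_rng(5)

# 1-D Kohonen map: 10 units learn an ordered tiling of the interval [0, 1].
m = 10
w = rng.random(m)  # one scalar "preferred input" per unit
T = 2000           # training iterations

for t in range(T):
    x = rng.random()                   # random stimulus
    winner = np.argmin(np.abs(w - x))  # best-matching unit
    # Neighborhood width and learning rate both shrink over training.
    sigma = 3.0 * (1 - t / T) + 0.5
    lr = 0.5 * (1 - t / T) + 0.01
    influence = np.exp(-(np.arange(m) - winner) ** 2 / (2 * sigma ** 2))
    w += lr * influence * (x - w)      # winner and its neighbors move toward x

print(np.round(w, 2))  # weights typically end up monotonically ordered
```

Because neighboring units are dragged along with each winner, nearby units come to prefer nearby stimuli, which is exactly the map-like ordering seen in sensory cortex.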

What is a mixture model? How can EM be used to estimate the parameters of the model?
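A minimal EM sketch for a two-component 1-D Gaussian mixture (Python/NumPy; the data and initialization are illustrative): alternate computing posterior responsibilities (E-step) and re-estimating weights, means, and variances from the responsibility-weighted data (M-step).

```python
import numpy as np

rng = np.random.default_rng(3)

# Data drawn from two Gaussians, N(-2, 1) and N(+2, 1), mixed 50/50.
data = np.concatenate([rng.normal(-2, 1, 300), rng.normal(2, 1, 300)])

def gauss(x, mu, var):
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

# Initial guesses for mixing weights, means, and variances.
pi = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
var = np.array([1.0, 1.0])

for _ in range(50):
    # E-step: responsibility of each component for each data point.
    r = pi * gauss(data[:, None], mu, var)
    r /= r.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from the weighted data.
    Nk = r.sum(axis=0)
    pi = Nk / len(data)
    mu = (r * data[:, None]).sum(axis=0) / Nk
    var = (r * (data[:, None] - mu) ** 2).sum(axis=0) / Nk

print(np.round(np.sort(mu), 1))  # near [-2, 2]
```

Each iteration is guaranteed not to decrease the data likelihood, which is the key property to cite when explaining why EM works.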