This course is intended for beginning graduate students and advanced undergraduates. We assume students have a rudimentary understanding of linear algebra and calculus, and are able to program in some structured language. There will be four homework assignments and a final project. Grading will be approximately 60% on the homework assignments (15% each) and 40% on the final project.

Meetings: Tuesdays and Thursdays, 9:45-11:00am

Professor: Paul Schrater
E-mail: schrater@umn.edu
Office: Primary: 211 Elliott Hall; Secondary: 5-187 EE/CS Building
Office hours: 11am-12pm Tuesdays, or by appointment

Teaching Assistants:
Daa-hey Woo, dwoo@cs.umn.edu
Sam Zhou, zhou@cs.umn.edu
Office Hours:
Sam: 11am-1pm Wednesdays, EE/CS 2-209
Final Project Assignment: Your final project will involve one of the following:
1) Simulation or experiments. For example, implement a pattern recognition system for a particular application, e.g., digit classification, document clustering, etc.
2) Literature survey (with critical evaluation) of a given topic.
3) Theoretical work (detailed derivations, extensions of existing work, etc.)
In all cases, the work should be written up as a 12-15 page paper (single-spaced). More difficult projects will receive better grades if successfully completed. You may work in groups of 2-4; however, the content must be sufficient for the size of the group. You will be evaluated on the care with which you set up and thought through the goals and implementation, and on the competence of the execution. Regardless of its form, the write-up should include a survey of related results in the literature.
The project schedule is:
Sept. 28: Topic selection. One or two pages explaining the project with
a list of references.
Nov. 7: Partial report (3 to 5 pages).
Dec 17: Final report (10 to 15 pages).
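To give a flavor of what a simulation project (option 1 above) might involve, here is a minimal sketch of a classifier on synthetic data. It uses a nearest-centroid rule on Gaussian clusters; all names and the data setup are illustrative, not a required approach or part of the assignment.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "digit-like" data: 3 classes, Gaussian clusters in a 16-D feature space.
n_per_class, dim = 50, 16
centers = rng.normal(scale=5.0, size=(3, dim))
X = np.vstack([c + rng.normal(size=(n_per_class, dim)) for c in centers])
y = np.repeat(np.arange(3), n_per_class)

def fit_centroids(X, y):
    # Represent each class by the mean of its training feature vectors.
    return np.stack([X[y == k].mean(axis=0) for k in np.unique(y)])

def predict(centroids, X):
    # Assign each sample to the class whose centroid is nearest (Euclidean distance).
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

centroids = fit_centroids(X, y)
acc = (predict(centroids, X) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

A real project would replace the synthetic clusters with an actual dataset, hold out a test set, and compare against the methods covered in the course (e.g., logistic regression or an SVM).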
Cheating and Plagiarism
The homework and programming assignments must not be the result of cooperative work; each student must work individually in order to understand the material in depth. You may discuss the issues, but under no circumstances copy the homework or programming assignment of somebody else. All work in the projects and the programming assignments must properly cite sources. For example, if you quote a source in your project, you must enclose the quotation in quotation marks and clearly indicate its source. Any student caught cheating will receive an F as a class grade, and the University policies on cheating and plagiarism will be followed.
Textbooks
Primary:
Pattern Recognition and Machine Learning, Christopher M. Bishop, Springer, 2006.
Secondary (select chapters from):
Pattern Classification, 2nd Ed., Duda, Hart & Stork, Wiley, 2002.
The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Trevor Hastie, Robert Tibshirani, and Jerome Friedman, Springer, 2001.
Gaussian Processes for Machine Learning, Carl Rasmussen and Christopher K. I. Williams, MIT Press, 2006.
| Week | Tuesday | Thursday | Suggested Readings | Lecture Notes | Assignment |
|------|---------|----------|--------------------|---------------|------------|
| 1 (9/8-9/10) | Introduction to course | Probability theory and linear algebra review | Course syllabus, Matlab tutorials (1, 2) | Lec1, Lec2 | |
| 2 (9/15-9/17) | Overview of Pattern Recognition | Overview of Pattern Recognition | PRML: Chap 1 | Lec3&4 | |
| 3 (9/22-9/24) | No class | No class | | | |
| 4 (9/29-10/1) | CLASSIFICATION: Linear Classifiers | Linear Classifiers | PRML: Chap 4 | Lec5, Lec6 | |
| 5 (10/6-10/8) | Neural Networks | Neural Networks | PRML: Chap 5; DudaHartStork: Chap 4 | Lec7 | |
| 6 (10/13-10/15) | Pattern recognition example: handwritten digit classification; Matlab for pattern recognition | Non-linear Discriminant Analysis: Support Vector Machines | DudaHartStork: Chap 5; PRML: Chap 6.1-6.3 | LogisticReg.pdf | Homework 2 (bullseye.mat, GaussianRegionsDisplayUtility.m), due 10/29 |
| 7 (10/20-10/22) | SVM | SVM | PRML: Chap 7; Burges tutorial | Lec12PattRec09.pdf, kernels.pdf | |
| 8 (10/27-10/29) | RVM | | PRML: Chap 7 | | |
| 9 (11/3-11/5) | LEARNING FUNCTIONAL MAPPINGS: Regression and Basis Functions | Regression and Basis Functions | PRML: Chap 3; Hastie et al.: Chap 3, Chap 5 | RegressionPartI.pdf, RegressionII.pdf | |
| 10 (11/10-11/12) | SVM Regression | SVM Regression | SV regression tutorial | BayesRegress.pdf, RegressionIII.pdf | HW3.zip, svm.zip |
| 11 (11/17-11/19) | Bayesian Regression | Gaussian Processes | GP tutorial, link to GP book | gpnt06.pdf | |
| 12 (11/24-11/26) | Gaussian Processes | Graphical Models | PRML: Chap 8 | | |
| 13 (12/1-12/3) | Graphical Models | Thanksgiving | PRML: Chap 8 | Lec17PattRec09.pdf | HW4.pdf |
| 14 (12/8-12/10) | Mixture Models | Dimensionality Reduction | PRML: Chap 9, Chap 12 | prml-slides-8.pdf | |
| 15 (12/15-12/17) | Sampling Methods | | PRML: Chap 11 | Sampling.pdf | FINAL PROJECT due Dec. 17 |