Week 1 (Aug 27, 2007):
Lecture 1: Overview of Machine Learning; Math background overview: Calculus, Algebra, Probability/Statistics (see also the other links from Kari Torkkola)
Reading Assignment: Ch. 1.1 - 1.5, 2.1 - 2.3 from the textbook
|
Week 2 (Sep 10, 2007):
Lecture 2: Supervised learning; Standard accuracy measures; Optimal predictors.
Homework 1 (due on Mon, Sep 17): Problems 1.5, 1.6, 1.8, 2.4, 2.8, 2.24, 2.34 from the textbook; two problems from Lecture 2 notes.
|
Week 3 (Sep 17, 2007):
Lecture 3: Equivalence between optimal regression and classification; Extreme approaches to minimizing MSE (nearest neighbor algorithm and linear regression); Linear regression (solution, statistical results); Nonlinear regression (gradient descent optimization).
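To make the Lecture 3 material concrete, here is a minimal NumPy sketch (not part of the original course materials) contrasting the closed-form least-squares solution with plain gradient descent on the same MSE objective; the data, learning rate, and iteration count are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D regression data: y = 2x + 1 + noise.
x = rng.uniform(-1, 1, size=100)
y = 2.0 * x + 1.0 + 0.1 * rng.standard_normal(100)

# Design matrix with a bias column.
X = np.column_stack([np.ones_like(x), x])

# Closed-form least-squares solution: w = (X^T X)^{-1} X^T y.
w_closed = np.linalg.solve(X.T @ X, X.T @ y)

# Gradient descent on the same MSE objective.
w = np.zeros(2)
lr = 0.1
for _ in range(500):
    grad = 2.0 / len(y) * X.T @ (X @ w - y)   # gradient of mean squared error
    w -= lr * grad

print(w_closed, w)   # both should be close to [1, 2]
```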
Homework 2 due on Mon, Sep 24 (download file hw2.m). In addition, solve the 5 homework problems highlighted in blue font in Lecture 3 notes.
|
Week 4 (Sep 24, 2007):
Lecture 4: Logistic regression by minimizing MSE; Maximum Likelihood (ML) approach for unsupervised learning (density estimation), regression, classification.
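A small sketch of the maximum-likelihood route to logistic regression mentioned above, i.e., gradient descent on the negative log-likelihood rather than on MSE; the two-blob data and the step size are made up.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 2-class data: two Gaussian blobs in 2-D, with a bias feature.
X0 = rng.standard_normal((50, 2)) + [-1.5, -1.5]
X1 = rng.standard_normal((50, 2)) + [+1.5, +1.5]
X = np.vstack([np.hstack([X0, np.ones((50, 1))]),
               np.hstack([X1, np.ones((50, 1))])])
y = np.concatenate([np.zeros(50), np.ones(50)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Maximum-likelihood fit: gradient descent on the negative log-likelihood
# (cross-entropy), whose gradient is X^T (sigmoid(Xw) - y) / N.
w = np.zeros(3)
lr = 0.5
for _ in range(2000):
    p = sigmoid(X @ w)
    w -= lr * X.T @ (p - y) / len(y)

acc = np.mean((sigmoid(X @ w) > 0.5) == y)
print("training accuracy:", acc)
```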
|
Week 5 (Oct 1, 2007):
Lecture 5: Feedforward Neural Network (NN) Architecture; Simple O(NW²) Method for NN Training; O(NW) Method for NN Training: Backpropagation.
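A minimal sketch of backpropagation, the O(NW) gradient computation listed above, for a one-hidden-layer network; the toy sine-curve data, network size, and learning rate are arbitrary choices, not taken from the lecture.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical toy data: fit y = sin(x) with a 1-H-1 network, tanh hidden units.
x = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(x)

H = 10                                    # number of hidden units
W1 = 0.5 * rng.standard_normal((1, H)); b1 = np.zeros(H)
W2 = 0.5 * rng.standard_normal((H, 1)); b2 = np.zeros(1)
lr = 0.01

for _ in range(5000):
    # Forward pass.
    h = np.tanh(x @ W1 + b1)              # hidden activations
    out = h @ W2 + b2                     # linear output unit
    err = out - y                         # error signal at the output

    # Backward pass (backpropagation): chain rule, one O(N*W) sweep per epoch.
    dW2 = h.T @ err / len(x)
    db2 = err.mean(axis=0)
    dh = err @ W2.T * (1 - h ** 2)        # back through tanh
    dW1 = x.T @ dh / len(x)
    db1 = dh.mean(axis=0)

    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print("final MSE:", float(np.mean((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2)))
```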
Homework 3 due on Mon, Oct 8.
|
Week 6 (Oct 8, 2007):
Lecture 6: Machine Learning Process; NN Overfitting (# epochs, # hidden nodes); Regularization/Weight Decay for overfitting prevention; Bias-Variance decomposition; Bagging; Learning Curve.
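A short sketch of weight decay as an L2 penalty, here in its closed-form ridge-regression version applied to a deliberately over-flexible polynomial fit so the regularization effect is visible; the data, polynomial degree, and lambda values are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical noisy data and a deliberately over-flexible degree-9 polynomial.
x = rng.uniform(0, 1, 15)
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(15)
Phi = np.vander(x, 10, increasing=True)          # polynomial design matrix

def fit(lam):
    # Weight decay = L2 penalty: w = (Phi^T Phi + lam I)^{-1} Phi^T y.
    A = Phi.T @ Phi + lam * np.eye(Phi.shape[1])
    return np.linalg.solve(A, Phi.T @ y)

for lam in (1e-8, 1e-3, 1.0):                    # from (almost) no decay to heavy decay
    w = fit(lam)
    mse = np.mean((Phi @ w - y) ** 2)
    print(f"lambda={lam:g}  train MSE={mse:.4f}  ||w||={np.linalg.norm(w):.1f}")
```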
Homework 4 due on Mon, Oct 15.
|
Week 7 (Oct 15, 2007):
Lecture 7: Support Vector Machines.
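The lectures presumably develop the standard margin/dual formulation; as a rough companion, the sketch below trains a plain linear SVM in the primal by sub-gradient descent on the hinge loss, with made-up data and hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical linearly separable 2-D data with labels in {-1, +1}.
X = np.vstack([rng.standard_normal((50, 2)) + 2.0,
               rng.standard_normal((50, 2)) - 2.0])
y = np.concatenate([np.ones(50), -np.ones(50)])

# Primal linear SVM: minimize (lam/2)||w||^2 + mean(max(0, 1 - y*(Xw + b)))
# by sub-gradient descent (a simplification of the dual/kernel treatment).
w, b, lam, lr = np.zeros(2), 0.0, 0.01, 0.1
for _ in range(1000):
    margins = y * (X @ w + b)
    active = margins < 1                     # points violating the margin
    grad_w = lam * w - (y[active, None] * X[active]).sum(axis=0) / len(y)
    grad_b = -y[active].sum() / len(y)
    w -= lr * grad_w
    b -= lr * grad_b

print("training accuracy:", np.mean(np.sign(X @ w + b) == y))
```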
|
Week 8 (Oct 22, 2007):
Lecture 8: Support Vector Machines - continued.
Midterm.
Course project discussion: Instructions for Project Proposal (proposal is due Oct 29, in class); class presentation instructions.
Useful Reading: "How to give a bad presentation"; Ian Parberry, "How to Present a Paper in Theoretical Computer Science: A Speaker's Guide for Students"
Homework 5 due on Mon, Oct 29.
|
Week 9 (Oct 29, 2007):
Lecture 9: Bayesian Networks. Reading Assignment: Sections 8.1, 8.2, 8.2.1, 8.2.2, 8.3, 8.4, 8.4.1. Consult the lecture notes about Bayesian networks prepared by Prof. Hauskrecht from U. Pitt: Bayesian belief networks; Bayesian belief networks II; Bayesian belief networks: Inference and Learning; Bayesian belief networks: Learning.
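For orientation, here is a toy hand-coded Bayesian network with invented conditional probability tables, showing the factorized joint distribution and inference by brute-force enumeration (not the efficient inference algorithms of Section 8.4).

```python
import itertools

# A toy 3-node network, Rain -> Sprinkler, Rain -> WetGrass <- Sprinkler,
# with made-up CPT numbers purely for illustration. The joint factorizes as
#   P(R, S, W) = P(R) * P(S | R) * P(W | R, S)
P_R = {True: 0.2, False: 0.8}
P_S_given_R = {True: {True: 0.01, False: 0.99}, False: {True: 0.4, False: 0.6}}
P_W_given_RS = {(True, True): 0.99, (True, False): 0.8,
                (False, True): 0.9, (False, False): 0.0}

def joint(r, s, w):
    pw = P_W_given_RS[(r, s)]
    return P_R[r] * P_S_given_R[r][s] * (pw if w else 1.0 - pw)

# Inference by enumeration: P(Rain = True | WetGrass = True).
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(r, s, True) for r, s in itertools.product((True, False), repeat=2))
print("P(Rain | WetGrass) =", num / den)
```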
15-minute presentations:
DeCoste, D., Mazzoni, D., "Fast Query-Optimized Kernel Machine Classification Via Incremental Approximate Nearest Support Vectors," Proceedings of the Twentieth International Conference on Machine Learning, Washington, DC (2003). (Presentation by Vuk Malbasa)
Lin, C.F., Wang, S.D., "Fuzzy Support Vector Machines," IEEE Transactions on Neural Networks, Vol. 13, No. 2 (2002). (Presentation by Zhuang Wang)
|
Week 10 (Nov 05, 2007):
Lecture 10: Bayesian Networks - continued.
15-minute presentations:
Nonparametric Density Estimation (Chapter 2.5 from the textbook) (Presentation by Riu Baring)
Mixture Density Networks (Chapter 5.6 from the textbook) (Presentation by Qiang Lou)
D. M. J. Tax and R. P. W. Duin, "Support Vector Data Description," Machine Learning, 54:45-66, 2004. (Presentation by Mihajlo Grbovic)
Homework 6. Due on Nov 12, 2007. Download: data1.zip, data2.zip
|
Week 11 (Nov 12, 2007):
Lecture 11: Continuous Latent Variables. Reading Assignment: Sections 12.1, 12.3 from the textbook.
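Since Section 12.1 treats principal component analysis, here is a minimal PCA sketch via eigendecomposition of the sample covariance matrix; the correlated 2-D data are synthetic and the one-component projection is just for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical correlated 2-D data; PCA finds the directions of maximal
# variance as eigenvectors of the sample covariance matrix.
X = rng.standard_normal((200, 2)) @ np.array([[3.0, 0.0], [1.0, 0.5]])
Xc = X - X.mean(axis=0)                      # center the data

cov = Xc.T @ Xc / (len(X) - 1)               # sample covariance
eigvals, eigvecs = np.linalg.eigh(cov)       # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Project onto the leading principal component and reconstruct.
Z = Xc @ eigvecs[:, :1]                      # 1-D latent coordinates
X_rec = Z @ eigvecs[:, :1].T + X.mean(axis=0)
print("variance explained:", eigvals[0] / eigvals.sum())
print("reconstruction MSE:", np.mean((X - X_rec) ** 2))
```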
15-minute presentations:
P. Mitra, C. Murthy and S. Pal, "A Probabilistic Active Support Vector Learning Algorithm," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 26, no. 3, pp. 413-418, 2004. (Presentation by Siyuan Ren)
K. Q. Weinberger, F. Sha, and L. K. Saul, "Learning a Kernel Matrix for Nonlinear Dimensionality Reduction," in Proceedings of the Twenty-First International Conference on Machine Learning (ICML-04), pages 839-846, Banff, Canada, 2004. (Presentation by Michael Baranthan)
|
Week 12 (Nov 19, 2007):
Lecture 12: Bayesian Regression. Reading Assignment: Sections 1.2.3, 1.2.5, 1.2.6, 2.1.1, 2.2.1, 2.3.1-2.3.3, 2.3.6, 3.3, 3.5 from the textbook.
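A compact sketch of Bayesian linear regression with a zero-mean Gaussian prior on the weights, in the style of Section 3.3: compute the posterior over weights and a predictive mean/variance. The precisions alpha and beta are treated as known and chosen arbitrarily; the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical 1-D data for Bayesian linear regression with prior N(0, alpha^{-1} I).
x = rng.uniform(-1, 1, 20)
t = 0.5 * x - 0.3 + 0.1 * rng.standard_normal(20)
Phi = np.column_stack([np.ones_like(x), x])      # basis: [1, x]

alpha, beta = 2.0, 25.0                          # prior / noise precision (assumed known)
S_N_inv = alpha * np.eye(2) + beta * Phi.T @ Phi # posterior precision
S_N = np.linalg.inv(S_N_inv)
m_N = beta * S_N @ Phi.T @ t                     # posterior mean of the weights

# Predictive mean and variance at a new input x* = 0.5.
phi_star = np.array([1.0, 0.5])
pred_mean = phi_star @ m_N
pred_var = 1.0 / beta + phi_star @ S_N @ phi_star
print("posterior mean weights:", m_N)
print("predictive mean, std at x*=0.5:", pred_mean, np.sqrt(pred_var))
```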
15-minute presentations:
G. Camps-Valls, L. Gomez-Chova, J. Calpe, E. Soria, J. D. Martín, L. Alonso, and J. Moreno, "Robust Support Vector Method for Hyperspectral Data Classification and Knowledge Discovery," IEEE Transactions on Geoscience and Remote Sensing, 42, pp. 1530-1542, July 2004. (Presentation by Haidong Shi)
Rennie, J., Shih, L., Teevan, J., & Karger, D., "Tackling the Poor Assumptions of Naive Bayes Text Classifiers," in Proceedings of the 20th International Conference on Machine Learning, Washington, D.C., 2003. (Presentation by Liang Lan)
Boosting (Chapter 14.3 from the textbook) (Presentation by Xin Lin)
Homework 7. Due on Nov 26/29, 2007. Download: data1.zip, data2.zip
|
Week 13 (Nov 26, 2007):
Lecture 13: Mixture Models and EM. Reading Assignment: Sections 9.1 - 9.3.
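A minimal EM sketch for a two-component 1-D Gaussian mixture in the spirit of Sections 9.2-9.3, alternating the responsibility computation (E step) and parameter re-estimation (M step); the data and initialization are invented.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical 1-D data drawn from two Gaussians.
x = np.concatenate([rng.normal(-2.0, 0.5, 150), rng.normal(1.5, 1.0, 100)])

pi = np.array([0.5, 0.5])        # mixing coefficients
mu = np.array([-1.0, 1.0])       # initial means
var = np.array([1.0, 1.0])       # initial variances

def gauss(x, mu, var):
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

for _ in range(100):
    # E step: responsibilities gamma[n, k] = pi_k N(x_n | mu_k, var_k) / sum_j (...)
    dens = pi * gauss(x[:, None], mu, var)
    gamma = dens / dens.sum(axis=1, keepdims=True)
    # M step: re-estimate parameters from the responsibility-weighted data.
    Nk = gamma.sum(axis=0)
    mu = (gamma * x[:, None]).sum(axis=0) / Nk
    var = (gamma * (x[:, None] - mu) ** 2).sum(axis=0) / Nk
    pi = Nk / len(x)

print("means:", mu, "variances:", var, "weights:", pi)
```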
15-minute presentations:
Z. Huang, H. Chen, C. J. Hsu, W. H. Chen, and S. Wu, "Credit Rating Analysis with Support Vector Machines and Neural Networks: a Market Comparative Study," Decision Support Systems, 37:543-558, 2004. (Presented by James Joseph)
Yu, L., Liu, H. (2003), "Feature Selection for High-Dimensional Data: a Fast Correlation-Based Filter Solution," Proceedings of the International Conference on Machine Learning, 856-863. (Presented by Jingting Zeng)
Decision Trees (Chapter 14.4 from the textbook. Presented by Ping Zhang)
|
Week 14 (Dec 3, 2007):
Lecture 14: Sequential Data. Reading Assignment: Chapter 13.
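Chapter 13 centers on hidden Markov models; below is a small sketch of the forward algorithm for a toy discrete HMM whose transition, emission, and initial probabilities are made up for illustration.

```python
import numpy as np

# A toy discrete HMM scored with the forward algorithm:
# alpha[t, j] = p(x_1..x_t, z_t = j), updated one observation at a time.
A = np.array([[0.7, 0.3],        # transition probabilities
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],        # emission probabilities, rows = states
              [0.2, 0.8]])
pi = np.array([0.5, 0.5])        # initial state distribution
obs = [0, 1, 1, 0]               # a hypothetical observation sequence

alpha = pi * B[:, obs[0]]
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]    # propagate one step, then weight by emission

print("p(observations) =", alpha.sum())
```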
15-minute presentations:
Linear Models for Classification - Discriminant Functions (Section 4.1, presentation by Gregory Johnson)
Gaussian Processes for Regression (Sections 6.4, 6.4.1-4, presentation by Li An)
Markov Random Fields (Section 8.3, presentation by Vladan Radosavljevic)
|