EE 595 Independent Studies On

Statistical Learning

Fall 2005





Dr. Sen-ching Cheung
Office: 687B FPAT (7-9113)
Office hours: MWF, 9:30am-10:50am (691 FPAT)


Course Description

Statistical learning theory is playing an increasingly important role in diverse areas such as communication, bioinformatics, computer vision, neuroscience, and economics. The goals of this course are twofold: first, to provide a solid foundation in the fundamental concepts of statistical learning theory so that graduate students can apply them in their own research; second, to discuss recent results and new applications in this area so as to stimulate new research ideas. The course will meet three times a week to discuss papers, book chapters, and homework problems. The meetings will tentatively be held MWF 4:30-5:30pm at the MIA Lab in the Visualization Center. 

Tentative Topics

  1. Graphical model, sum-product and junction tree algorithms
  2. Linear and generalized linear models
  3. Exponential family, sufficiency, conjugacy
  4. Density estimation, kernel methods, mixture models
  5. Expectation-Maximization algorithm for parameter estimation
  6. Hidden Markov models (HMM)
  7. Factor analysis, principal component analysis (PCA),  canonical correlation analysis (CCA) and independent component analysis (ICA)
  8. Kalman filtering
  9. Approximate inference I: Markov chain Monte Carlo (MCMC) and particle filtering
  10. Approximate inference II: mean field, loopy belief propagation
  11. Model selection, marginal likelihood, AIC, BIC and MDL
  12. Vapnik Chervonenkis theory and risk bounds
  13. Kernel methods and Support Vector Machine (SVM)
  14. Ensemble methods: bagging and boosting
  15. Nonparametric Bayes, Dirichlet processes
  16. Decision networks, Markov decision processes and reinforcement learning
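As a small taste of topics 4-5 above, the sketch below runs the Expectation-Maximization (EM) algorithm on a two-component 1-D Gaussian mixture. It is an illustrative example only, not course material: the synthetic data, the true means (-2 and 3), and all variable names are chosen here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 1-D data: two Gaussian clusters with true means -2 and 3
x = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 1.0, 200)])

# Initial parameter guesses (asymmetric, to break the label symmetry)
mu = np.array([-1.0, 1.0])      # component means
sigma = np.array([1.0, 1.0])    # component standard deviations
pi = np.array([0.5, 0.5])       # mixing weights

def gauss(v, m, s):
    """Univariate Gaussian density."""
    return np.exp(-0.5 * ((v - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

for _ in range(50):
    # E-step: responsibility of each component for each data point
    r = pi * gauss(x[:, None], mu, sigma)       # shape (n, 2)
    r /= r.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means, and standard deviations
    n_k = r.sum(axis=0)
    pi = n_k / len(x)
    mu = (r * x[:, None]).sum(axis=0) / n_k
    sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / n_k)

print(np.round(np.sort(mu), 2))  # estimated means, close to the true -2 and 3
```

Each iteration alternates a soft assignment of points to components (E-step) with closed-form parameter updates (M-step); after a few dozen iterations the estimated means settle near the true cluster centers.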



Grades will be assigned based on participation (50%) and a substantial final project (50%). Titles and scopes of final projects will be jointly determined by the instructor and students. Each project should involve a substantial amount of literature survey and experimental results. The prerequisites are basic statistics, probability, and stochastic systems.


No textbook is required. Copies of papers and book chapters will be provided. The following books are recommended:


  1. The Elements of Statistical Learning by T. Hastie et al.
  2. All of Statistics by L. Wasserman
  3. The Nature of Statistical Learning Theory by V. Vapnik.
  4. Learning with Kernels by B. Schölkopf and A. J. Smola
  5. Probabilistic Networks and Expert Systems by R. G. Cowell et al.
  6. Independent Component Analysis by A. Hyvärinen et al.
  7. Pattern Recognition by S. Theodoridis and K. Koutroumbas
  8. Artificial Intelligence: A Modern Approach by S. Russell and P. Norvig
  9. Machine Learning by T. Mitchell.


Sen-ching Samson Cheung
Last modified: Tue Aug 24 11:20:57 EDT 2004