9.520 | Spring 2006 | Graduate

Statistical Learning Theory and Applications

Readings

There is no textbook for this course. All the required information will be presented in the slides associated with each class. The books/papers listed below are useful general reference reading, especially from the theoretical viewpoint. A list of suggested readings is also provided for each class separately.

Boucheron, S., O. Bousquet, and G. Lugosi. “Theory of Classification: A Survey of Recent Advances.” ESAIM: Probability and Statistics 9 (2005): 323-375.

———. “Introduction to Statistical Learning Theory.” In Advanced Lectures on Machine Learning. Lecture Notes in Artificial Intelligence 3176. Edited by O. Bousquet, U. von Luxburg, and G. Rätsch. Heidelberg, Germany: Springer, 2004, pp. 169-207.

Bousquet, O. “New Approaches to Statistical Learning Theory.” Annals of the Institute of Statistical Mathematics 55, no. 2 (2003): 371-389.

Vapnik, V. The Nature of Statistical Learning Theory. New York, NY: Springer-Verlag, 1995. ISBN: 0387987800.

———. Statistical Learning Theory. New York, NY: John Wiley & Sons, 1998. ISBN: 0471030031.

Devroye, L., L. Gyorfi, and G. Lugosi. A Probabilistic Theory of Pattern Recognition. New York, NY: Springer, 1997. ISBN: 0387946187.

Cristianini, N., and J. Shawe-Taylor. An Introduction to Support Vector Machines. Cambridge, UK: Cambridge University Press, 2000. ISBN: 0521780195.

Evgeniou, T., M. Pontil, and T. Poggio. “Regularization Networks and Support Vector Machines.” Advances in Computational Mathematics 13, no. 1 (2000): 1-50.

Cucker, F., and S. Smale. “On The Mathematical Foundations of Learning.” Bulletin of the American Mathematical Society 39, no. 1 (2002): 1-49.

Poggio, T., and S. Smale. “The Mathematics of Learning: Dealing with Data.” Notices of the AMS 50, no. 5 (2003): 537-544.

Poggio, T., R. Rifkin, S. Mukherjee, and P. Niyogi. “General Conditions for Predictivity in Learning Theory.” Nature 428 (2004): 419-422. (See also “Past Performance and Future Results.”)

SES # | TOPICS | READINGS
1 The Course at a Glance  
2 The Learning Problem in Perspective

Cucker, F., and S. Smale. “On The Mathematical Foundations of Learning.” Bulletin of the American Mathematical Society 39, no. 1 (2002): 1-49.

Evgeniou, T., M. Pontil, and T. Poggio. “Regularization Networks and Support Vector Machines.” Advances in Computational Mathematics 13, no. 1 (2000): 1-50.

Vapnik, V. The Nature of Statistical Learning Theory. New York, NY: Springer-Verlag, 1995. ISBN: 0387987800.

3 Reproducing Kernel Hilbert Spaces

Aronszajn, N. “Theory of Reproducing Kernels.” Transactions of the American Mathematical Society 68, no. 3 (1950): 337-404.

Cucker, F., and S. Smale. “On The Mathematical Foundations of Learning.” Bulletin of the American Mathematical Society 39, no. 1 (2002): 1-49.

Evgeniou, T., M. Pontil, and T. Poggio. “Regularization Networks and Support Vector Machines.” Advances in Computational Mathematics 13, no. 1 (2000): 1-50.

Girosi, F. “An Equivalence Between Sparse Approximation and Support Vector Machines.” Neural Computation 10, no. 6 (1998): 1455-1480.

Wahba, G. Spline Models for Observational Data. CBMS-NSF Regional Conference Series in Applied Mathematics, Vol. 59. Philadelphia, PA: SIAM, 1990.

4 Regression and Least-Squares Classification

Rifkin, R. “Everything Old Is New Again: A Fresh Look at Historical Approaches in Machine Learning.” Ph.D. Thesis, Massachusetts Institute of Technology, 2002.

Evgeniou, T., M. Pontil, and T. Poggio. “Regularization Networks and Support Vector Machines.” Advances in Computational Mathematics 13, no. 1 (2000): 1-50.

Vapnik, V. The Nature of Statistical Learning Theory. New York, NY: Springer-Verlag, 1995. ISBN: 0387987800.

5 Support Vector Machines for Classification

Rifkin, R. “Everything Old Is New Again: A Fresh Look at Historical Approaches in Machine Learning.” Ph.D. Thesis, Massachusetts Institute of Technology, 2002.

Evgeniou, T., M. Pontil, and T. Poggio. “Regularization Networks and Support Vector Machines.” Advances in Computational Mathematics 13, no. 1 (2000): 1-50.

Vapnik, V. The Nature of Statistical Learning Theory. New York, NY: Springer-Verlag, 1995. ISBN: 0387987800.

6 Manifold Regularization

Belkin, M., and P. Niyogi. “Semi-supervised Learning on Riemannian Manifolds.” Machine Learning 56, no. 1-3 (2004): 209-239.

Belkin, M., P. Niyogi, and V. Sindhwani. “On Manifold Regularization.” In Proceedings of the Tenth International Workshop on Artificial Intelligence and Statistics. Barbados: University College London, 2005.

7 Unsupervised Learning Techniques

Hastie, T., R. Tibshirani, and J. Friedman. The Elements of Statistical Learning: Data Mining, Inference, and Prediction. New York, NY: Springer-Verlag, 2001. ISBN: 0387952845.

Tenenbaum, J., V. de Silva, and J. Langford. “A Global Geometric Framework for Nonlinear Dimensionality Reduction.” Science 290, no. 5500 (2000): 2319-2323.

Donoho, D., and C. Grimes. “Hessian Eigenmaps: Locally Linear Embedding Techniques for High-dimensional Data.” PNAS 100, no. 10 (2003): 5591-5596.

Belkin, M., and P. Niyogi. “Laplacian Eigenmaps for Dimensionality Reduction and Data Representation.” Neural Computation 15, no. 6 (2003): 1373-1396.

8 Multiclass

Rifkin, R., and A. Klautau. “In Defense of One-Vs-All Classification.” Journal of Machine Learning Research 5 (2004): 101-141.
9 Ranking

Caruana, R., and T. Joachims, eds. Proceedings of NIPS 2002 Workshop: Beyond Classification and Regression: Learning Rankings, Preferences, Equality Predicates, and Other Structures. Ithaca, NY: Cornell University, 2002.

Saul, L., Y. Weiss, and L. Bottou, eds. Proceedings of the NIPS 2005 Workshop: Learning to Rank. Whistler, BC: Cornell University, 2005.

10 Boosting and Bagging

Freund, Y., and R. E. Schapire. “A Short Introduction to Boosting.” Journal of Japanese Society for Artificial Intelligence 14, no. 5 (1999): 771-780.
11 Computer Vision

Object Detection
12 Online Learning

Duda, R., P. Hart, and D. Stork. Pattern Classification. New York, NY: John Wiley & Sons, 2001. ISBN: 9814126020.

Smale, S., and Y. Yao. “Online Learning Algorithms.” Foundations of Computational Mathematics 6, no. 2 (2006): 145-170.

13 Loose Ends

Project discussions

 
14 Generalization Bounds

Intro to Stability

Vapnik, V. Statistical Learning Theory. New York, NY: John Wiley & Sons, 1998. ISBN: 0471030031.

Cucker, F., and S. Smale. “On The Mathematical Foundations of Learning.” Bulletin of the American Mathematical Society 39, no. 1 (2002): 1-49.

Bousquet, O., and A. Elisseeff. “Stability and Generalization.” Journal of Machine Learning Research 2 (2002): 499-527.

15 Stability of Tikhonov Regularization

Bousquet, O., and A. Elisseeff. “Stability and Generalization.” Journal of Machine Learning Research 2 (2002): 499-527.
16 Uniform Convergence Over Function Classes

Vapnik, V. The Nature of Statistical Learning Theory. New York, NY: Springer-Verlag, 1995. ISBN: 0387987800.

Cucker, F., and S. Smale. “On The Mathematical Foundations of Learning.” Bulletin of the American Mathematical Society 39, no. 1 (2002): 1-49.

Zhou, Ding-Xuan. “Capacity of Reproducing Kernel Spaces in Learning Theory.” IEEE Transactions on Information Theory 49, no. 7 (2003): 1743-1752.

Vapnik, V., and A. Chervonenkis. “Necessary and Sufficient Conditions for the Uniform Convergence of the Means to Their Expectations.” Theory of Probability and Its Applications 26 (1981): 532-553.

17 Uniform Convergence for Classification

VC-dimension

Alon, N., S. Ben-David, N. Cesa-Bianchi, and D. Haussler. “Scale-sensitive Dimensions, Uniform Convergence, and Learnability.” Journal of the ACM 44, no. 4 (1997): 615-631.

Vapnik, V. The Nature of Statistical Learning Theory. New York, NY: Springer-Verlag, 1995. ISBN: 0387987800.

Cucker, F., and S. Smale. “On The Mathematical Foundations of Learning.” Bulletin of the American Mathematical Society 39, no. 1 (2002): 1-49.

Vapnik, V., and A. Chervonenkis. “Necessary and Sufficient Conditions for the Uniform Convergence of the Means to Their Expectations.” Theory of Probability and Its Applications 26 (1981): 532-553.

18 Neuroscience

Serre, T. “Learning a Dictionary of Shape-components in Visual Cortex: Comparison with Neurons, Humans and Machines.” Ph.D. Thesis, Massachusetts Institute of Technology, 2006.

Serre, T., L. Wolf, and T. Poggio. “Object Recognition with Features Inspired by Visual Cortex.” In Proceedings of 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2005). San Diego, CA: IEEE Computer Society Press, 2005.

Serre, T., M. Kouh, C. Cadieu, U. Knoblich, G. Kreiman, and T. Poggio. “A Theory of Object Recognition: Computations and Circuits in the Feedforward Path of the Ventral Stream in Primate Visual Cortex.” CBCL Paper #259/AI Memo #2005-036. Cambridge, MA: Massachusetts Institute of Technology, 2005.

19 Symmetrization

Rademacher Averages

 
20 Fenchel Duality  
21 Speech / Audio  
22 Active Learning  
23 Morphable Models for Video  
24 Bioinformatics  
25 Project Presentations  
26 Project Presentations (cont.)  
  Math Camp 1: Functional Analysis  
  Math Camp 2: Probability Theory  
