There is no textbook for this course. All the required information will be presented in the slides associated with each class. The books/papers listed below are useful general reference reading, especially from the theoretical viewpoint. A list of suggested readings is also provided for each class separately.
Boucheron, S., O. Bousquet, and G. Lugosi. “Theory of Classification: A Survey of Recent Advances.” ESAIM: Probability and Statistics 9 (2005): 323–375.
———. “Introduction to Statistical Learning Theory.” In Advanced Lectures on Machine Learning. Lecture Notes in Artificial Intelligence 3176. Edited by O. Bousquet, U. von Luxburg, and G. Rätsch. Heidelberg, Germany: Springer, 2004, pp. 169–207.
Bousquet, O. “New Approaches to Statistical Learning Theory.” Annals of the Institute of Statistical Mathematics 55, no. 2 (2003): 371–389.
Vapnik, V. The Nature of Statistical Learning Theory. New York, NY: John Wiley & Sons, 1995. ISBN: 0387987800.
———. Statistical Learning Theory. New York, NY: John Wiley & Sons, 1998. ISBN: 0471030031.
Devroye, L., L. Györfi, and G. Lugosi. A Probabilistic Theory of Pattern Recognition. New York, NY: Springer, 1997. ISBN: 0387946187.
Cristianini, N., and J. Shawe-Taylor. An Introduction to Support Vector Machines. Cambridge, UK: Cambridge University Press, 2000. ISBN: 0521780195.
Evgeniou, T., M. Pontil, and T. Poggio. “Regularization Networks and Support Vector Machines.” Advances in Computational Mathematics 13, no. 1 (2000): 1–50.
Cucker, F., and S. Smale. “On The Mathematical Foundations of Learning.” Bulletin of the American Mathematical Society 39, no. 1 (2002): 1–49.
Poggio, T., and S. Smale. “The Mathematics of Learning: Dealing with Data.” Notices of the AMS 50, no. 5 (2003): 537–544.
Poggio, T., R. Rifkin, S. Mukherjee, and P. Niyogi. “General Conditions for Predictivity in Learning Theory.” Nature 428 (2004): 419–422. (See also “Past Performance and Future Results.”)
SES #  TOPICS  READINGS 

1  The Course at a Glance  
2  The Learning Problem in Perspective 
Cucker, F., and S. Smale. “On The Mathematical Foundations of Learning.” Bulletin of the American Mathematical Society 39, no. 1 (2002): 1–49.
Evgeniou, T., M. Pontil, and T. Poggio. “Regularization Networks and Support Vector Machines.” Advances in Computational Mathematics 13, no. 1 (2000): 1–50.
Vapnik, V. The Nature of Statistical Learning Theory. New York, NY: John Wiley & Sons, 1995. ISBN: 0387987800.
3  Reproducing Kernel Hilbert Spaces 
Aronszajn, N. “Theory of Reproducing Kernels.” Transactions of the American Mathematical Society 68, no. 3 (1950): 337–404.
Cucker, F., and S. Smale. “On The Mathematical Foundations of Learning.” Bulletin of the American Mathematical Society 39, no. 1 (2002): 1–49.
Evgeniou, T., M. Pontil, and T. Poggio. “Regularization Networks and Support Vector Machines.” Advances in Computational Mathematics 13, no. 1 (2000): 1–50.
Girosi, F. “An Equivalence Between Sparse Approximation and Support Vector Machines.” Neural Computation 10, no. 6 (1998): 1455–1480.
Wahba, G. Spline Models for Observational Data. CBMS-NSF Regional Conference Series in Applied Mathematics 59. Philadelphia, PA: SIAM, 1990.
4  Regression and LeastSquares Classification 
Rifkin, R. “Everything Old Is New Again: A Fresh Look at Historical Approaches in Machine Learning.” Ph.D. Thesis, Massachusetts Institute of Technology, 2002.
Evgeniou, T., M. Pontil, and T. Poggio. “Regularization Networks and Support Vector Machines.” Advances in Computational Mathematics 13, no. 1 (2000): 1–50.
Vapnik, V. The Nature of Statistical Learning Theory. New York, NY: John Wiley & Sons, 1995. ISBN: 0387987800.
5  Support Vector Machines for Classification 
Rifkin, R. “Everything Old Is New Again: A Fresh Look at Historical Approaches in Machine Learning.” Ph.D. Thesis, Massachusetts Institute of Technology, 2002.
Evgeniou, T., M. Pontil, and T. Poggio. “Regularization Networks and Support Vector Machines.” Advances in Computational Mathematics 13, no. 1 (2000): 1–50.
Vapnik, V. The Nature of Statistical Learning Theory. New York, NY: John Wiley & Sons, 1995. ISBN: 0387987800.
6  Manifold Regularization 
Belkin, M., and P. Niyogi. “Semi-supervised Learning on Riemannian Manifolds.” Machine Learning 56, no. 1–3 (2004): 209–239.
Belkin, M., P. Niyogi, and V. Sindhwani. “On Manifold Regularization.” In Proceedings of the Tenth International Workshop on Artificial Intelligence and Statistics. Barbados: University College London, 2005. 
7  Unsupervised Learning Techniques 
Hastie, T., R. Tibshirani, and J. Friedman. The Elements of Statistical Learning: Data Mining, Inference, and Prediction. New York, NY: SpringerVerlag, 2001. ISBN: 0387952845.
Tenenbaum, J., V. de Silva, and J. Langford. “A Global Geometric Framework for Nonlinear Dimensionality Reduction.” Science 290, no. 5500 (2000): 2319–2323.
Donoho, D., and C. Grimes. “Hessian Eigenmaps: Locally Linear Embedding Techniques for High-dimensional Data.” PNAS 100, no. 10 (2003): 5591–5596.
Belkin, M., and P. Niyogi. “Laplacian Eigenmaps for Dimensionality Reduction and Data Representation.” Neural Computation 15, no. 6 (2003): 1373–1396.
8  Multiclass  Rifkin, R., and A. Klautau. “In Defense of One-Vs-All Classification.” Journal of Machine Learning Research 5 (2004): 101–141. 
9  Ranking 
Caruana, R., and T. Joachims, eds. Proceedings of NIPS 2002 Workshop: Beyond Classification and Regression: Learning Rankings, Preferences, Equality Predicates, and Other Structures. Cornell, NY: Cornell University, 2002.
Saul, L., Y. Weiss, and L. Bottou, eds. Proceedings of the NIPS 2005 Workshop: Learning to Rank. Whistler, BC: Cornell University, 2005. 
10  Boosting and Bagging  Freund, Y., and R. E. Schapire. “A Short Introduction to Boosting.” Journal of Japanese Society for Artificial Intelligence 14, no. 5 (1999): 771–780. 
11  Computer Vision / Object Detection 

12  Online Learning 
Duda, R., P. Hart, and D. Stork. Pattern Classification. New York, NY: John Wiley & Sons, 2001. ISBN: 9814126020.
Smale, S., and Y. Yao. “Online Learning Algorithms.” Foundations of Computational Mathematics 6, no. 2 (2006): 145–170. 
13  Loose Ends / Project Discussions 

14  Generalization Bounds / Intro to Stability 
Vapnik, V. Statistical Learning Theory. New York, NY: John Wiley & Sons, 1998. ISBN: 0471030031.
Cucker, F., and S. Smale. “On The Mathematical Foundations of Learning.” Bulletin of the American Mathematical Society 39, no. 1 (2002): 1–49.
Bousquet, O., and A. Elisseeff. “Stability and Generalization.” Journal of Machine Learning Research 2 (2002): 499–527. 
15  Stability of Tikhonov Regularization  Bousquet, O., and A. Elisseeff. “Stability and Generalization.” Journal of Machine Learning Research 2 (2002): 499–527. 
16  Uniform Convergence Over Function Classes 
Vapnik, V. The Nature of Statistical Learning Theory. New York, NY: John Wiley & Sons, 1995. ISBN: 0387987800.
Cucker, F., and S. Smale. “On The Mathematical Foundations of Learning.” Bulletin of the American Mathematical Society 39, no. 1 (2002): 1–49.
Zhou, Ding-Xuan. “Capacity of Reproducing Kernel Spaces in Learning Theory.” IEEE Transactions on Information Theory 49, no. 7 (2003): 1743–1752.
Vapnik, V., and A. Chervonenkis. “Necessary and Sufficient Conditions for the Uniform Convergence of the Means to Their Expectations.” Probability Theory and Applications 26 (1981): 532–553. 
17  Uniform Convergence for Classification / VC-dimension 
Alon, N., S. Ben-David, N. Cesa-Bianchi, and D. Haussler. “Scale-sensitive Dimensions, Uniform Convergence, and Learnability.” Journal of the ACM 44, no. 4 (1997): 615–631.
Vapnik, V. The Nature of Statistical Learning Theory. New York, NY: John Wiley & Sons, 1995. ISBN: 0387987800.
Cucker, F., and S. Smale. “On The Mathematical Foundations of Learning.” Bulletin of the American Mathematical Society 39, no. 1 (2002): 1–49.
Vapnik, V., and A. Chervonenkis. “Necessary and Sufficient Conditions for the Uniform Convergence of the Means to Their Expectations.” Probability Theory and Applications 26 (1981): 532–553. 
18  Neuroscience 
Serre, T. “Learning a Dictionary of Shape-components in Visual Cortex: Comparison with Neurons, Humans and Machines.” PhD Thesis, Massachusetts Institute of Technology, 2006.
Serre, T., L. Wolf, and T. Poggio. “Object Recognition with Features Inspired by Visual Cortex.” In Proceedings of 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2005). San Diego, CA: IEEE Computer Society Press, 2005.
Serre, T., M. Kouh, C. Cadieu, U. Knoblich, G. Kreiman, and T. Poggio. “A Theory of Object Recognition: Computations and Circuits in the Feedforward Path of the Ventral Stream in Primate Visual Cortex.” CBCL Paper #259/AI Memo #2005-036. Cambridge, MA: Massachusetts Institute of Technology, 2005. 
19  Symmetrization / Rademacher Averages 

20  Fenchel Duality  
21  Speech / Audio  
22  Active Learning  
23  Morphable Models for Video  
24  Bioinformatics  
25  Project Presentations  
26  Project Presentations (cont.)  
Math Camp 1: Functional Analysis  
Math Camp 2: Probability Theory 