Dr. Ryan Rifkin
Dr. Sayan Mukherjee
Prof. Tomaso Poggio
Support vector machines (SVMs) have proven very useful in classification tasks. SVMs are now used in pedestrian-detection systems that assist drivers, one of the first widely deployed applications of this technology.
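To make the SVM idea concrete, here is a minimal sketch of a linear SVM trained by stochastic subgradient descent on the regularized hinge loss (a Pegasos-style update). This is an illustrative toy in plain NumPy, not code from the course; the data, parameters, and function name are invented for the example.

```python
import numpy as np

def svm_sgd(X, y, lam=0.01, epochs=200, seed=0):
    """Linear SVM (no bias term) via subgradient descent on
       min_w  lam/2 ||w||^2 + (1/n) sum_i max(0, 1 - y_i <w, x_i>),
       with labels y_i in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)          # decaying step size
            if y[i] * (X[i] @ w) < 1.0:    # margin violated: hinge subgradient
                w = (1.0 - eta * lam) * w + eta * y[i] * X[i]
            else:                          # margin satisfied: shrink only
                w = (1.0 - eta * lam) * w
    return w

# Toy data: two well-separated Gaussian clusters, labels -1 and +1.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 0.4, size=(20, 2)),
               rng.normal(+2.0, 0.4, size=(20, 2))])
y = np.array([-1.0] * 20 + [1.0] * 20)

w = svm_sgd(X, y)
```

A point is classified by the sign of `x @ w`; on this separable toy data the learned separator classifies both clusters correctly.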
This course is intended for upper-level graduate students planning careers in computational neuroscience. The assignments focus on techniques that make learning and problem-solving more efficient for computational systems. The project topics students can choose from are drawn from open problems in the field today. By the end of the course, students should be able to solve one or two of these problems and to frame an approach to the rest.
Focuses on the problem of supervised learning from the perspective of modern statistical learning theory, starting with the theory of multivariate function approximation from sparse data. Develops basic tools such as regularization, including Support Vector Machines for regression and classification. Derives generalization bounds using both stability and VC theory. Discusses topics such as boosting and feature selection. Examines applications in several areas: computer vision, computer graphics, text classification, and bioinformatics. Final projects, hands-on applications, and exercises are planned, paralleling the rapidly increasing practical uses of the techniques described in the subject.
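The regularization tools the description mentions can be illustrated with the simplest case: Tikhonov-regularized least squares (ridge regression), solved in closed form. This is a plain-NumPy sketch for orientation, not the course's own code; the data and the `ridge_fit` name are invented for the example.

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Closed-form solution of  min_w ||Xw - y||^2 + lam ||w||^2,
       i.e. w = (X^T X + lam I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Noisy one-dimensional linear data, true slope 3.
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(50, 1))
y = 3.0 * X[:, 0] + 0.01 * rng.normal(size=50)

w = ridge_fit(X, y, lam=1e-3)
```

The penalty `lam` trades data fit against the norm of the solution; with small noise and a small `lam`, the recovered slope stays close to the true value of 3.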
OCW has published multiple versions of this subject.
Rifkin, Ryan, Sayan Mukherjee, Tomaso Poggio, and Alex Rakhlin. 9.520 Statistical Learning Theory and Applications, Spring 2003. (MIT OpenCourseWare: Massachusetts Institute of Technology), http://ocw.mit.edu/courses/brain-and-cognitive-sciences/9-520-statistical-learning-theory-and-applications-spring-2003 (Accessed). License: Creative Commons BY-NC-SA.