### Course Meeting Times

Lectures: 2 sessions / week, 1.5 hours / session

### Description

This is a graduate-level introduction to the principles of statistical inference with probabilistic models defined using graphical representations. The material in this course constitutes a common foundation for work in machine learning, signal processing, artificial intelligence, computer vision, control, and communication. Moreover, it is the companion course to 6.437 (Inference and Information) which is offered each Spring; 6.437 and 6.438 may be taken in either order.

It is worth stressing that *6.438 Algorithms for Inference* is an introductory graduate subject: it is not an advanced graduate subject for students who already have a mastery of statistical inference algorithms and want to understand such material at an even more sophisticated level. That said, the structure of this subject will be somewhat different from that of other introductions to the topic.

Ultimately, the subject is about teaching you contemporary approaches to, and perspectives on, problems of statistical inference. The development of the material that forms the basis for this subject has historically been very much driven by applications. However, our focus in the course will not be on these applications, which form the basis for entire courses of their own, but rather on the common problem solving frameworks that they share. Nevertheless, we will cite various relevant applications as we develop the material and sometimes extract simplified examples from these contexts.

### Prerequisites

The official prerequisites are *6.041 Probabilistic Systems Analysis and Applied Probability* or *6.436 Fundamentals of Probability*, and *18.06 Linear Algebra*, or their equivalents. Ultimately, what we require is fluency with both basic quantitative probabilistic analysis and linear algebra, together with some subsequent solid exposure to the engineering application of both.

### Reading

To fill in gaps or add additional insights, we will occasionally produce supplementary notes during the term. Any such notes will necessarily be spare and rough, and will contain bugs, but will hopefully be a useful resource to you. Finally, you will also find sections of the following books to be useful and more in-depth auxiliary references for parts of the term.

- Bishop, Christopher M. *Pattern Recognition and Machine Learning*. Springer, 2006.
- Jordan, Michael Irwin. *Introduction to Probabilistic Graphical Models*. (Lecture notes)
- Koller, Daphne, and Nir Friedman. *Probabilistic Graphical Models: Principles and Techniques*. MIT Press, 2009. ISBN: 9780262013192. [Preview with Google Books]
- MacKay, David J. C. *Information Theory, Inference, and Learning Algorithms*. Cambridge University Press, 2003. ISBN: 9780521642989. (Also available online at http://www.inference.phy.cam.ac.uk/mackay/itila/book.html)

### Problem Sets

There will be 10 problem sets, due in lecture; each must be handed in by the end of the class in which it is due.

Problem set solutions will be available at the end of the due date’s lecture.

While you should do all the assigned problems, only a randomly chosen subset will actually be graded. Don't be misled by the relatively few points assigned to homework in the final grade calculation. While the grade you get on your homework is only a minor component of your final grade, working through (and, yes, often struggling at length with!) the homework is a crucial part of the learning process and will invariably have a major impact on your understanding of the material. Some of the problem sets will involve a MATLAB component, to help you explore different aspects of the material. Also, problems labeled "practice" are additional problems for you to work through as time permits and if you think they would be helpful to you. They are never graded, but solutions will be provided.

In undertaking the problem sets, moderate collaboration in the form of joint problem solving with one or two classmates is permitted provided your writeup is your own.

### Exams

There will be two mandatory quizzes scheduled in the evening.

The quizzes will be designed to require 1.5 hours of effort, but we will use a three-hour format to minimize the effects of time pressure. The quizzes will be closed book. You will be allowed to bring two 8.5×11-in sheets of notes (both sides) to the midterm quiz, and four such sheets to the final quiz.

### Grading

| ACTIVITIES | PERCENTAGES |
| --- | --- |
| Midterm Quiz | 40 |
| Final Quiz | 45 |
| Homework | 15 |

### Reference Texts

You may find sections of the following auxiliary texts to be useful references for parts of the term.

- Bertsekas, Dimitri, and John N. Tsitsiklis. *Introduction to Probability*. Athena Scientific, 2002. ISBN: 9781886529403.
- Bishop, Christopher M. *Pattern Recognition and Machine Learning*. Springer, 2006.
- Brémaud, Pierre. *Markov Chains: Gibbs Fields, Monte Carlo Simulation, and Queues*. Springer, 2008. [Preview with Google Books]
- Cowell, Robert, Philip Dawid, et al. *Probabilistic Networks and Expert Systems: Exact Computational Methods for Bayesian Networks*. Springer, 1999. ISBN: 9780387718231. [Preview with Google Books]
- Doucet, Arnaud, Nando de Freitas, and Neil J. Gordon. *Sequential Monte Carlo Methods in Practice*. Springer, 2001. ISBN: 9780387951461.
- Jensen, Finn V., and Thomas D. Nielsen. *Bayesian Networks and Decision Graphs*. Springer, 2007. ISBN: 9780387682815. [Preview with Google Books]
- Kailath, Thomas. *Linear Systems*. Prentice-Hall, 1980. ISBN: 9780135369616.
- Kailath, Thomas, Ali H. Sayed, and Babak Hassibi. *Linear Estimation*. Prentice-Hall, 2000. ISBN: 9780130224644.
- Koller, Daphne, and Nir Friedman. *Probabilistic Graphical Models: Principles and Techniques*. MIT Press, 2009. ISBN: 9780262013192. [Preview with Google Books]
- Lauritzen, Steffen. *Graphical Models*. Oxford University Press, 1996. ISBN: 9780198522195.
- MacKay, David J. C. *Information Theory, Inference, and Learning Algorithms*. Cambridge University Press, 2003. ISBN: 9780521642989. (Also available online at http://www.inference.phy.cam.ac.uk/mackay/itila/book.html)
- Ristic, Branko, Sanjeev Arulampalam, and Neil Gordon. *Beyond the Kalman Filter: Particle Filters for Tracking Applications*. Artech Print on Demand, 2004. ISBN: 9781580536318.
- Strang, Gilbert. *Linear Algebra and Its Applications*. Harcourt College Publishers, 1988. ISBN: 9780030105685.
- ———. *Introduction to Linear Algebra*. Wellesley-Cambridge Press, 2003. ISBN: 9780961408893.

### Reference Papers

You may also find the following papers to be useful references for different portions of the material.

- Arulampalam, M. Sanjeev, Simon Maskell, et al. "A Tutorial on Particle Filters for Online Nonlinear/Non-Gaussian Bayesian Tracking." *IEEE Transactions on Signal Processing* 50, no. 2 (2002): 174–88.
- Forney, G. David, Jr. "The Viterbi Algorithm." *Proceedings of the IEEE* 61, no. 3 (1973): 268–78.
- Frey, Brendan J., and Nebojsa Jojic. "A Comparison of Algorithms for Inference and Learning in Probabilistic Graphical Models." *IEEE Transactions on Pattern Analysis and Machine Intelligence* 27, no. 9 (2005): 1392–416.
- Jaakkola, Tommi. "Tutorial on Variational Approximation Methods." In *Advanced Mean Field Methods: Theory and Practice*. Edited by M. Opper and D. Saad. A Bradford Book, 2001, pp. 129–60. ISBN: 9780262150545. [Preview with Google Books]
- Jordan, Michael Irwin. "Graphical Models." *Statistical Science* 19, no. 1 (2004): 140–55.
- Kalman, R. E. "A New Approach to Linear Filtering and Prediction Problems." *Transactions of the ASME—Journal of Basic Engineering* 82, series D (1960): 35–45.
- Kschischang, Frank, Brendan Frey, et al. "Factor Graphs and the Sum-Product Algorithm." *IEEE Transactions on Information Theory* 47, no. 2 (2001): 498–519.
- Loeliger, Hans-Andrea. "An Introduction to Factor Graphs." *IEEE Signal Processing Magazine* 21, no. 1 (2004): 28–41.
- Loeliger, Hans-Andrea, Justin Dauwels, et al. "The Factor Graph Approach to Model-Based Signal Processing." *Proceedings of the IEEE* 95, no. 6 (2007): 1295–322.
- Rabiner, Lawrence R. "A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition." *Proceedings of the IEEE* 77, no. 2 (1989): 257–86.
- Wainwright, Martin, and Michael I. Jordan. "Graphical Models, Exponential Families, and Variational Inference." Technical Report 649, Department of Statistics, University of California, September 21, 2003. Updated version: Wainwright, M. J., and M. I. Jordan. "Graphical Models, Exponential Families, and Variational Inference." *Foundations and Trends in Machine Learning* 1, no. 1–2 (2008): 1–305.
- Yedidia, Jonathan, William T. Freeman, et al. "Understanding Belief Propagation and Its Generalizations." In *Exploring Artificial Intelligence in the New Millennium*. Morgan Kaufmann, 2002. ISBN: 9781558608115. [Preview with Google Books]
- Yedidia, Jonathan, William T. Freeman, et al. "Constructing Free-Energy Approximations and Generalized Belief Propagation Algorithms." *IEEE Transactions on Information Theory* 51, no. 7 (2005): 2282–312.