Course Meeting Times
Lectures: 1 session / week, 2 hours / session
Prerequisites
Permission of instructor.
Course Objectives
- To understand methods for representing, propagating, and inferring uncertainty.
- To explore interdisciplinary applications.
- To build an intellectual community around uncertainty quantification.
The specific topics vary between offerings, but are generally drawn from the following:
- Density Estimation: Exponential Family, Mixture Models, Kernels, Markov Chain Monte Carlo
- Model Selection: Jackknife, Bootstrap, Cross-Validation, and Information Criteria
- Dimensionality Reduction: PCA, ICA, and other nonlinear methods
- Model Reduction: POD / EOF, Krylov, Response Surface Models, Polynomial Chaos
- Inference: Hierarchical Bayes, Graphical Models
- Time-dependent Inference: Linear, Ensemble, Mixture, Kernel, Mutual Information, and Particle Filtering and Smoothing
- Statistical Models: Regression Machines, Gaussian Processes, Markov Models
- Manifold Learning
- Information Theoretic Estimation, Control and Learning
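To give a flavor of the methods above, here is a minimal sketch of one of the listed model-selection tools, the nonparametric bootstrap, used to estimate the standard error of a sample mean. The synthetic data and the helper `bootstrap_se` are illustrative assumptions, not course material; only the Python standard library is used.

```python
# Illustrative sketch (not from the course): bootstrap estimate of the
# standard error of a sample mean, compared against the classical formula.
import random
import statistics

random.seed(0)
data = [random.gauss(10.0, 2.0) for _ in range(100)]  # synthetic sample

def bootstrap_se(sample, n_resamples=2000):
    """Standard error of the mean via bootstrap resampling."""
    means = []
    for _ in range(n_resamples):
        # Draw a resample of the same size, with replacement.
        resample = random.choices(sample, k=len(sample))
        means.append(statistics.fmean(resample))
    # Spread of the resampled means approximates the sampling uncertainty.
    return statistics.stdev(means)

se_boot = bootstrap_se(data)
se_formula = statistics.stdev(data) / len(data) ** 0.5  # classical estimate
print(f"bootstrap SE: {se_boot:.3f}, formula SE: {se_formula:.3f}")
```

The two estimates should agree closely here; the bootstrap's appeal is that it needs no closed-form formula and applies to far more complicated statistics.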
Course Requirements
- Attend class.
- Read the assigned papers/readings.
- Present ("recite") one of the assigned papers to the class.
- Complete a project. Options include:
  - Using a method studied here in your own application.
  - Developing a new method for an application studied here.
  - Writing a review paper in a subject area.
Textbooks
Gelb, Arthur. Applied Optimal Estimation. MIT Press, 1974. ISBN: 9780262570480.
Martinez, W. L., and A. R. Martinez. Computational Statistics Handbook with MATLAB. 2nd ed. Chapman and Hall/CRC, 2007. ISBN: 978158488566.
Grading
The course grade is based on the paper presentation, the project, and class participation.