Modern Monte Carlo methods I: Importance sampling and sequential Monte Carlo

Tom Griffiths
University of California, Berkeley (UC Berkeley)

One of the best ways to understand the structure of a probability distribution is to generate samples from it. However, there are only a few families of distributions from which it is easy to generate samples. I will describe several modern Monte Carlo methods, including importance sampling and sequential Monte Carlo, that can be used to generate samples from a wide range of distributions, with applications to Bayesian inference. I will also discuss how these methods relate to questions from cognitive science, such as how people might be able to perform some of the challenging computations involved in probabilistic inference.
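The abstract only names the methods, but a minimal sketch of the first one may help fix ideas. The following Python (NumPy) snippet uses self-normalized importance sampling to approximate expectations under an unnormalized target density by drawing from a tractable Gaussian proposal and reweighting; the specific target, proposal, and sample sizes are illustrative assumptions, not material from the talk. Sequential Monte Carlo extends this idea by propagating and resampling weighted samples through a sequence of intermediate distributions.

```python
# Self-normalized importance sampling sketch (illustrative only; the target,
# proposal, and sample sizes are assumptions, not taken from the talk).
import numpy as np

rng = np.random.default_rng(0)

def unnormalized_target(x):
    # Example unnormalized density: a bimodal mixture of two Gaussians.
    return np.exp(-0.5 * (x - 2.0) ** 2) + 0.5 * np.exp(-0.5 * (x + 2.0) ** 2)

def proposal_density(x):
    # Broad Gaussian proposal, N(0, 3^2), covering both modes.
    return np.exp(-0.5 * (x / 3.0) ** 2) / (3.0 * np.sqrt(2.0 * np.pi))

# Draw from the proposal and compute importance weights.
n = 100_000
xs = rng.normal(loc=0.0, scale=3.0, size=n)
weights = unnormalized_target(xs) / proposal_density(xs)
weights /= weights.sum()  # self-normalization handles the unknown constant

# Estimate E[x] under the target and resample to obtain approximate draws.
posterior_mean = np.sum(weights * xs)
approx_samples = rng.choice(xs, size=1_000, replace=True, p=weights)
print(f"estimated mean under the target: {posterior_mean:.3f}")
```

Because the weights are normalized to sum to one, the estimator works even when the target density is known only up to a constant, which is the typical situation in Bayesian inference where the target is an unnormalized posterior.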

