Recent directions in nonparametric Bayesian machine learning

Zoubin Ghahramani is speaking tomorrow (Wed.):

Machine learning is an interdisciplinary field which seeks to develop both the mathematical foundations and practical applications of systems that learn, reason and act. Machine learning draws from many fields, ranging from Computer Science to Engineering, Psychology, Neuroscience, and Statistics. Because uncertainty, data, and inference play a fundamental role in the design of systems that learn, statistical methods have recently emerged as one of the key components of the field of machine learning. In particular, Bayesian methods, based on the work of Reverend Thomas Bayes in the 1700s, describe how probabilities can be used to represent the degrees of belief of a rational agent. Bayesian methods work best when they are applied to models that are flexible enough to capture the complexity of real-world data. Recent work on non-parametric Bayesian methods provides this flexibility. I will touch upon key developments in the field, including Gaussian processes, Dirichlet processes, and the Indian buffet process (IBP). Focusing on the IBP, I will describe how this can be used in a number of applications such as collaborative filtering, bioinformatics, cognitive modelling, independent component analysis, and causal discovery. Finally, I will outline the main challenges in the field: how to develop new models, new fast inference algorithms, and compelling applications.

It’ll be at 11am on the 4th floor of CEPSR at Columbia. The talk looks great. Perhaps he’ll also explain how he can simultaneously teach at Cambridge, Pittsburgh, and London. Maybe Carnegie Mellon has an international campus?
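P.S. The abstract mentions the Indian buffet process without defining it. For the curious, here's a minimal sketch (not from the talk, and the function and parameter names are mine) of the standard "restaurant" construction: customer i takes each previously sampled dish k with probability m_k/i, where m_k is how many earlier customers took it, and then tries Poisson(alpha/i) new dishes, giving a binary object-by-feature matrix with a random number of columns.

```python
import numpy as np

def sample_ibp(num_customers, alpha, seed=None):
    """Draw one binary feature matrix Z from an Indian buffet process prior.

    Rows are "customers" (objects); columns are "dishes" (latent features).
    alpha controls how many features tend to appear overall.
    """
    rng = np.random.default_rng(seed)
    dishes = []  # dishes[k] = row indices of customers who took dish k
    for i in range(1, num_customers + 1):
        # take each existing dish k with probability m_k / i
        for takers in dishes:
            if rng.random() < len(takers) / i:
                takers.append(i - 1)
        # then sample Poisson(alpha / i) brand-new dishes for this customer
        for _ in range(rng.poisson(alpha / i)):
            dishes.append([i - 1])
    # assemble the binary matrix: Z[i, k] = 1 if customer i took dish k
    Z = np.zeros((num_customers, len(dishes)), dtype=int)
    for k, takers in enumerate(dishes):
        Z[takers, k] = 1
    return Z

print(sample_ibp(num_customers=10, alpha=2.0, seed=0))
```

The point of the prior is that the number of latent features isn't fixed in advance; it grows (roughly logarithmically) with the number of objects, which is the kind of flexibility the abstract is talking about.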