Dynamical Systems, Stochastic Processes and Bayesian Inference
Manfred Opper · Cedric Archambeau · John Shawe-Taylor

Sat Dec 09 12:00 AM -- 12:00 AM (PST) @ Mt. Currie South
Event URL: http://www.cs.ucl.ac.uk/staff/c.archambeau/dsb.htm

The modelling of continuous-time dynamical systems from uncertain observations is an important task that arises in a wide range of applications, from numerical weather prediction and finance to genetic networks and motion capture in video. Often we may assume that the dynamical models are formulated as systems of differential equations. In a Bayesian approach, a priori knowledge about the dynamics can then be incorporated by placing probability distributions over the unknown functions, which correspond, for example, to driving forces and appear as coefficients or parameters in the differential equations. In a probabilistic Bayesian framework, such functions therefore become stochastic processes, and Gaussian processes (GPs) provide a natural and flexible framework in these circumstances.

The use of GPs for learning functions from data is now a well-established technique in Machine Learning. Nevertheless, their application to dynamical systems becomes highly nontrivial when the dynamics are nonlinear in the (Gaussian) parameter functions, because closed-form analytical posterior predictions (even in the case of Gaussian observation noise) are no longer possible. Moreover, computing such predictions requires the entire underlying Gaussian latent process at all times, not just at the discrete observation times. Inference of the dynamics therefore requires nontrivial sampling methods or approximation techniques.

The aim of this workshop is to provide a forum for discussing open problems related to stochastic dynamical systems, their links to Bayesian inference, and their relevance to Machine Learning.
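The setting described above can be made concrete with a minimal sketch: a drift function is drawn from a GP prior, a stochastic differential equation driven by that drift is simulated with the Euler-Maruyama scheme, and the system is then observed with noise only at a few discrete times. All numerical values (kernel lengthscale, diffusion coefficient, observation noise, grids) are illustrative assumptions, not part of the workshop description; the point is simply that the inference problem must recover both the latent path and the drift from sparse, noisy data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fine time grid for the latent process (much finer than the observations).
dt = 0.01
t = np.arange(0.0, 5.0, dt)          # 500 steps

# Sample a smooth random drift function f(x) from a GP prior with an
# RBF kernel, evaluated on a grid of state values (hypothetical setup).
xg = np.linspace(-3.0, 3.0, 60)
K = np.exp(-0.5 * (xg[:, None] - xg[None, :]) ** 2 / 0.5 ** 2)
f_grid = rng.multivariate_normal(np.zeros(len(xg)), K + 1e-8 * np.eye(len(xg)))
drift = lambda x: np.interp(x, xg, f_grid)   # linear interpolation of the GP draw

# Euler-Maruyama simulation of dx = f(x) dt + sigma dW.
sigma = 0.3
x = np.zeros(len(t))
for i in range(1, len(t)):
    x[i] = x[i - 1] + drift(x[i - 1]) * dt + sigma * np.sqrt(dt) * rng.standard_normal()

# Noisy observations at sparse discrete times: the data from which the
# drift f and the full latent path x(t) would have to be inferred.
obs_idx = np.arange(0, len(t), 50)
y = x[obs_idx] + 0.1 * rng.standard_normal(len(obs_idx))
print(f"{len(obs_idx)} noisy observations of a {len(t)}-step latent path")
```

Because the drift enters the dynamics nonlinearly, the posterior over the path between observation times has no closed form here, which is exactly why sampling or approximation methods become necessary.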

Author Information

Manfred Opper (Technische Universitaet Berlin)
Cedric Archambeau (Amazon Web Services)
John Shawe-Taylor (UCL)

John Shawe-Taylor has contributed to fields ranging from graph theory through cryptography to statistical learning theory and its applications. However, his main contributions have been in the analysis and subsequent algorithmic definition of principled machine learning algorithms founded in statistical learning theory. This work has helped to drive a fundamental rebirth in the field of machine learning with the introduction of kernel methods and support vector machines, and the mapping of these approaches onto novel domains, including work in computer vision, document classification, and applications in biology and medicine focussed on brain scan, immunity and proteome analysis. He has published over 300 papers and two books that have together attracted over 60,000 citations. He has also been instrumental in assembling a series of influential European Networks of Excellence. The scientific coordination of these projects has influenced a generation of researchers and promoted the widespread uptake of machine learning in both science and industry that we are currently witnessing.
