Poster
PAC-Bayes bounds for stable algorithms with instance-dependent priors
Omar Rivasplata · Emilio Parrado-Hernandez · John Shawe-Taylor · Shiliang Sun · Csaba Szepesvari

Wed Dec 05 07:45 AM -- 09:45 AM (PST) @ Room 210 #56

PAC-Bayes bounds have been proposed for obtaining risk estimates based on a training sample. In this paper the PAC-Bayes approach is combined with stability of the hypothesis learned by a Hilbert space-valued algorithm. The PAC-Bayes setting is used with a Gaussian prior centered at the expected output. A novelty of our paper is thus the use of priors defined in terms of the data-generating distribution. Our main result estimates the risk of the randomized algorithm in terms of the hypothesis stability coefficients. We also provide a new bound for the SVM classifier, which is compared to other known bounds experimentally. Ours appears to be the first uniform hypothesis stability-based bound that evaluates to non-trivial values.
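
As a rough illustration of how an instance-dependent Gaussian prior interacts with stability, consider the standard PAC-Bayes-kl inequality with Gaussian prior and posterior. This is a minimal sketch using generic placeholders (learned weights W, noise scale \sigma, empirical and true risks \hat{R} and R, sample size n, confidence \delta), not the paper's exact theorem or constants:

\[
\mathrm{KL}\big(\mathcal{N}(W,\sigma^2 I)\,\|\,\mathcal{N}(\mathbb{E}[W],\sigma^2 I)\big)
  = \frac{\lVert W - \mathbb{E}[W]\rVert^2}{2\sigma^2},
\qquad
\mathrm{kl}\big(\hat{R}(Q)\,\|\,R(Q)\big)
  \le \frac{\mathrm{KL}(Q\,\|\,P) + \ln\frac{2\sqrt{n}}{\delta}}{n}
\quad \text{with probability at least } 1-\delta .
\]

Centering the prior at the expected output \mathbb{E}[W] turns the KL term into a squared distance between W and its expectation; hypothesis stability is what keeps this distance, and hence the bound, small.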

Author Information

Omar Rivasplata (University College London)

My top-level areas of interest are statistical learning theory, machine learning, probability and statistics. These days I am very interested in deep learning and reinforcement learning. I am affiliated with the Institute for Mathematical and Statistical Sciences, University College London, hosted by the Department of Statistical Science as a Senior Research Fellow. Before my current post I spent a few months at the UCL Department of Mathematics, and before that several years at the UCL Department of Computer Science, where I did research studies in machine learning sponsored by DeepMind; in parallel with these studies I was a research scientist intern at DeepMind for three years. Back in the day I studied undergraduate maths (BSc 2000, Pontificia Universidad Católica del Perú) and graduate maths (MSc 2005, PhD 2012, University of Alberta). I've lived in Peru, in Canada, and now I'm based in the UK.

Emilio Parrado-Hernandez (University Carlos III de Madrid)
John Shawe-Taylor (UCL)

John Shawe-Taylor has contributed to fields ranging from graph theory through cryptography to statistical learning theory and its applications. However, his main contributions have been in the development of the analysis and subsequent algorithmic definition of principled machine learning algorithms founded in statistical learning theory. This work has helped to drive a fundamental rebirth in the field of machine learning with the introduction of kernel methods and support vector machines, and has carried these approaches into novel domains including computer vision, document classification, and applications in biology and medicine focused on brain scan, immunity and proteome analysis. He has published over 300 papers and two books that have together attracted over 60,000 citations. He has also been instrumental in assembling a series of influential European Networks of Excellence. The scientific coordination of these projects has influenced a generation of researchers and promoted the widespread uptake of machine learning in both science and industry that we are currently witnessing.

Shiliang Sun (East China Normal University)
Csaba Szepesvari (University of Alberta)