

Poster

PAC-Bayes Learning Bounds for Sample-Dependent Priors

Pranjal Awasthi · Satyen Kale · Stefani Karp · Mehryar Mohri

Poster Session 1 #436

Abstract:

We present a series of new PAC-Bayes learning guarantees for randomized algorithms with sample-dependent priors. Our most general bounds make no assumptions about the priors and are given in terms of certain covering numbers under the infinite-Rényi divergence and the L1 distance. We show how to use these general bounds to derive learning bounds in the setting where the sample-dependent priors obey an infinite-Rényi divergence or L1-distance sensitivity condition. We also provide a flexible framework for computing PAC-Bayes bounds, under certain stability assumptions on the sample-dependent priors, and show how to use this framework to give more refined bounds when the priors satisfy an infinite-Rényi divergence sensitivity condition.
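For reference (background only, not one of the paper's new results), a commonly cited PAC-Bayes guarantee with a sample-independent prior \(P\) states that, with probability at least \(1-\delta\) over an i.i.d. sample \(S\) of size \(m\), every posterior \(Q\) satisfies
\[
L(Q) \;\le\; \widehat{L}_S(Q) \;+\; \sqrt{\frac{\mathrm{KL}(Q\,\|\,P) + \ln\frac{2\sqrt{m}}{\delta}}{2m}},
\]
where \(L\) and \(\widehat{L}_S\) denote the expected and empirical losses (exact constants vary across statements of this bound). The bounds described in the abstract relax the requirement that \(P\) be chosen independently of \(S\). The infinite-Rényi divergence appearing in the sensitivity conditions is
\[
D_\infty(Q\,\|\,P) \;=\; \ln \sup_{A:\,P(A)>0} \frac{Q(A)}{P(A)}.
\]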
