Spotlight Poster

Distilling Costly Set Functions to Parametric Families

Gantavya Bhatt · Arnav Das · Jeff A Bilmes

East Exhibit Hall A-C #1900
Fri 13 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

Submodular functions, crucial for a variety of applications, often lack practical methods for learning them. Seemingly unrelated, learning a scaling from oracles offering graded pairwise comparisons (GPC) is underexplored, despite a rich history in psychometrics. In this paper, we introduce deep submodular peripteral networks (DSPNs), a novel parametric family of submodular functions, along with methods for training them using a GPC-ready strategy that connects and then tackles both of the above challenges. We introduce a newly devised GPC-style "peripteral" loss that leverages numerically graded relationships between pairs of objects (sets, in our case). Unlike contrastive learning, our method utilizes graded comparisons, extracting more nuanced information than binary-outcome comparisons alone, and contrasts sets of any size (not just two). We also define a novel suite of automatic sampling strategies for training, including active-learning-inspired submodular feedback. We demonstrate DSPNs' efficacy in learning submodularity from a costly target submodular function, showing superiority in both experimental design and online streaming applications.
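To make the two ingredients concrete, below is a minimal PyTorch sketch, not the paper's method: a simple concave-over-modular parametric family (submodular by construction, far simpler than the actual DSPN architecture) trained so its graded pairwise differences f(A) - f(B) match those of a costly target oracle. The graded-comparison regression loss here is an illustrative stand-in for the paper's peripteral loss, and the names `FeatureDSF`, `graded_comparison_loss`, and the facility-location stand-in oracle are all assumptions for illustration.

```python
# Hedged sketch: distilling a costly set function into a parametric
# submodular family using graded pairwise comparisons. Not the paper's
# DSPN architecture or peripteral loss; all names are illustrative.

import torch
import torch.nn as nn


class FeatureDSF(nn.Module):
    """f(A) = sum_j w_j * sqrt(sum_{a in A} m_j(a)), with w, m >= 0.

    A nonnegative weighted sum of concave functions of nonnegative
    modular functions is monotone submodular, so every parameter
    setting yields a valid submodular set function.
    """

    def __init__(self, dim: int, num_features: int = 16):
        super().__init__()
        self.log_m = nn.Parameter(torch.randn(dim, num_features) * 0.1)
        self.log_w = nn.Parameter(torch.zeros(num_features))

    def forward(self, masks: torch.Tensor) -> torch.Tensor:
        # masks: (batch, dim) binary indicators of set membership
        m = torch.exp(self.log_m)   # nonnegative modular weights
        w = torch.exp(self.log_w)   # nonnegative mixture weights
        modular = masks @ m         # (batch, num_features)
        return torch.sqrt(modular + 1e-8) @ w


def graded_comparison_loss(f_a, f_b, oracle_a, oracle_b):
    """Regress the learned graded difference onto the oracle's.

    Unlike a binary contrastive loss, the target is the magnitude of
    the oracle's preference between the two sets, not just its sign.
    """
    return ((f_a - f_b) - (oracle_a - oracle_b)).pow(2).mean()


if __name__ == "__main__":
    dim = 32
    # Hypothetical costly target: a facility-location function over a
    # random similarity matrix, standing in for the expensive oracle.
    sim = torch.rand(dim, dim)

    def target_oracle(masks):
        # f*(A) = sum_i max_{j in A} sim(i, j)
        scores = masks.unsqueeze(1) * sim   # (batch, dim, dim)
        return scores.max(dim=2).values.sum(dim=1)

    model = FeatureDSF(dim)
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    for step in range(200):
        # Random set pairs; the paper instead uses a suite of
        # automatic (including active-learning-inspired) samplers.
        a = (torch.rand(64, dim) < 0.3).float()
        b = (torch.rand(64, dim) < 0.3).float()
        loss = graded_comparison_loss(
            model(a), model(b), target_oracle(a), target_oracle(b))
        opt.zero_grad()
        loss.backward()
        opt.step()
    print(f"final graded-comparison loss: {loss.item():.4f}")
```

Because the family is submodular for every parameter setting, the distilled model can directly replace the costly oracle in downstream greedy or streaming maximization, which is the practical point of learning such a surrogate.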
