We investigate two novel mixed robust/average-case submodular data partitioning problems that we collectively call Submodular Partitioning. These problems generalize purely robust instances of the problem, namely max-min submodular fair allocation (SFA) and min-max submodular load balancing (SLB), as well as average-case instances, namely the submodular welfare problem (SWP) and submodular multiway partition (SMP). While the robust versions have been studied in the theory community, existing work has focused on tight approximation guarantees, and the resulting algorithms generally do not scale to large real-world applications. This contrasts with the average-case instances, where most algorithms are scalable. In this paper, we bridge this gap by proposing several new algorithms (including greedy, majorization-minimization, minorization-maximization, and relaxation algorithms) that not only scale to large datasets but also achieve theoretical approximation guarantees comparable to the state of the art. We moreover provide new scalable algorithms that apply to additive combinations of the robust and average-case objectives. We show that these problems have many applications in machine learning (ML), including data partitioning and load balancing for distributed ML, data clustering, and image segmentation. We empirically demonstrate the efficacy of our algorithms on real-world problems involving data partitioning for distributed optimization (of convex and deep neural network objectives), and also purely unsupervised image segmentation.
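To make the flavor of the robust objective concrete, the following is a minimal sketch (not the paper's exact algorithm) of a simple greedy heuristic for the max-min fair allocation setting: repeatedly give the currently worst-off block the unassigned element with the largest marginal gain. The coverage objective and the toy data below are illustrative assumptions, standing in for an arbitrary monotone submodular function.

```python
def coverage(block, sets):
    """f(A) = size of the union of the sets indexed by A (monotone submodular)."""
    covered = set()
    for i in block:
        covered |= sets[i]
    return len(covered)

def greedy_sfa(n_items, m_blocks, sets):
    """Greedy heuristic for max-min fair allocation: grow the worst-off block."""
    blocks = [[] for _ in range(m_blocks)]
    remaining = set(range(n_items))
    while remaining:
        # pick the block with the smallest current objective value
        j = min(range(m_blocks), key=lambda b: coverage(blocks[b], sets))
        # assign to it the element with the largest marginal gain
        best = max(remaining,
                   key=lambda v: coverage(blocks[j] + [v], sets)
                                 - coverage(blocks[j], sets))
        blocks[j].append(best)
        remaining.remove(best)
    return blocks

# toy data: each item covers a small set of "features"
sets = [{0, 1}, {1, 2}, {3}, {4, 5}, {0, 5}, {2, 3}]
blocks = greedy_sfa(len(sets), 2, sets)
print(blocks, min(coverage(b, sets) for b in blocks))
```

Balancing the minimum block value in this way is the intuition behind the max-min (SFA) side of the partitioning objective; the min-max (SLB) side instead penalizes the largest block value.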
Author Information
Kai Wei
Rishabh K Iyer (University of Washington, Seattle)
Shengjie Wang (University of Washington)
Wenruo Bai (University of Washington)
Jeffrey A Bilmes (University of Washington, Seattle)
Jeffrey A. Bilmes is a professor in the Department of Electrical and Computer Engineering at the University of Washington, Seattle, Washington. He is also an adjunct professor in Computer Science & Engineering and in the Department of Linguistics. Prof. Bilmes is the founder of the MELODI (MachinE Learning for Optimization and Data Interpretation) lab in the department. Bilmes received his Ph.D. from the Computer Science Division of the Department of Electrical Engineering and Computer Science at the University of California, Berkeley, and a master's degree from MIT. He was also a researcher at the International Computer Science Institute and a member of the Realization group there. Prof. Bilmes is a 2001 NSF CAREER award winner, a 2002 CRA Digital Government Fellow, a 2008 NAE Gilbreth Lectureship award recipient, and a 2012/2013 ISCA Distinguished Lecturer. Prof. Bilmes was, along with Andrew Ng, one of the two program chairs of UAI (Conference on Uncertainty in Artificial Intelligence) in 2009 and then its general chair in 2010. He was also a workshop chair (2011) and the tutorials chair (2014) at NIPS/NeurIPS (Neural Information Processing Systems), and has been a regular senior technical chair at NeurIPS/NIPS since then. He was an action editor for JMLR (the Journal of Machine Learning Research). Prof. Bilmes's primary interests lie in statistical modeling (particularly graphical-model approaches) and signal processing for pattern classification, speech recognition, language processing, bioinformatics, machine learning, submodularity in combinatorial optimization and machine learning, active and semi-supervised learning, and audio/music processing. He is particularly interested in temporal graphical models (or dynamic graphical models, which include HMMs, DBNs, and CRFs), in designing efficient algorithms for them, and in designing their structure so that they perform well as structured classifiers.
He also has strong interests in speech-based human-computer interfaces, the statistical properties of natural objects and natural scenes, information theory and its relation to natural computation by humans and pattern recognition by machines, and computational music processing (such as human timing subtleties). He is also interested in high-performance computing systems, computer architecture, and software techniques to reduce power consumption. Prof. Bilmes has pioneered (starting in 2003) the development of submodularity within machine learning, and he received best paper awards at ICML 2013, NIPS 2013, and ACM BCB 2016, all in this area. In 2014, Prof. Bilmes also received a most influential paper in 25 years award from the International Conference on Supercomputing, given to a paper on high-performance matrix optimization. Prof. Bilmes authored the Graphical Models Toolkit (GMTK), a dynamic-graphical-model-based software system widely used in speech, language, bioinformatics, and human-activity recognition.
More from the Same Authors
-
2021 Spotlight: Constrained Robust Submodular Partitioning »
Shengjie Wang · Tianyi Zhou · Chandrashekhar Lavania · Jeff A Bilmes -
2022 Poster: Retrospective Adversarial Replay for Continual Learning »
Lilly Kumari · Shengjie Wang · Tianyi Zhou · Jeff A Bilmes -
2021 Poster: Constrained Robust Submodular Partitioning »
Shengjie Wang · Tianyi Zhou · Chandrashekhar Lavania · Jeff A Bilmes -
2020 Session: Orals & Spotlights Track 32: Optimization »
Hamed Hassani · Jeffrey A Bilmes -
2020 Poster: Curriculum Learning by Dynamic Instance Hardness »
Tianyi Zhou · Shengjie Wang · Jeffrey A Bilmes -
2019 : Jeffrey Bilmes »
Jeff A Bilmes -
2019 Poster: On Mixup Training: Improved Calibration and Predictive Uncertainty for Deep Neural Networks »
Sunil Thulasidasan · Gopinath Chennupati · Jeffrey A Bilmes · Tanmoy Bhattacharya · Sarah Michalak -
2018 Poster: Diverse Ensemble Evolution: Curriculum Data-Model Marriage »
Tianyi Zhou · Shengjie Wang · Jeffrey A Bilmes -
2018 Poster: Submodular Maximization via Gradient Ascent: The Case of Deep Submodular Functions »
Wenruo Bai · William Stafford Noble · Jeffrey A Bilmes -
2017 Workshop: Discrete Structures in Machine Learning »
Yaron Singer · Jeff A Bilmes · Andreas Krause · Stefanie Jegelka · Amin Karbasi -
2016 Poster: Deep Submodular Functions: Definitions and Learning »
Brian W Dolhansky · Jeffrey A Bilmes -
2015 Poster: Submodular Hamming Metrics »
Jennifer Gillenwater · Rishabh K Iyer · Bethany Lusch · Rahul Kidambi · Jeffrey A Bilmes -
2015 Spotlight: Submodular Hamming Metrics »
Jennifer Gillenwater · Rishabh K Iyer · Bethany Lusch · Rahul Kidambi · Jeffrey A Bilmes -
2014 Workshop: Discrete Optimization in Machine Learning »
Jeffrey A Bilmes · Andreas Krause · Stefanie Jegelka · S Thomas McCormick · Sebastian Nowozin · Yaron Singer · Dhruv Batra · Volkan Cevher -
2014 Poster: Divide-and-Conquer Learning by Anchoring a Conical Hull »
Tianyi Zhou · Jeffrey A Bilmes · Carlos Guestrin -
2014 Poster: Learning Mixtures of Submodular Functions for Image Collection Summarization »
Sebastian Tschiatschek · Rishabh K Iyer · Haochen Wei · Jeffrey A Bilmes -
2014 Session: Oral Session 1 »
Jeffrey A Bilmes -
2014 Session: Tutorial Session B »
Jeffrey A Bilmes -
2013 Workshop: Discrete Optimization in Machine Learning: Connecting Theory and Practice »
Stefanie Jegelka · Andreas Krause · Pradeep Ravikumar · Kazuo Murota · Jeffrey A Bilmes · Yisong Yue · Michael Jordan -
2013 Poster: Submodular Optimization with Submodular Cover and Submodular Knapsack Constraints »
Rishabh K Iyer · Jeffrey A Bilmes -
2013 Oral: Submodular Optimization with Submodular Cover and Submodular Knapsack Constraints »
Rishabh K Iyer · Jeffrey A Bilmes -
2013 Poster: Curvature and Optimal Algorithms for Learning and Minimizing Submodular Functions »
Rishabh K Iyer · Stefanie Jegelka · Jeffrey A Bilmes -
2013 Tutorial: Deep Mathematical Properties of Submodularity with Applications to Machine Learning »
Jeffrey A Bilmes -
2012 Workshop: Discrete Optimization in Machine Learning (DISCML): Structure and Scalability »
Stefanie Jegelka · Andreas Krause · Jeffrey A Bilmes · Pradeep Ravikumar -
2012 Poster: Submodular Bregman Divergences with Applications »
Rishabh K Iyer · Jeffrey A Bilmes -
2011 Workshop: Discrete Optimization in Machine Learning (DISCML): Uncertainty, Generalization and Feedback »
Andreas Krause · Pradeep Ravikumar · Stefanie S Jegelka · Jeffrey A Bilmes -
2011 Poster: Fast approximate submodular minimization »
Stefanie Jegelka · Hui Lin · Jeffrey A Bilmes -
2011 Poster: Online Submodular Set Cover, Ranking, and Repeated Active Learning »
Andrew Guillory · Jeffrey A Bilmes -
2011 Spotlight: Online Submodular Set Cover, Ranking, and Repeated Active Learning »
Andrew Guillory · Jeffrey A Bilmes -
2010 Workshop: Discrete Optimization in Machine Learning: Structures, Algorithms and Applications »
Andreas Krause · Pradeep Ravikumar · Jeffrey A Bilmes · Stefanie Jegelka -
2009 Workshop: Discrete Optimization in Machine Learning: Submodularity, Polyhedra and Sparsity »
Andreas Krause · Pradeep Ravikumar · Jeffrey A Bilmes -
2009 Poster: Submodularity Cuts and Applications »
Yoshinobu Kawahara · Kiyohito Nagano · Koji Tsuda · Jeffrey A Bilmes -
2009 Poster: Label Selection on Graphs »
Andrew Guillory · Jeffrey A Bilmes -
2009 Spotlight: Submodularity Cuts and Applications »
Yoshinobu Kawahara · Kiyohito Nagano · Koji Tsuda · Jeffrey A Bilmes -
2009 Poster: Entropic Graph Regularization in Non-Parametric Semi-Supervised Classification »
Amarnag Subramanya · Jeffrey A Bilmes -
2009 Spotlight: Entropic Graph Regularization in Non-Parametric Semi-Supervised Classification »
Amarnag Subramanya · Jeffrey A Bilmes -
2006 Demonstration: The Vocal Joystick »
James Landay · Richard Wright · Kelley Kilanski · Xiao Li · Jon Malkin · Jeffrey A Bilmes -
2006 Poster: Multi-dynamic Bayesian Networks »
Karim Filali · Jeffrey A Bilmes