Although optimization is the longstanding algorithmic backbone of machine learning, new models still require the time-consuming implementation of new solvers. As a result, there are thousands of implementations of optimization algorithms for machine learning problems. A natural question is whether it is always necessary to implement a new solver, or whether a single algorithm could suffice for most models. Common belief suggests that such a one-algorithm-fits-all approach cannot work, because a generic algorithm cannot exploit model-specific structure, and hence cannot be both efficient and robust on a wide variety of problems. Here, we challenge this common belief. We have designed and implemented the optimization framework GENO (GENeric Optimization), which combines a modeling language with a generic solver. GENO takes the declaration of an optimization problem and generates a solver for the specified problem class. The framework is flexible enough to encompass most classical machine learning problems. On a wide variety of classical as well as some recently proposed problems, we show that the automatically generated solvers are (1) as efficient as well-engineered, specialized solvers, (2) more efficient by a decent margin than recent state-of-the-art solvers, and (3) orders of magnitude more efficient than classical modeling-language-plus-solver approaches.
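The one-algorithm-fits-many idea can be illustrated with a small sketch. This is not GENO's actual modeling language or API; it only mimics the workflow the abstract describes — declare an objective (here ridge regression, chosen as an assumed example of a classical model) and hand it, together with its gradient, to one generic quasi-Newton solver (SciPy's L-BFGS-B) instead of writing a model-specific solver:

```python
# Hedged sketch (NOT GENO's API): one generic solver applied to a
# declared objective. Ridge regression is used as a stand-in for
# "most classical machine learning problems" mentioned in the abstract.
import numpy as np
from scipy.optimize import minimize


def ridge_objective(w, X, y, lam):
    """Return f(w) = ||Xw - y||^2 + lam * ||w||^2 and its gradient."""
    r = X @ w - y
    f = r @ r + lam * (w @ w)
    g = 2.0 * (X.T @ r) + 2.0 * lam * w
    return f, g


rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
y = X @ rng.standard_normal(5)
lam = 0.1

# A single generic first-order/quasi-Newton method solves the problem;
# no model-specific solver code was written.
res = minimize(ridge_objective, np.zeros(5), args=(X, y, lam),
               jac=True, method="L-BFGS-B")

# Sanity check against the closed-form ridge solution
# (X^T X + lam I)^{-1} X^T y.
w_closed = np.linalg.solve(X.T @ X + lam * np.eye(5), X.T @ y)
```

Swapping in a different objective (lasso via a smoothed penalty, logistic regression, etc.) changes only the declared function, not the solver — which is the workflow the GENO framework automates from a problem declaration.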
Author Information
Sören Laue (Friedrich Schiller University Jena / Data Assessment Solutions)
Matthias Mitterreiter (Friedrich Schiller University Jena)
Joachim Giesen (Friedrich Schiller University Jena)
More from the Same Authors
- 2022 Poster: Convexity Certificates from Hessians
  Julien Klaus · Niklas Merk · Konstantin Wiedom · Sören Laue · Joachim Giesen
- 2019: Posters and Coffee
  Sameer Kumar · Tomasz Kornuta · Oleg Bakhteev · Hui Guan · Xiaomeng Dong · Minsik Cho · Sören Laue · Theodoros Vasiloudis · Andreea Anghel · Erik Wijmans · Zeyuan Shang · Oleksii Kuchaiev · Ji Lin · Susan Zhang · Ligeng Zhu · Beidi Chen · Vinu Joseph · Jialin Ding · Jonathan Raiman · Ahnjae Shin · Vithursan Thangarasa · Anush Sankaran · Akhil Mathur · Martino Dazzi · Markus Löning · Darryl Ho · Emanuel Zgraggen · Supun Nakandala · Tomasz Kornuta · Rita Kuznetsova
- 2019 Demonstration: GENO -- Optimization for Classical Machine Learning Made Fast and Easy
  Sören Laue · Matthias Mitterreiter · Joachim Giesen
- 2018 Poster: Computing Higher Order Derivatives of Matrix and Tensor Expressions
  Sören Laue · Matthias Mitterreiter · Joachim Giesen
- 2017 Demonstration: Matrix Calculus -- The Power of Symbolic Differentiation
  Sören Laue · Matthias Mitterreiter · Joachim Giesen
- 2012 Poster: Approximating Concavely Parameterized Optimization Problems
  Joachim Giesen · Jens K Mueller · Sören Laue · Sascha Swiercy
- 2012 Oral: Approximating Concavely Parameterized Optimization Problems
  Joachim Giesen · Jens K Mueller · Sören Laue · Sascha Swiercy