Poster
GENO -- GENeric Optimization for Classical Machine Learning
Sören Laue · Matthias Mitterreiter · Joachim Giesen

Thu Dec 12 05:00 PM -- 07:00 PM (PST) @ East Exhibition Hall B + C #181

Although optimization is the longstanding algorithmic backbone of machine learning, new models still require the time-consuming implementation of new solvers. As a result, there are thousands of implementations of optimization algorithms for machine learning problems. A natural question is whether it is always necessary to implement a new solver, or whether one algorithm could suffice for most models. Common belief suggests that such a one-algorithm-fits-all approach cannot work, because a single algorithm cannot exploit model-specific structure; at best, a generic algorithm cannot be both efficient and robust on a wide variety of problems. Here, we challenge this common belief. We have designed and implemented the optimization framework GENO (GENeric Optimization), which combines a modeling language with a generic solver. GENO takes the declaration of an optimization problem and generates a solver for the specified problem class. The framework is flexible enough to encompass most classical machine learning problems. We show on a wide variety of classical, as well as some recently proposed, problems that the automatically generated solvers are (1) as efficient as well-engineered specialized solvers, (2) more efficient by a decent margin than recent state-of-the-art solvers, and (3) orders of magnitude more efficient than classical modeling-language-plus-solver approaches.
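To make the "one generic solver" idea concrete, the sketch below hand-declares a single machine learning problem (L2-regularized logistic regression) as an objective plus gradient and hands it to an off-the-shelf quasi-Newton solver (SciPy's L-BFGS-B). This is only an illustration of the workflow the abstract describes; it is not the GENO modeling language, and the actual GENO-generated solvers differ.

```python
# Illustrative sketch only: NOT the GENO modeling language or its generated
# code. It mimics the one-algorithm-fits-all idea by passing a declared
# objective and gradient to a generic quasi-Newton solver (L-BFGS-B).
import numpy as np
from scipy.optimize import minimize


def make_problem(X, y, lam):
    """Return objective f(w) and gradient for L2-regularized logistic regression."""
    def f(w):
        z = X @ w
        # log(1 + exp(-y * z)) computed in a numerically stable way
        loss = np.sum(np.logaddexp(0.0, -y * z))
        return loss + 0.5 * lam * (w @ w)

    def grad(w):
        z = X @ w
        s = -y / (1.0 + np.exp(y * z))   # derivative of log(1 + exp(-y*z)) w.r.t. z
        return X.T @ s + lam * w

    return f, grad


def generic_solve(f, grad, n):
    """Minimize f over R^n with a generic quasi-Newton method."""
    res = minimize(f, np.zeros(n), jac=grad, method="L-BFGS-B")
    return res.x


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 10))
    w_true = rng.standard_normal(10)
    y = np.sign(X @ w_true + 0.1 * rng.standard_normal(200))
    f, grad = make_problem(X, y, lam=1.0)
    w_hat = generic_solve(f, grad, n=10)
    print("recovered weights:", np.round(w_hat, 3))
```

Swapping in a different model only requires declaring a new objective/gradient pair; the generic solver itself stays untouched, which is the point the abstract argues for.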

Author Information

Sören Laue (Friedrich Schiller University Jena / Data Assessment Solutions)
Matthias Mitterreiter (Friedrich Schiller University Jena)
Joachim Giesen (Friedrich Schiller University Jena)