We developed a novel method for training a binary classifier as a thresholded sum of weak classifiers. Training proceeds by solving a non-convex optimization over binary variables that seeks to minimize both the training error and the number of weak classifiers used; to our knowledge, cost functions of this form have not been studied before. Because this is formally an NP-hard optimization problem, it cannot be solved efficiently with classical solvers. We therefore employ a newly developed quantum processor that implements the quantum adiabatic algorithm to find good solutions to hard binary optimization problems. Once the detector is trained this way, it can run on classical hardware. On several datasets, we show that the new approach outperforms the state-of-the-art method AdaBoost.
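To make the optimization concrete, the sketch below illustrates the kind of objective described above: a squared training error plus a penalty proportional to the number of weak classifiers selected, minimized over binary selection variables. This is a hypothetical toy implementation, not the authors' method: it uses illustrative names (`qubo_cost`, `brute_force_select`) and solves the binary problem by exhaustive classical search, which is only feasible for a handful of weak classifiers, whereas the paper targets a quantum adiabatic solver for large instances.

```python
# Toy sketch (assumed formulation): select weak classifiers by minimizing
# squared training error plus a sparsity penalty over binary variables.
from itertools import product

def qubo_cost(w, H, y, lam):
    """Cost of binary selection w: squared loss + lam * (number selected).

    H[s][i] is the +/-1 output of weak classifier i on sample s;
    y[s] is the +/-1 label of sample s.
    """
    n = len(w)
    cost = 0.0
    for h_row, y_s in zip(H, y):
        margin = sum(wi * hi for wi, hi in zip(w, h_row)) / n
        cost += (margin - y_s) ** 2
    return cost + lam * sum(w)

def brute_force_select(H, y, lam=0.1):
    """Exhaustive search over all 2^n selections (small n only)."""
    n = len(H[0])
    return min(product((0, 1), repeat=n),
               key=lambda w: qubo_cost(w, H, y, lam))

def predict(w, h_row):
    """Thresholded sum of the selected weak classifiers."""
    s = sum(wi * hi for wi, hi in zip(w, h_row))
    return 1 if s >= 0 else -1
```

The sparsity term drives the solver toward small ensembles: given one perfectly correlated weak classifier alongside noisy ones, the minimizer selects only the good one, since adding the others raises both the loss and the penalty.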
Hartmut Neven (Google)