Generalization Guarantee of SGD for Pairwise Learning
Yunwen Lei · Mingrui Liu · Yiming Ying

Thu Dec 09 12:30 AM -- 02:00 AM (PST)

Recently, there has been growing interest in studying pairwise learning, since it includes many important machine learning tasks as specific examples, e.g., metric learning, AUC maximization, and ranking. While stochastic gradient descent (SGD) is an efficient method, its generalization behavior for pairwise learning remains understudied. In this paper, we present a systematic generalization analysis of SGD for pairwise learning to understand the balance between generalization and optimization. We develop a novel high-probability generalization bound for uniformly stable algorithms that incorporates variance information for better generalization; based on this bound, we establish the first nonsmooth learning algorithm to achieve almost optimal high-probability and dimension-independent generalization bounds in linear time. We consider both convex and nonconvex pairwise learning problems. Our stability analysis for convex problems shows how interpolation can help generalization. We establish a uniform convergence of gradients and apply it to derive the first generalization bounds on population gradients for nonconvex problems. Finally, we develop better generalization bounds for gradient-dominated problems.
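To make the setting concrete, here is a minimal sketch of SGD for pairwise learning: at each step a pair of examples is sampled and a gradient step is taken on a pairwise loss f(w; z_i, z_j). The specific loss below (a logistic surrogate for AUC maximization) and all function names are illustrative assumptions, not the algorithm analyzed in the paper.

```python
import numpy as np

def pairwise_sgd(X, y, loss_grad, lr=0.01, steps=1000, seed=0):
    """SGD for pairwise learning: each iteration samples a pair of
    examples and steps along the gradient of the pairwise loss."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        i, j = rng.choice(n, size=2, replace=False)  # sample a pair
        w -= lr * loss_grad(w, X[i], y[i], X[j], y[j])
    return w

def auc_logistic_grad(w, xi, yi, xj, yj):
    """Gradient of the logistic pairwise loss log(1 + exp(-w·(x+ - x-)))
    used as a smooth surrogate for AUC maximization (an assumed example,
    not the paper's specific objective)."""
    if yi == yj:                                  # only pos/neg pairs contribute
        return np.zeros_like(w)
    diff = xi - xj if yi > yj else xj - xi        # positive minus negative
    margin = w @ diff
    return -diff / (1.0 + np.exp(margin))         # d/dw log(1 + exp(-margin))
```

The key structural difference from pointwise SGD is that each stochastic gradient depends on two examples rather than one, which is what complicates the stability and generalization analysis studied in the paper.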

Author Information

Yunwen Lei (University of Birmingham)

I am currently a Lecturer at the School of Computer Science, University of Birmingham. Previously, I was a Humboldt Research Fellow at the University of Kaiserslautern, a Research Assistant Professor at the Southern University of Science and Technology, and a Postdoctoral Research Fellow at the City University of Hong Kong. I obtained my PhD degree in Computer Science at Wuhan University in 2014.

Mingrui Liu (George Mason University)
Yiming Ying (State University of New York at Albany)
