

Poster in Workshop: OPT 2022: Optimization for Machine Learning

Linear Convergence Analysis of Neural Collapse with Unconstrained Features

Peng Wang · Huikang Liu · Can Yaras · Laura Balzano · Qing Qu


Abstract:

In this work, we study the recently discovered neural collapse (NC) phenomenon, which is prevalent in training over-parameterized deep neural networks for classification tasks. Existing work has shown that, under the unconstrained feature model, every optimal solution of the training problem for classification tasks is an NC solution and that the loss has a benign landscape. However, these results do not answer the question of how quickly gradient descent can find an NC solution. To answer this question, we prove, under the unconstrained feature model, an error bound property of the training loss: an inequality that bounds the distance from a point to the optimal solution set by the norm of its gradient. Using this error bound, we establish linear convergence of gradient descent for finding an NC solution.
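For concreteness, the error bound described in the abstract can be sketched as follows. This is an illustrative rendering in standard notation, not the paper's exact statement: here $f$ stands for the training loss under the unconstrained feature model, $\mathcal{X}$ for its optimal solution set (the NC solutions), and $\kappa > 0$ for a constant, all of which are placeholders assumed for this sketch:

\[
\operatorname{dist}(x, \mathcal{X}) \;\le\; \kappa \,\bigl\|\nabla f(x)\bigr\| \qquad \text{for all } x \text{ in a neighborhood of } \mathcal{X}.
\]

In the standard error-bound argument, combining such an inequality with Lipschitz continuity of $\nabla f$ yields, for gradient descent $x_{k+1} = x_k - \eta \nabla f(x_k)$ with a suitably small step size $\eta$, a linear (geometric) rate of the form

\[
f(x_{k+1}) - f^\star \;\le\; \rho\,\bigl(f(x_k) - f^\star\bigr), \qquad \rho \in (0, 1),
\]

where $f^\star$ is the optimal value and the constants $\eta$ and $\rho$ are illustrative rather than taken from the paper.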
