SQ Lower Bounds for Learning Single Neurons with Massart Noise
Ilias Diakonikolas · Daniel Kane · Lisheng Ren · Yuxin Sun

Thu Dec 01 02:00 PM -- 04:00 PM (PST) @ Hall J #829
We study the problem of PAC learning a single neuron in the presence of Massart noise. Specifically, for a known activation function $f: \mathbb{R}\to \mathbb{R}$, the learner is given access to labeled examples $(\mathbf{x}, y) \in \mathbb{R}^d \times \mathbb{R}$, where the marginal distribution of $\mathbf{x}$ is arbitrary and the corresponding label $y$ is a Massart corruption of $f(\langle \mathbf{w}, \mathbf{x} \rangle)$. The goal of the learner is to output a hypothesis $h: \mathbb{R}^d \to \mathbb{R}$ with small squared loss. For a range of activation functions, including ReLUs, we establish super-polynomial Statistical Query (SQ) lower bounds for this learning problem. In more detail, we prove that no efficient SQ algorithm can approximate the optimal error within any constant factor. Our main technical contribution is a novel SQ-hard construction for learning $\{ \pm 1\}$-weight Massart halfspaces on the Boolean hypercube that is interesting in its own right.
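To make the data model concrete, the sketch below samples labeled examples from the Massart halfspace setting the abstract mentions: a $\{\pm 1\}$-weight halfspace on the Boolean hypercube whose clean label $\mathrm{sign}(\langle \mathbf{w}, \mathbf{x} \rangle)$ is flipped with an $\mathbf{x}$-dependent probability $\eta(\mathbf{x}) \le \eta < 1/2$. The weight vector and the particular flip-rate function are illustrative assumptions, not the paper's hard construction.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 11        # odd dimension so <w, x> is never zero on the hypercube
eta = 0.3     # uniform upper bound on the flip probability (must be < 1/2)

# Hypothetical {±1}-weight vector; the SQ-hard construction in the paper
# uses a specific family, this is only to illustrate the data model.
w = rng.choice([-1, 1], size=d)

def sample_massart_examples(n):
    """Draw n examples (x, y) from a Massart halfspace model on {±1}^d.

    x is uniform on the hypercube; the clean label sign(<w, x>) is flipped
    independently with probability eta(x) <= eta, where eta(x) may depend
    on x (here an arbitrary x-dependent rate stands in for the adversary).
    """
    X = rng.choice([-1, 1], size=(n, d))
    clean = np.sign(X @ w)
    # Illustrative adversarial choice: flip at the maximum rate eta on the
    # half-space where the first coordinate is +1, and never elsewhere.
    eta_x = eta * (X[:, 0] == 1)
    flips = rng.random(n) < eta_x
    y = np.where(flips, -clean, clean)
    return X, y

X, y = sample_massart_examples(1000)
agreement = np.mean(y == np.sign(X @ w))  # roughly 1 - eta/2 here
```

Because the flip rate is at most $\eta$ pointwise, the labels agree with the clean halfspace on at least a $1-\eta$ fraction of examples in expectation; the lower bound in the paper says that, despite this, no efficient SQ algorithm can approximate the optimal error within any constant factor.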

Author Information

Ilias Diakonikolas (University of Wisconsin-Madison)
Daniel Kane (UCSD)
Lisheng Ren (University of Wisconsin-Madison)
Yuxin Sun (University of Wisconsin-Madison)