
Strong Lottery Ticket Hypothesis with $\epsilon$–perturbation
Fangshuo Liao · Zheyang Xiong · Anastasios Kyrillidis
Event URL: https://openreview.net/forum?id=u1oRgxaAotJ
The strong Lottery Ticket Hypothesis (LTH) claims that a sufficiently large, randomly initialized neural network contains a subnetwork that approximates a given target network without any training. This work extends the theoretical guarantees of the strong LTH literature to a scenario closer to the original LTH by generalizing the weight change achieved in the pre-training step to a perturbation around the initialization. In particular, we focus on the following open questions: By allowing an $\epsilon$-scale perturbation of the random initial weights, can we reduce the over-parameterization requirement for the candidate network in the strong LTH? Furthermore, does the weight change produced by SGD coincide with a good set of such perturbations?
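The core intuition behind strong-LTH over-parameterization bounds can be illustrated with a subset-sum view: a single target weight is approximated by selecting (masking in) a subset of random candidate weights whose sum is close to it. The sketch below is illustrative only and is not the paper's construction; the brute-force search and all names are assumptions for exposition.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

def best_mask(candidates, target):
    """Brute-force over all binary masks; return the mask whose selected
    candidate weights sum closest to the target weight (subset-sum view
    of the strong LTH: pruning, not training, does the approximation)."""
    best, best_err = None, float("inf")
    for bits in itertools.product([0, 1], repeat=len(candidates)):
        err = abs(np.dot(bits, candidates) - target)
        if err < best_err:
            best, best_err = np.array(bits), err
    return best, best_err

# Random "initialization": n candidate weights; one target weight to hit.
candidates = rng.uniform(-1.0, 1.0, size=12)
target = 0.37
mask, err = best_mask(candidates, target)
print(mask, err)
```

As the number of random candidates grows, the achievable error shrinks rapidly; the paper's question is whether additionally allowing each candidate an $\epsilon$-scale perturbation lets one reach the same error with fewer candidates, i.e. less over-parameterization.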

Author Information

Fangshuo Liao (Rice University)
Zheyang Xiong (Rice University)
Anastasios Kyrillidis (Rice University)
