The Effect of Data Dimensionality on Neural Network Prunability
Zachary Ankner · Alex Renda · Gintare Karolina Dziugaite · Jonathan Frankle · Tian Jin
Event URL: https://openreview.net/forum?id=4OTBOcNkXBx

Practitioners often prune neural networks for efficiency gains and generalization improvements, but few scrutinize the factors that determine a network's prunability: the maximum fraction of weights that pruning can remove without compromising the model's test accuracy. In this work, we study the properties of input data that may contribute to the prunability of a neural network. For high-dimensional input data such as images, text, and audio, the manifold hypothesis suggests that these inputs actually lie on or near a significantly lower-dimensional manifold. Prior work demonstrates that the underlying low-dimensional structure of the input data may affect the sample efficiency of learning. In this paper, we investigate whether the low-dimensional structure of the input data affects the prunability of a neural network.
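As a concrete illustration of the quantity under study, the sketch below estimates prunability via one-shot, layer-wise magnitude pruning of a small MLP trained on synthetic inputs confined to a low-dimensional linear subspace of a high-dimensional ambient space. This is a minimal sketch under stated assumptions, not the authors' experimental setup: the data generator, the network, the pruning scheme, and the 1% accuracy tolerance are all illustrative choices.

```python
# Minimal sketch (not the paper's method): estimate "prunability" as the
# largest tested sparsity whose test accuracy stays within a tolerance of
# the dense baseline, on data with controllable intrinsic dimension.
import torch
import torch.nn as nn

def make_manifold_data(n, ambient_dim=100, intrinsic_dim=4, seed=0):
    """Sample inputs on a random intrinsic_dim-dimensional linear
    subspace of R^ambient_dim; labels come from a fixed linear teacher.
    (Illustrative stand-in for 'data on a low-dimensional manifold'.)"""
    g = torch.Generator().manual_seed(seed)
    basis = torch.randn(intrinsic_dim, ambient_dim, generator=g)
    z = torch.randn(n, intrinsic_dim, generator=g)
    x = z @ basis                       # points lie exactly on the subspace
    teacher = torch.randn(ambient_dim, generator=g)
    y = (x @ teacher > 0).long()        # binary labels
    return x, y

def accuracy(model, x, y):
    with torch.no_grad():
        return (model(x).argmax(dim=1) == y).float().mean().item()

def magnitude_prune(model, sparsity):
    """Zero the smallest-magnitude fraction of each linear layer's weights
    (one-shot, layer-wise; actual pruning pipelines often also retrain)."""
    for m in model.modules():
        if isinstance(m, nn.Linear):
            k = int(sparsity * m.weight.numel())
            if k > 0:
                thresh = m.weight.abs().flatten().kthvalue(k).values
                m.weight.data[m.weight.abs() <= thresh] = 0.0

def prunability(model, x_test, y_test, acc_drop=0.01):
    """Largest tested sparsity whose accuracy stays within acc_drop
    of the dense model's accuracy; acc_drop is an assumed tolerance."""
    dense_acc = accuracy(model, x_test, y_test)
    state = {k: v.clone() for k, v in model.state_dict().items()}
    best = 0.0
    for sparsity in [0.1 * i for i in range(1, 10)]:
        model.load_state_dict(state)    # restore dense weights each time
        magnitude_prune(model, sparsity)
        if accuracy(model, x_test, y_test) >= dense_acc - acc_drop:
            best = sparsity
    model.load_state_dict(state)
    return best

if __name__ == "__main__":
    x_tr, y_tr = make_manifold_data(4096, intrinsic_dim=4)
    x_te, y_te = make_manifold_data(1024, intrinsic_dim=4, seed=1)
    model = nn.Sequential(nn.Linear(100, 256), nn.ReLU(), nn.Linear(256, 2))
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(200):                # brief full-batch training loop
        opt.zero_grad()
        nn.functional.cross_entropy(model(x_tr), y_tr).backward()
        opt.step()
    print("prunability ≈", prunability(model, x_te, y_te))
```

Sweeping `intrinsic_dim` while holding the ambient dimension fixed would then probe the abstract's question of whether data with lower-dimensional structure admits more aggressive pruning.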

Author Information

Zachary Ankner (Massachusetts Institute of Technology)
Alex Renda (MIT)
Gintare Karolina Dziugaite (Google Research, Brain Team)
Jonathan Frankle (MIT CSAIL)
Tian Jin (Massachusetts Institute of Technology)
