Poster
Validating the Lottery Ticket Hypothesis with Inertial Manifold Theory
Zeru Zhang · Jiayin Jin · Zijie Zhang · Yang Zhou · Xin Zhao · Jiaxiang Ren · Ji Liu · Lingfei Wu · Ruoming Jin · Dejing Dou

Tue Dec 07 04:30 PM -- 06:00 PM (PST)

Despite achieving remarkable efficiency, traditional network pruning techniques often follow manually crafted heuristics to generate pruned sparse networks. Such heuristic pruning strategies can hardly guarantee that the pruned networks achieve test accuracy comparable to the original dense ones. Recent works have empirically identified and verified the Lottery Ticket Hypothesis (LTH): a randomly initialized dense neural network contains an extremely sparse subnetwork that can be trained to achieve accuracy similar to that of the dense network. Lacking theoretical support, these works typically need to run multiple rounds of expensive training and pruning over the original large networks to discover sparse subnetworks with low accuracy loss. By leveraging dynamical systems theory and inertial manifold theory, this work theoretically verifies the validity of the LTH. We explore the possibility of theoretically lossless pruning as well as one-time pruning, in contrast to existing neural network pruning and LTH techniques. We reformulate the neural network optimization problem as a gradient dynamical system and reduce this high-dimensional system onto inertial manifolds to obtain a low-dimensional system that describes the pruned subnetworks. We establish the precondition for and the existence of pruned subnetworks, and prune the original networks at the gap in their spectrum that yields subnetworks of the smallest dimension.
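
To make the reduction concrete, the following is a minimal sketch of the classical inertial manifold setup that this kind of argument builds on; the notation here (u for the network parameters, E for the training loss) is illustrative, and the exact spectral gap condition used in the paper may differ in its constants. Training by gradient descent is viewed as the gradient flow

\[ \frac{du}{dt} = -\nabla E(u), \qquad u \in \mathbb{R}^n. \]

Splitting the right-hand side into a dominant self-adjoint linear part A and a Lipschitz nonlinearity f, so that \( \dot{u} + Au = f(u) \), and ordering the eigenvalues of A as \( \lambda_1 \le \lambda_2 \le \cdots \), a spectral gap condition of the form

\[ \lambda_{m+1} - \lambda_m > c \, \mathrm{Lip}(f) \]

(with a constant c depending on the precise formulation) guarantees the existence of an m-dimensional inertial manifold \( \mathcal{M} = \mathrm{graph}(\Phi) \) that attracts every trajectory exponentially fast. On \( \mathcal{M} \), the dynamics reduce to the low-dimensional inertial form

\[ \frac{dp}{dt} + Ap = P f\big(p + \Phi(p)\big), \]

where P is the projection onto the first m eigendirections. The inertial form plays the role of training the pruned subnetwork, and choosing m at the largest gap in the spectrum gives the subnetwork of smallest dimension.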

Author Information

Zeru Zhang (Auburn University)
Jiayin Jin (Auburn University)
Zijie Zhang (Auburn University)
Yang Zhou (Auburn University)
Xin Zhao (Auburn University)
Jiaxiang Ren (Auburn University)
Ji Liu (Baidu)
Lingfei Wu (JD.COM Silicon Valley Research Center)

Dr. Lingfei Wu earned his Ph.D. degree in computer science from the College of William and Mary in 2016. He is a research staff member at IBM Research and leads a research team (10+ RSMs) developing novel Graph Neural Networks for various tasks, which has led to the #1 AI Challenge Project in IBM Research and multiple IBM awards, including the Outstanding Technical Achievement Award. He has published more than 70 top-ranked conference and journal papers and is a co-inventor of more than 30 filed US patents. Because of the high commercial value of his patents, he has received several invention achievement awards and was appointed an IBM Master Inventor, class of 2020. He was the recipient of Best Paper and Best Student Paper Awards at several venues, such as IEEE ICC'19, the AAAI workshop on DLGMA'20, and the KDD workshop on DLG'19. His research has been featured in numerous media outlets, including Nature News, Yahoo News, VentureBeat, and TechTalks. He has co-organized 10+ conference events (AAAI, IEEE BigData) and is the founding co-chair of the Workshops on Deep Learning on Graphs (at AAAI'21, AAAI'20, KDD'20, KDD'19, and IEEE BigData'19). He currently serves as an Associate Editor for IEEE Transactions on Neural Networks and Learning Systems, ACM Transactions on Knowledge Discovery from Data, and the International Journal of Intelligent Systems, and regularly serves as an SPC/PC member of major AI/ML/NLP conferences, including KDD, IJCAI, AAAI, NIPS, ICML, ICLR, and ACL.

Ruoming Jin (Kent State University)
Dejing Dou (University of Oregon, USA)
