Poster
Algorithmic Analysis and Statistical Estimation of SLOPE via Approximate Message Passing
Zhiqi Bu · Jason Klusowski · Cynthia Rush · Weijie Su
Thu Dec 12 05:00 PM -- 07:00 PM (PST) @ East Exhibition Hall B + C #40
SLOPE is a relatively new convex optimization procedure for high-dimensional linear regression via the sorted $\ell_1$ penalty: the larger the rank of the fitted coefficient, the larger the penalty. This non-separable penalty renders many existing techniques invalid or inconclusive in analyzing the SLOPE solution. In this paper, we develop an asymptotically exact characterization of the SLOPE solution under Gaussian random designs through solving the SLOPE problem using approximate message passing (AMP). This algorithmic approach allows us to approximate the SLOPE solution via the much more amenable AMP iterates. Explicitly, we characterize the asymptotic dynamics of the AMP iterates relying on a recently developed state evolution analysis for non-separable penalties, thereby overcoming the difficulty caused by the sorted $\ell_1$ penalty. Moreover, we prove that the AMP iterates converge to the SLOPE solution in an asymptotic sense, and numerical simulations show that the convergence is surprisingly fast. Our proof rests on a novel technique that specifically leverages the SLOPE problem. In contrast to prior literature, our work not only yields an asymptotically sharp analysis but also offers an algorithmic, flexible, and constructive approach to understanding the SLOPE problem.
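To make the sorted $\ell_1$ penalty concrete, the following is a minimal NumPy sketch (not the authors' code) of the penalty itself and of its proximal operator, the per-iteration building block when SLOPE is solved by AMP or proximal gradient methods. The function names `sorted_l1_penalty` and `prox_sorted_l1` are illustrative, and the prox is computed here via the standard pool-adjacent-violators (isotonic regression) reduction.

```python
import numpy as np

def sorted_l1_penalty(beta, lam):
    """Sorted l1 penalty J(beta) = sum_i lam_i * |beta|_(i), where
    |beta|_(1) >= ... >= |beta|_(p) and lam_1 >= ... >= lam_p:
    the larger the rank of a coefficient, the larger its penalty weight."""
    abs_desc = np.sort(np.abs(beta))[::-1]           # magnitudes, decreasing
    lam = np.sort(np.asarray(lam, float))[::-1]      # enforce decreasing weights
    return float(np.dot(lam, abs_desc))

def prox_sorted_l1(v, lam):
    """Proximal operator of the sorted l1 penalty (illustrative sketch),
    computed by isotonic regression of |v|_(i) - lam_i onto nonincreasing
    sequences via the pool-adjacent-violators algorithm, then clipping at 0."""
    v = np.asarray(v, float)
    lam = np.sort(np.asarray(lam, float))[::-1]
    order = np.argsort(np.abs(v))[::-1]              # indices sorting |v| descending
    u = np.abs(v)[order] - lam                       # shifted sorted magnitudes
    vals, wts = [], []                               # pooled block values / sizes
    for ui in u:
        vals.append(ui); wts.append(1.0)
        while len(vals) > 1 and vals[-2] < vals[-1]:  # violates nonincreasing order
            v2, w2 = vals.pop(), wts.pop()
            vals[-1] = (vals[-1] * wts[-1] + v2 * w2) / (wts[-1] + w2)
            wts[-1] += w2
    w = np.concatenate([np.full(int(k), x) for x, k in zip(vals, wts)])
    w = np.maximum(w, 0.0)                           # soft-threshold at zero
    out = np.zeros_like(v)
    out[order] = w                                   # undo the sort
    return np.sign(v) * out

beta = np.array([0.5, -2.0, 1.0])
lam = np.array([3.0, 2.0, 1.0])
sorted_l1_penalty(beta, lam)                  # 3*2 + 2*1 + 1*0.5 = 8.5
prox_sorted_l1(np.array([2.0, 2.0]), lam[:2])  # ties are pooled by PAVA
```

Note the contrast with the Lasso: because the weights depend on the ranks of the magnitudes, the penalty does not decompose coordinate-wise, which is exactly the non-separability that the paper's state evolution analysis has to overcome.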
Author Information
Zhiqi Bu (University of Pennsylvania)
Jason Klusowski (Rutgers University)
Cynthia Rush (Columbia University)
Weijie Su (The Wharton School, University of Pennsylvania)
More from the Same Authors
- 2021 Spotlight: A Central Limit Theorem for Differentially Private Query Answering
  Jinshuo Dong · Weijie Su · Linjun Zhang
- 2022: Differentially Private Bias-Term only Fine-tuning of Foundation Models
  Zhiqi Bu · Yu-Xiang Wang · Sheng Zha · George Karypis
- 2022: Contributed Talk: Differentially Private Bias-Term only Fine-tuning of Foundation Models
  Zhiqi Bu · Yu-Xiang Wang · Sheng Zha · George Karypis
- 2022 Poster: The alignment property of SGD noise and how it helps select flat minima: A stability analysis
  Lei Wu · Mingze Wang · Weijie Su
- 2022 Poster: Scalable and Efficient Training of Large Convolutional Neural Networks with Differential Privacy
  Zhiqi Bu · Jialin Mao · Shiyun Xu
- 2021 Poster: Fast and Memory Efficient Differentially Private-SGD via JL Projections
  Zhiqi Bu · Sivakanth Gopi · Janardhan Kulkarni · Yin Tat Lee · Judy Hanwen Shen · Uthaipon Tantipongpipat
- 2021 Poster: A Central Limit Theorem for Differentially Private Query Answering
  Jinshuo Dong · Weijie Su · Linjun Zhang
- 2021 Poster: You Are the Best Reviewer of Your Own Papers: An Owner-Assisted Scoring Mechanism
  Weijie Su
- 2021 Poster: Imitating Deep Learning Dynamics via Locally Elastic Stochastic Differential Equations
  Jiayao Zhang · Hua Wang · Weijie Su
- 2020: Poster Session 3 (gather.town)
  Denny Wu · Chengrun Yang · Tolga Ergen · sanae lotfi · Charles Guille-Escuret · Boris Ginsburg · Hanbake Lyu · Cong Xie · David Newton · Debraj Basu · Yewen Wang · James Lucas · MAOJIA LI · Lijun Ding · Jose Javier Gonzalez Ortiz · Reyhane Askari Hemmat · Zhiqi Bu · Neal Lawton · Kiran Thekumparampil · Jiaming Liang · Lindon Roberts · Jingyi Zhu · Dongruo Zhou
- 2020 Poster: Label-Aware Neural Tangent Kernel: Toward Better Generalization and Local Elasticity
  Shuxiao Chen · Hangfeng He · Weijie Su
- 2020 Poster: The Complete Lasso Tradeoff Diagram
  Hua Wang · Yachong Yang · Zhiqi Bu · Weijie Su
- 2020 Spotlight: The Complete Lasso Tradeoff Diagram
  Hua Wang · Yachong Yang · Zhiqi Bu · Weijie Su
- 2019 Poster: Acceleration via Symplectic Discretization of High-Resolution Differential Equations
  Bin Shi · Simon Du · Weijie Su · Michael Jordan