Consider the problem of minimizing the sum of a smooth (possibly non-convex) function and a convex (possibly nonsmooth) function involving a large number of variables. A popular approach to this problem is the block coordinate descent (BCD) method, whereby at each iteration only one block of variables is updated while the remaining blocks are held fixed. With recent advances in multi-core parallel processing technology, it is desirable to parallelize the BCD method by allowing multiple blocks to be updated simultaneously at each iteration. In this work, we propose an inexact parallel BCD approach in which, at each iteration, a subset of the variables is updated in parallel by minimizing convex approximations of the original objective function. We investigate the convergence of this parallel BCD method for both randomized and cyclic variable selection rules, and we analyze its asymptotic and non-asymptotic convergence behavior for both convex and non-convex objective functions. Numerical experiments suggest that, for the special case of the Lasso problem, the cyclic block selection rule can outperform the randomized rule.
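To make the iteration concrete, below is a minimal Python sketch of a parallel proximal BCD step for the Lasso special case mentioned in the abstract, $\min_x \tfrac{1}{2}\|Ax-b\|^2 + \lambda\|x\|_1$. This is an illustrative assumption of what one such iteration could look like, not the paper's actual implementation; the function names (`parallel_bcd_lasso`, `soft_threshold`), block sizes, and step-size choices are all hypothetical.

```python
import numpy as np


def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)


def parallel_bcd_lasso(A, b, lam, n_blocks=8, p=2, rule="cyclic",
                       n_iters=200, seed=0):
    """Sketch of inexact parallel proximal BCD for
    min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    At each iteration, p blocks are selected (cyclically or uniformly at
    random) and updated from the same residual snapshot, mimicking a
    simultaneous (Jacobi-style) parallel update by multiple workers.
    """
    rng = np.random.default_rng(seed)
    _, n = A.shape
    x = np.zeros(n)
    blocks = np.array_split(np.arange(n), n_blocks)
    # Per-block step sizes from the block Lipschitz constants of the
    # gradient of the smooth part; when many correlated blocks move at
    # once, a more conservative step may be needed for stability.
    L = [np.linalg.norm(A[:, blk], 2) ** 2 for blk in blocks]
    for k in range(n_iters):
        if rule == "cyclic":
            chosen = [(k * p + j) % n_blocks for j in range(p)]
        else:  # "randomized" selection rule
            chosen = rng.choice(n_blocks, size=p, replace=False)
        resid = A @ x - b  # snapshot shared by all selected blocks
        for i in chosen:   # conceptually executed by parallel workers
            blk = blocks[i]
            grad = A[:, blk].T @ resid  # block gradient of the smooth part
            # Minimize the convex quadratic approximation plus the l1 term:
            x[blk] = soft_threshold(x[blk] - grad / L[i], lam / L[i])
    return x
```

The key design point is that all selected blocks read the same residual snapshot before any of them writes its update; this is what makes the update parallelizable across workers, at the cost of requiring smaller steps when the selected blocks are strongly coupled. With `p=1`, the sketch reduces to standard sequential proximal BCD.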
Author Information
Meisam Razaviyayn (University of Minnesota)
Mingyi Hong (Iowa State University)
Zhi-Quan Luo (University of Minnesota, Twin Cities)
Jong-Shi Pang
More from the Same Authors
- 2023 Poster: PAC-Bayesian Spectrally-Normalized Bounds for Adversarially Robust Generalization »
  Jiancong Xiao · Ruoyu Sun · Zhi-Quan Luo
- 2023 Poster: Imitation Learning from Imperfection: Theoretical Justifications and Algorithms »
  Ziniu Li · Tian Xu · Zeyu Qin · Yang Yu · Zhi-Quan Luo
- 2022 Spotlight: Stability Analysis and Generalization Bounds of Adversarial Training »
  Jiancong Xiao · Yanbo Fan · Ruoyu Sun · Jue Wang · Zhi-Quan Luo
- 2022 Spotlight: Adam Can Converge Without Any Modification On Update Rules »
  Yushun Zhang · Congliang Chen · Naichen Shi · Ruoyu Sun · Zhi-Quan Luo
- 2022 Spotlight: Lightning Talks 6B-1 »
  Yushun Zhang · Duc Nguyen · Jiancong Xiao · Wei Jiang · Yaohua Wang · Yilun Xu · Zhen LI · Anderson Ye Zhang · Ziming Liu · Fangyi Zhang · Gilles Stoltz · Congliang Chen · Gang Li · Yanbo Fan · Ruoyu Sun · Naichen Shi · Yibo Wang · Ming Lin · Max Tegmark · Lijun Zhang · Jue Wang · Ruoyu Sun · Tommi Jaakkola · Senzhang Wang · Zhi-Quan Luo · Xiuyu Sun · Zhi-Quan Luo · Tianbao Yang · Rong Jin
- 2022 Poster: Adam Can Converge Without Any Modification On Update Rules »
  Yushun Zhang · Congliang Chen · Naichen Shi · Ruoyu Sun · Zhi-Quan Luo
- 2022 Poster: Stability Analysis and Generalization Bounds of Adversarial Training »
  Jiancong Xiao · Yanbo Fan · Ruoyu Sun · Jue Wang · Zhi-Quan Luo
- 2021 Poster: When Expressivity Meets Trainability: Fewer than $n$ Neurons Can Work »
  Jiawei Zhang · Yushun Zhang · Mingyi Hong · Ruoyu Sun · Zhi-Quan Luo
- 2016 Poster: NESTT: A Nonconvex Primal-Dual Splitting Method for Distributed and Stochastic Optimization »
  Davood Hajinezhad · Mingyi Hong · Tuo Zhao · Zhaoran Wang
- 2015 Poster: Improved Iteration Complexity Bounds of Cyclic Block Coordinate Descent for Convex Problems »
  Ruoyu Sun · Mingyi Hong
- 2014 Poster: Parallel Direction Method of Multipliers »
  Huahua Wang · Arindam Banerjee · Zhi-Quan Luo
- 2013 Poster: On the Linear Convergence of the Proximal Gradient Method for Trace Norm Regularization »
  Ke Hou · Zirui Zhou · Anthony Man-Cho So · Zhi-Quan Luo