In this paper, we propose a quantized learning equation with a monotonically increasing quantization resolution, together with a stochastic analysis of the proposed algorithm. Under the white noise hypothesis, which applies when the quantization error is dense and uniformly distributed, we can regard the quantization error as i.i.d. white noise. Based on this, we show that the learning equation with monotonically increasing quantization resolution converges weakly, i.e., in the sense of distributions. The analysis in this paper shows that global optimization is possible on a domain satisfying only a Lipschitz condition, rather than relying on local convergence properties such as Hessian constraints on the objective function.
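The core idea of the abstract can be illustrated with a minimal sketch: gradient descent whose iterates are uniformly quantized, with the quantizer's resolution growing monotonically over iterations so that the quantization noise shrinks over time. All names, the linear resolution schedule, and the toy objective below are illustrative assumptions, not the paper's actual algorithm or schedule.

```python
import numpy as np

def quantize(x, resolution):
    # Uniform quantizer with step size 1/resolution; the rounding
    # error is the "quantization noise" treated as white noise.
    return np.round(x * resolution) / resolution

def quantized_gd(grad, x0, steps=200, lr=0.1, base_res=4.0):
    # Gradient descent where each updated iterate is quantized.
    # The resolution grows linearly with the iteration count
    # (hypothetical schedule chosen only for illustration), so the
    # quantization error decays as the iteration proceeds.
    x = float(x0)
    for t in range(1, steps + 1):
        res = base_res * t          # monotonically increasing resolution
        x = float(quantize(x - lr * grad(x), res))
    return x

# Toy example: minimize f(x) = (x - 1)^2, whose gradient is 2(x - 1).
x_star = quantized_gd(lambda x: 2.0 * (x - 1.0), x0=5.0)
```

With the step size above, the unquantized iteration contracts toward the minimizer at rate 0.8 per step, and because the quantization step 1/(base_res * t) shrinks, the quantized iterate settles near x = 1 rather than stalling on a coarse grid.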
Jinwuk Seok (Electronics and Telecommunications Research Institute)
Jinwuk Seok received the B.S. and M.S. degrees in electrical control engineering in Seoul, Korea, in 1993 and 1995, respectively, and the Ph.D. degree in electrical engineering, also in Seoul, Korea, in 1998. He has been a principal member of engineering staff at the Electronics and Telecommunications Research Institute (ETRI) in Korea since 2000, and an adjunct professor in the Computer Software Engineering Department at the University of Science and Technology (UST) in Korea since 2009. His research interests include video compression, machine learning, and stochastic nonlinear control.