Poster in Workshop: OPT 2021: Optimization for Machine Learning

Stochastic Learning Equation using Monotone Increasing Resolution of Quantization

Jinwuk Seok


Abstract:

In this paper, we propose a quantized learning equation with a monotonically increasing resolution of quantization, together with a stochastic analysis of the proposed algorithm. Under the white-noise hypothesis for quantization error with a dense, uniform distribution, the quantization error can be regarded as i.i.d. white noise. Based on this, we show that the learning equation with monotonically increasing quantization resolution converges weakly, i.e., in the sense of distributions. The analysis in this paper shows that global optimization is possible on a domain satisfying a Lipschitz condition, rather than only local convergence under conditions such as a Hessian constraint on the objective function.
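The abstract does not give the update rule explicitly, so the following is only an illustrative sketch of the general idea: gradient descent whose iterates pass through a uniform quantizer whose resolution grows over time, so the quantization error shrinks like annealed noise. The schedule (`2 ** (t // 20)`), learning rate, and test function are assumptions for illustration, not the authors' algorithm.

```python
import numpy as np

def quantize(x, resolution):
    """Uniform quantizer: round x to the nearest multiple of 1/resolution."""
    return np.round(x * resolution) / resolution

def quantized_gd(grad, x0, lr=0.1, steps=200):
    """Gradient descent with quantized iterates.

    The resolution increases monotonically with the iteration count, so the
    quantization error (treated as white noise in the paper's analysis)
    decays over time, analogous to an annealing schedule.
    """
    x = np.asarray(x0, dtype=float)
    for t in range(1, steps + 1):
        resolution = 2 ** (t // 20)  # hypothetical monotone schedule
        x = quantize(x - lr * grad(x), resolution)
    return x

# Usage: minimize f(x) = (x - 1)^2, whose gradient is 2(x - 1).
x_star = quantized_gd(lambda x: 2.0 * (x - 1.0), x0=5.0)
```

Early on, the coarse grid lets the iterate jump between distant grid points; as the resolution grows, the grid refines around the minimizer, which is the mechanism the weak-convergence analysis formalizes.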