Poster in Workshop: OPT 2021: Optimization for Machine Learning

Barzilai and Borwein conjugate gradient method equipped with a non-monotone line search technique

Sajad Fathi Hafshejani · Daya Gaur · Shahadat Hossain · Robert Benkoczi


Abstract:

In this paper, we propose a new non-monotone conjugate gradient method for solving unconstrained nonlinear optimization problems. We first modify the non-monotone line search by introducing a new trigonometric function to compute the non-monotone parameter, which plays an essential role in the algorithm's efficiency. We then use a convex combination of the two Barzilai-Borwein step sizes \cite{Barzilai} to compute the step size at each iteration. Under suitable assumptions, we prove that the new algorithm is globally convergent. The efficiency and effectiveness of the proposed method are demonstrated in practice by applying the algorithm to standard test problems and to non-negative matrix factorization problems.
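
The abstract combines two standard ingredients: a non-monotone line search whose reference value is controlled by a non-monotone parameter, and a step size built from a convex combination of the two Barzilai-Borwein formulas. The sketch below illustrates that general framework only, under assumptions of mine: a Zhang-Hager-style reference value C_k, a fixed non-monotone parameter `eta` (the paper derives it from a trigonometric function not given in the abstract), a plain steepest-descent direction in place of the paper's conjugate gradient direction, and an arbitrary mixing weight `theta=0.5`. It is not the authors' algorithm.

```python
import numpy as np

def rosenbrock(x):
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def rosenbrock_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])

def bb_convex_step(s, y, theta, lo=1e-10, hi=1e10):
    """Convex combination of the two Barzilai-Borwein step sizes:
    alpha_BB1 = s^T s / s^T y,  alpha_BB2 = s^T y / y^T y."""
    sy = float(s @ y)
    if sy <= 0.0:                         # curvature safeguard
        return 1.0
    bb1 = float(s @ s) / sy
    bb2 = sy / float(y @ y)
    return float(np.clip(theta * bb1 + (1.0 - theta) * bb2, lo, hi))

def nonmonotone_bb_descent(f, grad, x0, max_iter=2000, tol=1e-6,
                           delta=1e-4, eta=0.85, theta=0.5):
    """Illustrative non-monotone descent with a BB-based trial step.
    eta and theta are placeholder constants, not the paper's choices."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    C, Q = f(x), 1.0                      # non-monotone reference value C_k
    alpha = 1.0
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -g                            # placeholder for the CG direction
        t = alpha
        # non-monotone Armijo condition: f(x + t d) <= C_k + delta * t * g^T d
        for _ in range(60):
            if f(x + t * d) <= C + delta * t * (g @ d):
                break
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        # update the non-monotone reference value (Zhang-Hager recursion)
        Q_new = eta * Q + 1.0
        C = (eta * Q * C + f(x_new)) / Q_new
        Q = Q_new
        # BB-based trial step for the next iteration
        alpha = bb_convex_step(x_new - x, g_new - g, theta)
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    print(nonmonotone_bb_descent(rosenbrock, rosenbrock_grad, [-1.2, 1.0]))
```

The non-monotone condition accepts steps that increase the objective relative to the latest iterate as long as they decrease it relative to the reference value C_k, which is what lets large BB-type steps survive the line search.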