Inexact reduced gradient methods in smooth nonconvex optimization
Dat Tran

Abstract: The talk introduces new gradient-type methods that use inexact
gradient information to find stationary points of nonconvex continuously
differentiable functions on finite-dimensional spaces. A general scheme for
inexact reduced gradient (IRG) methods with different stepsize selections is
proposed to construct sequences of iterates with stationary accumulation
points. Convergence results, including convergence rates, are established
for the developed IRG methods under the Kurdyka-Łojasiewicz property.
Numerical experiments confirm the efficiency of the proposed algorithms.
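
To make the idea concrete, below is a minimal Python sketch of a gradient method driven by an inexact gradient oracle. It only illustrates the flavor of such schemes and is not the IRG algorithm from the talk: the Rosenbrock test function, the bounded-error oracle, the Armijo backtracking stepsize, and the rule that halves the oracle tolerance eps whenever the approximate gradient norm drops below 2*eps are all assumptions made for this example.

```python
import numpy as np

def f(x):
    # Smooth nonconvex test function (illustrative choice):
    # the two-dimensional Rosenbrock function.
    return (1.0 - x[0]) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2

def grad_f(x):
    # Exact gradient of f; the method below only ever sees a
    # corrupted version of it.
    return np.array([
        -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0] ** 2),
        200.0 * (x[1] - x[0] ** 2),
    ])

def inexact_grad(x, eps, rng):
    # Inexact oracle: the exact gradient plus an error of norm <= eps,
    # modeling the inexact gradient information of the abstract.
    e = rng.standard_normal(x.shape)
    e *= eps / max(np.linalg.norm(e), 1e-12)
    return grad_f(x) + e

def armijo_step(x, g, t0=1.0, beta=0.5, c=1e-4, max_back=60):
    # Backtracking (Armijo) line search along -g: one possible
    # stepsize selection among those the abstract alludes to.
    t, fx, gg = t0, f(x), float(g @ g)
    for _ in range(max_back):
        if f(x - t * g) <= fx - c * t * gg:
            break
        t *= beta
    return t

def irg_sketch(x0, eps0=1e-1, q=0.5, tol=1e-6, max_iter=100_000, seed=0):
    # Hypothetical driver: the accuracy schedule eps <- q * eps,
    # triggered whenever the approximate gradient norm drops below
    # 2 * eps, is an illustrative rule, not the talk's IRG scheme.
    rng = np.random.default_rng(seed)
    x, eps = np.asarray(x0, dtype=float), eps0
    for _ in range(max_iter):
        g = inexact_grad(x, eps, rng)
        if np.linalg.norm(g) <= 2.0 * eps:
            # The error may dominate the signal: tighten the oracle
            # accuracy instead of taking a step.
            eps *= q
            if eps < tol:
                break
            continue
        # With ||g|| > 2*eps the error cannot flip the sign of
        # <grad_f(x), g>, so -g is a genuine descent direction.
        x = x - armijo_step(x, g) * g
    return x

if __name__ == "__main__":
    x_star = irg_sketch([-1.2, 1.0])
    print("approximate stationary point:", x_star)
    print("exact gradient norm:", np.linalg.norm(grad_f(x_star)))
```

The guard ||g|| > 2*eps is what makes the sketch sound: it guarantees the bounded error cannot reverse the inner product with the true gradient, so the backtracking search always terminates, and the iterates accumulate only at points where the exact gradient is small.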