
Sebastian Fodor: Understanding Convergence of Accelerated Gradient Descent Methods

BSc thesis presentation

Time: Wed 2020-06-03 08.30 - 09.30

Location: Zoom, meeting ID: 696 0792 4408

Participating: Sebastian Fodor

Supervisor: Yishao Zhou


Abstract

In this paper we study a generalized version of Nesterov's Accelerated Gradient Method (NAG) that has three parameters instead of two, with the additional parameter controlling the amount of gradient correction. This gives the Generalized Nesterov's Accelerated Gradient Method (GNAG), of which both NAG and Polyak's heavy-ball method are special cases. We derive a differential equation that approximates this method and show that it converges at a linear rate for strongly convex functions with Lipschitz continuous gradients. We also consider GNAG as a dynamical system and show that this dynamical system converges at a linear rate to a steady state which is the optimal solution. We ask whether GNAG converges faster than NAG for certain choices of the gradient correction parameter, and through numerical examples conclude that a higher gradient correction parameter can lead to faster convergence.
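To make the idea of a three-parameter method with a gradient-correction term concrete, the sketch below shows one common way such an iteration can be written. The update rule, the parameter names (step size s, momentum beta, gradient correction gamma) and the test problem are illustrative assumptions and are not taken from the thesis itself; they only indicate how gamma = 0 recovers a heavy-ball-type update and gamma = beta recovers a standard single-sequence form of NAG.

import numpy as np

def gnag(grad, x0, s, beta, gamma, n_iter=500):
    """Sketch of a generalized accelerated gradient iteration (assumed form):
        x_{k+1} = x_k + beta*(x_k - x_{k-1})
                  - s*grad(x_k) - gamma*s*(grad(x_k) - grad(x_{k-1}))
    gamma = 0 gives a heavy-ball-type update; gamma = beta matches one
    standard single-sequence form of Nesterov's method."""
    x_prev = x0.copy()
    x = x0.copy()
    g_prev = grad(x_prev)
    for _ in range(n_iter):
        g = grad(x)
        x_next = x + beta * (x - x_prev) - s * g - gamma * s * (g - g_prev)
        x_prev, x, g_prev = x, x_next, g
    return x

if __name__ == "__main__":
    # Strongly convex quadratic f(x) = 0.5 * x^T A x with Lipschitz gradient;
    # the minimizer is the origin, so the printed norm should be near zero.
    A = np.diag([1.0, 10.0])
    grad = lambda x: A @ x
    x_final = gnag(grad, np.array([5.0, 5.0]), s=0.05, beta=0.8, gamma=0.8)
    print(np.linalg.norm(x_final))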