
Accelerated Convergence of Gradient Descent Using Adaptive Parameters

Citation

Mills, Matthew. (2022-05). Accelerated Convergence of Gradient Descent Using Adaptive Parameters. Theses and Dissertations Collection, University of Idaho Library Digital Collections. https://www.lib.uidaho.edu/digital/etd/items/mills_idaho_0089n_12373.html

Title:
Accelerated Convergence of Gradient Descent Using Adaptive Parameters
Author:
Mills, Matthew
Date:
2022-05
Program:
Mathematics & Statistical Sci
Subject Category:
Mathematics
Abstract:

The Nesterov gradient descent algorithm serves as a performance benchmark for convex optimization problems. Like many other gradient-based methods, the Nesterov algorithm requires choosing a constant step size before optimization begins, and its performance depends heavily on that choice. Here, we propose three novel algorithms that adaptively determine the step size based on the search history. The new adaptive methods were tested alongside the original Nesterov algorithm on a list of commonly used optimization test functions in a range of dimensions. In these experiments, the adaptive methods consistently outperformed the Nesterov algorithm by a wide margin. We also discuss ways in which the adaptive methods could be improved.
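The thesis itself (and its three adaptive variants) is in the full PDF; as a minimal sketch of the fixed-step baseline the abstract refers to, the following Python code implements one standard formulation of Nesterov accelerated gradient descent with a constant step size. The function name, parameters, and the quadratic test objective are illustrative assumptions, not taken from the thesis.

    import numpy as np

    def nesterov_gd(grad, x0, step_size, n_iters=1000):
        """Nesterov accelerated gradient descent with a constant step size.

        grad      : callable returning the gradient of the objective at a point
        x0        : starting point (array-like)
        step_size : the fixed step size chosen before optimization begins
        """
        x_prev = np.asarray(x0, dtype=float)
        y = x_prev.copy()
        for k in range(1, n_iters + 1):
            # Gradient step taken from the extrapolated (lookahead) point
            x = y - step_size * grad(y)
            # Momentum extrapolation; the coefficient (k-1)/(k+2) grows toward 1
            y = x + (k - 1) / (k + 2) * (x - x_prev)
            x_prev = x
        return x_prev

    # Example: minimize f(x) = ||x||^2 / 2, whose gradient is simply x.
    x_star = nesterov_gd(lambda x: x, x0=np.ones(10), step_size=0.1)

As the abstract notes, the quality of the result hinges on step_size, which is held fixed throughout the run; the adaptive methods proposed in the thesis replace this constant with a value updated from the search history.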

Description:
masters, M.S., Mathematics & Statistical Sci -- University of Idaho - College of Graduate Studies, 2022-05
Major Professor:
Gao, Fuchang
Committee:
Barannyk, Lyudmyla; Nguyen, Linh; Abo, Hirotachi
Defense Date:
2022-05
Identifier:
Mills_idaho_0089N_12373
Type:
Text
Format Original:
PDF
Format:
application/pdf


Rights
Rights:
In Copyright - Educational Use Permitted. For more information, please contact University of Idaho Library Special Collections and Archives Department at libspec@uidaho.edu.
Standardized Rights:
http://rightsstatements.org/vocab/InC-EDU/1.0/