A robust accelerated optimization algorithm for strongly convex functions
2018 Annual American Control Conference (ACC), 2018 (ieeexplore.ieee.org)
This work proposes an accelerated first-order algorithm we call the Robust Momentum Method for optimizing smooth strongly convex functions. The algorithm has a single scalar parameter that can be tuned to trade off robustness to gradient noise versus worst-case convergence rate. At one extreme, the algorithm is faster than Nesterov's Fast Gradient Method by a constant factor but more fragile to noise. At the other extreme, the algorithm reduces to the Gradient Method and is very robust to noise. The algorithm design technique is inspired by methods from classical control theory and the resulting algorithm has a simple analytical form. Algorithm performance is verified on a series of numerical simulations in both noise-free and relative gradient noise cases.
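The abstract does not reproduce the update equations, but the trade-off it describes can be illustrated with a short sketch. Below is a minimal Python example of a robust-momentum-style iteration for an L-smooth, m-strongly convex objective, assuming the three-sequence momentum form and parameter formulas reported in the published paper (recalled here, not quoted from this abstract); the names `robust_momentum` and `grad`, the iteration count, and the quadratic test problem are illustrative choices, not the authors' code.

```python
import numpy as np

def robust_momentum(grad, x0, L, m, rho, iters=500):
    """Sketch of a robust-momentum-style iteration.

    grad : callable returning the gradient of an L-smooth, m-strongly convex f
    rho  : scalar tuning parameter, assumed to lie in [1 - 1/sqrt(kappa), 1 - 1/kappa];
           smaller rho targets a faster worst-case rate, larger rho more noise robustness.
    The parameter formulas below are assumptions recalled from the published paper,
    not taken from this abstract -- verify against the paper before relying on them.
    """
    kappa = L / m
    alpha = kappa * (1 - rho) ** 2 * (1 + rho) / L                  # step size
    beta = kappa * rho ** 3 / (kappa - 1)                           # momentum weight
    gamma = rho ** 3 / ((kappa - 1) * (1 - rho) ** 2 * (1 + rho))   # offset for gradient point

    x_prev = x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        y = x + gamma * (x - x_prev)                 # point where the gradient is evaluated
        x, x_prev = x + beta * (x - x_prev) - alpha * grad(y), x
    return x

# Illustrative use on a strongly convex quadratic f(x) = 0.5 * x^T diag(d) x,
# whose gradient is d * x, with m = min(d) and L = max(d).
d = np.array([1.0, 10.0, 100.0])
L, m = d.max(), d.min()
rho_fast = 1 - 1 / np.sqrt(L / m)        # the "fast but fragile" end of the range
x_opt = robust_momentum(lambda x: d * x, np.ones(3), L, m, rho_fast)
```

In this reading of the abstract, choosing rho at the lower end of its range gives the accelerated, noise-fragile extreme, while pushing rho toward 1 - 1/kappa trades worst-case rate for robustness to gradient noise.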