A Review of multilayer extreme learning machine neural networks
Abstract: The Extreme Learning Machine is a single-hidden-layer feedforward learning
algorithm, which has been successfully applied in regression and classification problems in …
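The core idea behind the (single-hidden-layer) Extreme Learning Machine is standard: input weights and biases are drawn at random and never trained, and only the output weights are fit, in closed form, by least squares. A minimal sketch of that idea (function names and sizes are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, y, n_hidden=50):
    # Input weights and biases are random and fixed (never trained)
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)           # hidden-layer activations
    beta = np.linalg.pinv(H) @ y     # output weights via Moore-Penrose pseudoinverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```

The multilayer variants the review surveys stack such randomized layers; the closed-form output solve is what makes training fast compared to backpropagation.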
Acceleration methods
This monograph covers some recent advances in a range of acceleration techniques
frequently used in convex optimization. We first use quadratic optimization problems to …
Understanding the acceleration phenomenon via high-resolution differential equations
Gradient-based optimization algorithms can be studied from the perspective of limiting
ordinary differential equations (ODEs). Motivated by the fact that existing ODEs do not …
An operator splitting approach for distributed generalized Nash equilibria computation
In this paper, we propose a distributed algorithm for computation of a generalized Nash
equilibrium (GNE) in noncooperative games over networks. We consider games in which the …
Fast optimization via inertial dynamics with closed-loop damping
In a real Hilbert space H, in order to develop fast optimization methods, we analyze the
asymptotic behavior, as time t tends to infinity, of a large class of autonomous dissipative …
Inertial projection and contraction algorithms for variational inequalities
In this article, we introduce an inertial projection and contraction algorithm by combining
inertial-type algorithms with the projection and contraction algorithm for solving a variational …
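The projection-type methods this entry builds on descend from Korpelevich's extragradient scheme: a projected prediction step using F at the current point, then a correction step using F at the predicted point. A minimal sketch of the classical extragradient baseline (not the authors' inertial variant; names and parameters are illustrative):

```python
import numpy as np

def extragradient(F, proj, x0, step, n_iter=100):
    # Korpelevich extragradient for the VI: find x* in C with
    # <F(x*), x - x*> >= 0 for all x in C.
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        y = proj(x - step * F(x))   # prediction: projected step using F(x)
        x = proj(x - step * F(y))   # correction: step uses F at the predicted point y
    return x
```

Inertial variants add a momentum term x_k + a_k (x_k - x_{k-1}) before the projection steps to speed up convergence.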
Golden ratio algorithms for variational inequalities
Y Malitsky - Mathematical Programming, 2020 - Springer
The paper presents a fully adaptive algorithm for monotone variational inequalities. In each
iteration the method uses two previous iterates for an approximation of the local Lipschitz …
First-order optimization algorithms via inertial systems with Hessian driven damping
In a Hilbert space setting, for convex optimization, we analyze the convergence rate of a
class of first-order algorithms involving inertial features. They can be interpreted as discrete …
Proximal gradient method for nonsmooth optimization over the Stiefel manifold
We consider optimization problems over the Stiefel manifold whose objective function is the
sum of a smooth function and a nonsmooth function. Existing methods for solving this …
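The Euclidean proximal gradient method underlying this line of work alternates a gradient step on the smooth term with the proximal operator of the nonsmooth term. A minimal sketch for the common f(x) = ½‖Ax − b‖² + λ‖x‖₁ case (the Euclidean building block, not the paper's manifold algorithm; names are illustrative):

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise shrinkage)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, n_iter=500):
    # Minimize 0.5*||Ax - b||^2 + lam*||x||_1: gradient step on the
    # smooth term, then prox of the nonsmooth term.
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = Lipschitz const. of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - step * grad, step * lam)
    return x
```

The manifold version replaces the Euclidean prox with one restricted to (a retraction onto) the Stiefel manifold, which is the technical difficulty the paper addresses.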
A dynamical systems perspective on Nesterov acceleration
M Muehlebach, M Jordan - International Conference on …, 2019 - proceedings.mlr.press
We present a dynamical system framework for understanding Nesterov's accelerated
gradient method. In contrast to earlier work, our derivation does not rely on a vanishing step …
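For reference, the discrete method these dynamical-systems analyses target is Nesterov's accelerated gradient: a gradient step taken at an extrapolated point, followed by a momentum update. A minimal sketch with the classical t_k schedule (standard textbook form, not this paper's derivation):

```python
import numpy as np

def nesterov_agd(grad, x0, step, n_iter=2000):
    # Nesterov's accelerated gradient with the classical t_k momentum schedule
    x = np.asarray(x0, dtype=float)
    y, t = x.copy(), 1.0
    for _ in range(n_iter):
        x_next = y - step * grad(y)                     # gradient step at extrapolated point
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + (t - 1.0) / t_next * (x_next - x)  # momentum extrapolation
        x, t = x_next, t_next
    return x
```

The limiting-ODE viewpoint studies this iteration as a discretization of a second-order differential equation with vanishing damping, which is how the O(1/k²) rate is recovered in continuous time.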