Intelligent optimization: Literature review and state-of-the-art algorithms (1965–2022)
A Mohammadi, F Sheikholeslam - Engineering Applications of Artificial …, 2023 - Elsevier
Today, intelligent optimization has become a science that few researchers have not used in
dealing with problems in their field. Diversity and flexibility have made the use, efficiency …
Nature-inspired metaheuristic search algorithms for optimizing benchmark problems: inclined planes system optimization to state-of-the-art methods
In the literature, different types of inclined planes system optimization (IPO) algorithms have
been proposed and evaluated in various applications. Due to the large number of variants …
Quasi-Newton methods for machine learning: forget the past, just sample
We present two sampled quasi-Newton methods (sampled LBFGS and sampled LSR1) for
solving empirical risk minimization problems that arise in machine learning. Contrary to the …
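For readers unfamiliar with the curvature-pair machinery these sampled variants build on, the following is a minimal sketch of the classical L-BFGS two-loop recursion (Nocedal and Wright); the paper's contribution replaces the history-based pairs with sampled ones, so this is background only, not the proposed method, and all names here are illustrative.

import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Return the L-BFGS search direction -H_k * grad from stored curvature
    pairs s_i = x_{i+1} - x_i and y_i = g_{i+1} - g_i (two-loop recursion)."""
    if not s_list:
        return -grad
    q = grad.copy()
    alphas = []
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    # First loop: newest pair to oldest
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        q -= alpha * y
        alphas.append(alpha)
    # Initial Hessian scaling gamma_k = (s_k^T y_k) / (y_k^T y_k)
    s_last, y_last = s_list[-1], y_list[-1]
    gamma = np.dot(s_last, y_last) / np.dot(y_last, y_last)
    r = gamma * q
    # Second loop: oldest pair to newest
    for s, y, rho, alpha in zip(s_list, y_list, rhos, reversed(alphas)):
        beta = rho * np.dot(y, r)
        r += (alpha - beta) * s
    return -r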
Inclined planes system optimization: theory, literature review, and state-of-the-art versions for IIR system identification
The Inclined Planes System Optimization (IPO) algorithm is a recent algorithm that
uses Newton's second law to perform optimization. After conducting a thorough literature …
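As a rough illustration of what an update driven by Newton's second law can look like in a population-based optimizer, the snippet below integrates a = F/m for a set of candidate solutions; the force model, masses, and step constant are illustrative assumptions, not the actual IPO update rule.

import numpy as np

def physics_step(positions, velocities, forces, masses, dt=0.1):
    """One kinematic step per candidate: a = F / m, then integrate velocity
    and position (generic Newton's-second-law-style update)."""
    acc = forces / masses[:, None]          # Newton's second law, per agent
    velocities = velocities + dt * acc
    positions = positions + dt * velocities
    return positions, velocities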
Doubly adaptive scaled algorithm for machine learning using second-order information
We present a novel adaptive optimization algorithm for large-scale machine learning
problems. Equipped with a low-cost estimate of local curvature and Lipschitz smoothness …
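One common way to obtain a low-cost local smoothness estimate is to divide the change in gradients by the change in iterates and scale the step accordingly; the sketch below shows that generic idea under stated assumptions, and is not the paper's algorithm.

import numpy as np

def lipschitz_scaled_step(x, x_prev, g, g_prev, safety=0.5):
    """Estimate the local Lipschitz constant L_k ~ ||g_k - g_{k-1}|| / ||x_k - x_{k-1}||
    and take a gradient step with step size proportional to 1 / L_k."""
    L_est = np.linalg.norm(g - g_prev) / (np.linalg.norm(x - x_prev) + 1e-12)
    step = safety / (L_est + 1e-12)
    return x - step * g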
An overview of stochastic quasi-Newton methods for large-scale machine learning
TD Guo, Y Liu, CY Han - Journal of the Operations Research Society of …, 2023 - Springer
Numerous intriguing optimization problems arise as a result of the advancement of machine
learning. The stochastic first-order method is the predominant choice for those problems due …
Flecs: A federated learning second-order framework via compression and sketching
Inspired by the recent work FedNL (Safaryan et al., FedNL: Making Newton-Type Methods
Applicable to Federated Learning), we propose a new communication-efficient second-order …
LSOS: Line-search Second-Order Stochastic optimization methods for nonconvex finite sums
D Di Serafino, N Krejić, N Krklec Jerinkić… - Mathematics of …, 2023 - ams.org
We develop a line-search second-order algorithmic framework for minimizing finite sums.
We do not make any convexity assumptions, but require the terms of the sum to be …
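To make the line-search ingredient concrete, here is a plain Armijo backtracking routine; the LSOS framework pairs line searches with stochastic second-order directions and its own acceptance rules, so this is only the textbook building block, with all parameter names assumed for illustration.

import numpy as np

def armijo_backtracking(f, x, grad, direction, alpha0=1.0, c=1e-4, tau=0.5, max_iter=50):
    """Shrink the step size until the Armijo sufficient-decrease condition
    f(x + alpha*d) <= f(x) + c*alpha*grad^T d holds."""
    alpha = alpha0
    fx = f(x)
    slope = np.dot(grad, direction)   # should be negative for a descent direction
    for _ in range(max_iter):
        if f(x + alpha * direction) <= fx + c * alpha * slope:
            return alpha
        alpha *= tau
    return alpha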
Stochastic gradient methods with preconditioned updates
This work considers the non-convex finite-sum minimization problem. There are several
algorithms for such problems, but existing methods often work poorly when the problem is …
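A minimal sketch of what a preconditioned stochastic gradient update can look like, using an RMSProp-style diagonal preconditioner built from running squared gradients; the preconditioner studied in the paper may differ, so treat this as an illustration of the general pattern rather than the proposed method.

import numpy as np

def preconditioned_sgd_step(w, grad, v, lr=1e-2, beta=0.99, eps=1e-8):
    """Update the running second-moment estimate v, then scale the stochastic
    gradient by the inverse square root of the diagonal preconditioner."""
    v = beta * v + (1.0 - beta) * grad**2
    w = w - lr * grad / (np.sqrt(v) + eps)
    return w, v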
Generalization of Quasi-Newton methods: application to robust symmetric multisecant updates
D Scieur, L Liu, T Pumir… - … Conference on Artificial …, 2021 - proceedings.mlr.press
Quasi-Newton (qN) techniques approximate the Newton step by estimating the Hessian
using the so-called secant equations. Some of these methods compute the Hessian using …
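The secant equations mentioned here are standard and worth stating: with iterate and gradient differences s_k and y_k, the updated Hessian approximation is required to map s_k to y_k, and multisecant variants impose this over several recent pairs at once.

s_k = x_{k+1} - x_k, \qquad y_k = \nabla f(x_{k+1}) - \nabla f(x_k), \qquad B_{k+1} s_k = y_k \quad \text{(secant equation)}

B_{k+1} S_k = Y_k, \qquad S_k = [\, s_{k-m+1}, \dots, s_k \,], \quad Y_k = [\, y_{k-m+1}, \dots, y_k \,] \quad \text{(multisecant form)}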