Preconditioned nonlinear conjugate gradient methods based on a modified secant equation
This paper presents a twofold result for the Nonlinear Conjugate Gradient (NCG) method in large-scale unconstrained optimization. First, we carry out a theoretical analysis in which preconditioning is embedded in a strong convergence framework of an NCG method from the literature. Mild conditions that the preconditioners must satisfy in order to preserve NCG convergence are defined. As a second task, we also detail the use of novel matrix-free preconditioners for NCG. Our proposals are based on quasi-Newton updates, and either …
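The abstract refers to preconditioned NCG with matrix-free, quasi-Newton-inspired preconditioners. As an illustration only, the following is a minimal sketch of a generic preconditioned NCG loop in Python; the `apply_prec` callback, the preconditioned Polak-Ribière-style beta, and the Armijo backtracking line search are placeholder assumptions and are not the paper's specific scheme, whose quasi-Newton preconditioners and modified secant condition are not reproduced in this snippet.

```python
import numpy as np

def preconditioned_ncg(f, grad, x0, apply_prec, max_iter=200, tol=1e-6):
    """Illustrative preconditioned NCG loop (not the paper's algorithm).

    apply_prec(g, k) should return the matrix-free action of the k-th
    preconditioner on the gradient g (an approximation of M_k^{-1} g).
    The beta below is a standard preconditioned Polak-Ribiere choice,
    used here only as a placeholder.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    z = apply_prec(g, 0)          # preconditioned gradient
    d = -z                        # initial search direction
    for k in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = backtracking_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        z_new = apply_prec(g_new, k + 1)
        # Preconditioned Polak-Ribiere coefficient, clipped at zero.
        beta = max(0.0, g_new @ (z_new - z) / (g @ z))
        d = -z_new + beta * d
        x, g, z = x_new, g_new, z_new
    return x

def backtracking_line_search(f, grad, x, d, alpha=1.0, rho=0.5, c=1e-4):
    """Simple Armijo backtracking; a practical NCG code would enforce
    (strong) Wolfe conditions instead."""
    fx, gx = f(x), grad(x)
    while f(x + alpha * d) > fx + c * alpha * (gx @ d):
        alpha *= rho
    return alpha
```

With `apply_prec = lambda g, k: g` (identity preconditioner) the sketch reduces to an unpreconditioned NCG iteration, which is the baseline the paper's preconditioners are meant to improve on.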