A New Family of Hybrid Conjugate Gradient Methods for Unconstrained Optimization

dc.contributor.author: Adeleke, Olawale
dc.date.accessioned: 2022-03-21T08:43:34Z
dc.date.available: 2022-03-21T08:43:34Z
dc.date.issued: 2021-06
dc.description.abstract: The conjugate gradient method is a very efficient iterative technique for solving large-scale unconstrained optimization problems. Motivated by recent modifications of some variants of the method and by the construction of hybrid methods, this study proposes four hybrid methods that are both globally convergent and computationally efficient. The approach adopted for constructing the hybrid methods entails projecting ten recently modified conjugate gradient methods. Each of the hybrid methods is shown to satisfy the descent property independently of any line search technique, and to be globally convergent under the strong Wolfe line search. Results obtained from numerical implementation of these methods and from performance profiling show that the methods are highly competitive with well-known traditional methods.
dc.identifier.doi: 10.19139/soic.v7i3.480
dc.identifier.uri: http://dspace.run.edu.ng:8080/jspui/handle/123456789/2081
dc.language.iso: en
dc.subject: Hybrid methods
dc.subject: Nonlinear conjugate gradient method
dc.subject: Unconstrained optimization problems
dc.subject: Descent property
dc.subject: Global convergence
dc.subject: Strong Wolfe line search conditions
dc.title: A New Family of Hybrid Conjugate Gradient Methods for Unconstrained Optimization
dc.type: Article
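The abstract describes nonlinear conjugate gradient iterations whose update parameter is hybridized from existing formulas and whose step length satisfies the strong Wolfe conditions. The paper's four specific hybrid methods are not reproduced in this record, so the sketch below is only illustrative: it uses a classic hybridization pattern (the Polak-Ribière-Polyak parameter clipped into the Fletcher-Reeves interval, in the spirit of Touati-Ahmed and Storey) with SciPy's Wolfe line search. All function names and the test problem are assumptions, not the author's method.

```python
import numpy as np
from scipy.optimize import line_search  # step length satisfying Wolfe conditions

def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=200):
    """Illustrative hybrid nonlinear CG: beta = max(0, min(beta_PRP, beta_FR)).

    This is a generic hybridization sketch, NOT the four methods proposed
    in the paper (whose formulas are not given in this record).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:
            alpha, d = 1e-4, -g  # line search failed: small steepest-descent step
        x_new = x + alpha * d
        g_new = grad(x_new)
        gg = g @ g
        beta_fr = (g_new @ g_new) / gg            # Fletcher-Reeves
        beta_prp = (g_new @ (g_new - g)) / gg     # Polak-Ribière-Polyak
        beta = max(0.0, min(beta_prp, beta_fr))   # hybrid: clip PRP into [0, FR]
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Convex quadratic test problem: minimize 0.5 x'Ax - b'x, minimizer solves Ax = b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = hybrid_cg(f, grad, np.zeros(2))
```

The clipping `max(0, min(beta_prp, beta_fr))` keeps the PRP parameter's restart behavior while inheriting the FR method's convergence safeguards, which is the general motivation behind hybrid CG constructions like those the abstract mentions.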
Files

Original bundle: A new family of hybrid cgm.pdf (301.79 KB, Adobe Portable Document Format)
License bundle: license.txt (1.71 KB, Item-specific license agreed upon to submission)