
Choong Boon Ng, Wah June Leong, Mansor Monsi (MyJurnal)

Abstract:
Nonlinear conjugate gradient (CG) methods have been widely used for solving unconstrained optimization problems. They are well suited to large-scale problems because of their low memory requirements and low computational cost. In this paper, a new diagonal preconditioned conjugate gradient (PRECG) algorithm is designed, motivated by the fact that a preconditioner can greatly enhance the performance of the CG method. Under mild conditions, the algorithm is shown to be globally convergent for strongly convex functions. Numerical results are presented to show that the new diagonal PRECG method performs better than the standard CG method.
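The paper's specific diagonal preconditioner update is not given in the abstract, so the following is only a minimal sketch of the general idea: a nonlinear CG iteration (Fletcher-Reeves beta, Armijo backtracking line search) in which search directions are scaled by a fixed positive diagonal `D` approximating the inverse Hessian diagonal. The function names and the test problem are illustrative, not from the paper.

```python
import numpy as np

def precond_cg(f, grad, x0, D, max_iter=500, tol=1e-8):
    """Diagonally preconditioned nonlinear CG (sketch, not the paper's method).

    D: vector of positive entries, an approximation to the inverse
    Hessian diagonal, applied as a fixed diagonal preconditioner.
    """
    x = x0.copy()
    g = grad(x)
    d = -D * g                      # preconditioned steepest-descent start
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search along d
        t, slope = 1.0, g @ d
        while f(x + t * d) > f(x) + 1e-4 * t * slope:
            t *= 0.5
        x = x + t * d
        g_new = grad(x)
        # Fletcher-Reeves beta, measured in the preconditioned inner product
        beta = (g_new @ (D * g_new)) / (g @ (D * g))
        d = -D * g_new + beta * d
        g = g_new
    return x

# Strongly convex quadratic test: f(x) = 0.5 x^T A x - b^T x
A = np.diag([1.0, 10.0, 100.0])
b = np.array([1.0, 1.0, 1.0])
f = lambda x: 0.5 * x @ (A @ x) - b @ x
grad = lambda x: A @ x - b
D = 1.0 / np.diag(A)               # exact inverse diagonal for this toy problem
x_star = precond_cg(f, grad, np.zeros(3), D)
print(np.allclose(x_star, np.linalg.solve(A, b), atol=1e-5))
```

For strongly convex functions like this quadratic, the iteration converges to the unique minimizer; with an ill-conditioned Hessian (here condition number 100), the diagonal scaling is precisely what lets the method make balanced progress in all coordinates.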