Nonlinear conjugate gradient (CG) methods have been widely used to solve unconstrained optimization problems. They are well suited to large-scale problems because of their low memory requirements and low computational cost. In this paper, a new diagonal preconditioned conjugate gradient (PRECG) algorithm is designed, motivated by the fact that a preconditioner can greatly enhance the performance of the CG method. Under mild conditions, the algorithm is shown to be globally convergent for strongly convex functions. Numerical results are presented to show that the new diagonal PRECG method performs better than the standard CG method.
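To illustrate the general idea of diagonal preconditioning in a nonlinear CG iteration, the following is a minimal sketch, not the paper's algorithm: it assumes a Fletcher-Reeves-type update, a simple Armijo backtracking line search, and a user-supplied positive diagonal `diag_precond` approximating the inverse Hessian diagonal; all names are illustrative.

```python
import numpy as np

def precond_cg(f, grad, x0, diag_precond, tol=1e-6, max_iter=1000):
    """Illustrative diagonal preconditioned nonlinear CG (Fletcher-Reeves form).

    diag_precond: 1-D positive array scaling the gradient before the search
    direction is built, playing the role of an inverse-Hessian diagonal.
    """
    x = x0.copy()
    g = grad(x)
    y = diag_precond * g              # preconditioned gradient M^{-1} g
    d = -y                            # initial search direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:             # safeguard: restart if not a descent direction
            d = -y
        # backtracking line search enforcing a sufficient-decrease condition
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y_new = diag_precond * g_new
        beta = g_new.dot(y_new) / g.dot(y)   # preconditioned FR coefficient
        d = -y_new + beta * d
        x, g, y = x_new, g_new, y_new
    return x
```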
The subspace quasi-Newton (SQN) method has been widely used in large-scale unconstrained optimization. Its popularity is due to the fact that the method constructs subproblems in low dimensions, so that both the storage requirement and the computational cost are kept to a minimum. However, the main drawback of the SQN method is that it can be very slow on certain types of nonlinear problems, such as ill-conditioned problems. Hence, we propose a preconditioned SQN method, which is generally more effective than the SQN method. To achieve this, we propose that a diagonal updating matrix, derived from the weak secant relation, be used instead of the identity matrix to approximate the initial inverse Hessian. Our numerical results show that the proposed preconditioned SQN method performs better than the SQN method without preconditioning.
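As a rough illustration of a diagonal update based on the weak secant relation, the sketch below computes the least-change update of a diagonal Hessian approximation D so that s^T D_new s = s^T y holds; this is a generic construction under stated assumptions, not necessarily the exact formula used in the paper, and the positivity safeguard is an added assumption.

```python
import numpy as np

def weak_secant_diagonal_update(D, s, y, eps=1e-12):
    """Least-change diagonal update enforcing the weak secant relation
    s^T D_new s = s^T y (illustrative sketch).

    D : 1-D array, current diagonal approximation of the Hessian
    s : step x_{k+1} - x_k
    y : gradient difference g_{k+1} - g_k
    """
    s2 = s * s
    denom = s2.dot(s2)                       # sum_i s_i^4
    if denom < eps:
        return D
    lam = (s.dot(y) - s2.dot(D)) / denom     # Lagrange multiplier of the least-change problem
    D_new = D + lam * s2
    return np.maximum(D_new, eps)            # safeguard: keep the diagonal positive

# The inverse of this diagonal, 1.0 / D_new, could then replace the identity
# matrix as the initial inverse-Hessian approximation in the SQN subproblem.
```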