Zhan Wang, Pengyuan Li, Xiangrong Li, Hongtruong Pham, "A Modified Three-Term Type CD Conjugate Gradient Algorithm for Unconstrained Optimization Problems", Mathematical Problems in Engineering, vol. 2020, Article ID 4381515, 14 pages, 2020. https://doi.org/10.1155/2020/4381515
A Modified Three-Term Type CD Conjugate Gradient Algorithm for Unconstrained Optimization Problems
Abstract
Conjugate gradient methods are well-known methods that are widely applied in many practical fields; the CD conjugate gradient method is one of the classical variants. In this paper, a modified three-term type CD conjugate gradient algorithm is proposed. It has the following good features: (i) a modified three-term type CD conjugate gradient formula is presented; (ii) the given algorithm possesses the sufficient descent property and the trust region property; (iii) the algorithm is globally convergent for general functions under the modified weak Wolfe–Powell (MWWP) line search technique and a projection technique. The new algorithm performs well in numerical experiments, which show that the modified three-term type CD conjugate gradient method is more competitive than the classical CD conjugate gradient method.
1. Introduction
Consider the problem

$\min \{ f(x) : x \in \mathbb{R}^n \},$ (1)

where $f : \mathbb{R}^n \to \mathbb{R}$ is a continuously differentiable function. This kind of model is often used to solve problems in applied mathematics, economics, engineering, and so on. Generally, the following iteration formula is used to generate the next iteration point:

$x_{k+1} = x_k + \alpha_k d_k,$ (2)

where $x_{k+1}$ and $x_k$ denote the next and current iteration points, respectively, $\alpha_k > 0$ is a steplength, and $d_k$ is the search direction. The search direction generated by the conjugate gradient (CG) method is defined by

$d_{k+1} = -g_{k+1} + \beta_k d_k, \quad d_0 = -g_0,$ (3)

where $g_k = \nabla f(x_k)$ is the gradient of $f$ at $x_k$ and $\beta_k$ is a parameter. Different choices of $\beta_k$ generate different CG methods [1–10]. There are six classical forms of $\beta_k$:

$\beta_k^{FR} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2}, \quad \beta_k^{PRP} = \frac{g_{k+1}^T y_k}{\|g_k\|^2}, \quad \beta_k^{HS} = \frac{g_{k+1}^T y_k}{d_k^T y_k}, \quad \beta_k^{DY} = \frac{\|g_{k+1}\|^2}{d_k^T y_k}, \quad \beta_k^{CD} = -\frac{\|g_{k+1}\|^2}{d_k^T g_k}, \quad \beta_k^{LS} = -\frac{g_{k+1}^T y_k}{d_k^T g_k},$ (4)

where $y_k = g_{k+1} - g_k$ and $\|\cdot\|$ is the Euclidean norm. These formulas can be divided into two categories: one includes the PRP, HS, and LS methods, which have good numerical performance; the other includes the FR, CD, and DY methods, which have good theoretical convergence. Many scholars have applied these methods to nonlinear monotone equations and general optimization problems, and some good results have been achieved [11–16].
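As a concrete reference, the six classical choices of $\beta_k$ in (4) can be sketched in a few lines of Python. This is an illustrative translation of the standard formulas, not code from the paper:

```python
import numpy as np

def cg_direction(g_new, g_old, d_old, method="CD"):
    """Next CG search direction d_{k+1} = -g_{k+1} + beta_k * d_k,
    using one of the six classical beta formulas (y_k = g_{k+1} - g_k)."""
    y = g_new - g_old
    if method == "FR":
        beta = (g_new @ g_new) / (g_old @ g_old)
    elif method == "PRP":
        beta = (g_new @ y) / (g_old @ g_old)
    elif method == "HS":
        beta = (g_new @ y) / (d_old @ y)
    elif method == "DY":
        beta = (g_new @ g_new) / (d_old @ y)
    elif method == "CD":
        beta = -(g_new @ g_new) / (d_old @ g_old)
    elif method == "LS":
        beta = -(g_new @ y) / (d_old @ g_old)
    else:
        raise ValueError(f"unknown method: {method}")
    return -g_new + beta * d_old
```

For example, with $g_k = (1, 0)^T$, $d_k = -g_k$, and $g_{k+1} = (0, 2)^T$, the CD coefficient is $\beta_k^{CD} = -4/(-1) = 4$, giving $d_{k+1} = (-4, -2)^T$.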
Zhang et al. [17] presented a modified PRP CG formula as follows:

$d_{k+1} = -g_{k+1} + \beta_k^{PRP} d_k - \theta_k y_k, \quad \theta_k = \frac{g_{k+1}^T d_k}{\|g_k\|^2},$ (5)

which guarantees $g_{k+1}^T d_{k+1} = -\|g_{k+1}\|^2$ independently of the line search.
They proved that the modified PRP method is globally convergent with an Armijo-type line search.
Yuan et al. [18] proposed another modified PRP CG formula (see [18] for its exact form and parameters) and obtained the global convergence of the modified PRP method with a modified weak Wolfe–Powell (MWWP) line search technique, which was proposed by Yuan et al. [19]:

$f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g_k^T d_k + \alpha_k \min\{-\delta_1 g_k^T d_k,\ \delta \alpha_k \|d_k\|^2/2\},$ (7)

$g(x_k + \alpha_k d_k)^T d_k \ge \sigma g_k^T d_k + \min\{-\delta_1 g_k^T d_k,\ \delta \alpha_k \|d_k\|^2/2\},$ (8)

where $\delta \in (0, 1/2)$, $\delta_1 \in (0, \delta)$, and $\sigma \in (\delta, 1)$. Yuan et al. [16] obtained the global convergence of the PRP method by using the above MWWP line search technique together with a projection technique:

$x_{k+1} = x_k - \frac{g(z_k)^T (x_k - z_k)}{\|g(z_k)\|^2}\, g(z_k), \quad z_k = x_k + \alpha_k d_k,$ (9)

where $x_{k+1}$ is the next iteration point. With this projection technique, any unsatisfactory iteration point generated by the normal PRP algorithm is projected onto a surface to overcome the failure to converge.
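A small checker makes conditions (7) and (8) concrete. The parameter values below ($\delta = 0.1$, $\delta_1 = 0.05$, $\sigma = 0.9$) are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def mwwp_satisfied(f, grad, x, d, alpha, delta=0.1, delta1=0.05, sigma=0.9):
    """Check the modified weak Wolfe-Powell (MWWP) conditions (7)-(8)
    at trial steplength alpha, for objective f and gradient function grad.
    Parameter ranges assumed: delta in (0, 1/2), delta1 in (0, delta),
    sigma in (delta, 1)."""
    g0 = grad(x)
    gTd = g0 @ d
    # the extra term shared by both MWWP inequalities
    extra = min(-delta1 * gTd, 0.5 * delta * alpha * (d @ d))
    x_new = x + alpha * d
    armijo = f(x_new) <= f(x) + delta * alpha * gTd + alpha * extra
    curvature = grad(x_new) @ d >= sigma * gTd + extra
    return bool(armijo and curvature)
```

On the quadratic $f(x) = \|x\|^2/2$ with steepest descent direction, the unit step is accepted while a grossly overshooting step is rejected, as expected.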
Motivated by the above research, a modified three-term CD conjugate gradient algorithm is presented in this paper using (7), (8), and (9). It has the following good properties: (i) a modified three-term type CD conjugate gradient formula is presented; (ii) the given algorithm possesses the sufficient descent property and the trust region property; (iii) the algorithm is globally convergent for general functions with the MWWP line search technique and the projection technique.
This paper is organized as follows: the next section introduces the modified CD formula and the corresponding algorithm; Section 3 gives the proof of the global convergence of the new algorithm; numerical experiments are given in Section 4; some conclusions are presented in Section 5. Throughout this paper, $\|\cdot\|$ denotes the Euclidean norm, and $f(x_k)$ and $g(x_k)$ are abbreviated as $f_k$ and $g_k$, respectively.
2. Motivation and Algorithm
The convergence of the CD conjugate gradient method has been proved [4]; however, its numerical results are worse than those of the PRP method and others. Therefore, it is necessary to propose a new search direction that improves the numerical performance of the CD method. Meanwhile, the sufficient descent property

$g_k^T d_k \le -c \|g_k\|^2$ for some constant $c > 0$ (10)

is significant for obtaining the convergence of a conjugate gradient method.
We therefore hope to propose a new method that possesses this property. Inspired by the above discussion, a modified three-term type CD conjugate gradient formula is designed in (11), which adds a third correction term to the classical CD direction (3). The steps of the given algorithm are listed in Algorithm 1.
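Formula (11) itself did not survive the conversion of this article, so the block below does not reproduce it. Instead, it sketches a generic three-term CD-type direction (a common construction in the literature, assumed here purely for illustration) whose third term cancels the $\beta_k^{CD}\, g_{k+1}^T d_k$ contribution, so that the sufficient descent property (13) holds exactly with $c_1 = 1$:

```python
import numpy as np

def three_term_cd_direction(g_new, g_old, d_old):
    """Illustrative three-term CD-type direction (NOT the paper's formula (11)):
        d_{k+1} = -g_{k+1} + beta*d_k - beta*(g_{k+1}^T d_k / ||g_{k+1}||^2)*g_{k+1},
    with beta = beta_k^{CD} = -||g_{k+1}||^2 / (d_k^T g_k).
    The third term cancels beta * g_{k+1}^T d_k in the inner product, so
    g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 holds for any beta."""
    beta = -(g_new @ g_new) / (d_old @ g_old)
    theta = beta * (g_new @ d_old) / (g_new @ g_new)
    return -g_new + beta * d_old - theta * g_new
```

Checking $g_{k+1}^T d_{k+1} = -\|g_{k+1}\|^2$ numerically confirms sufficient descent for such a direction; the trust region bound (14) additionally requires bounded parameters, which the paper establishes for its own formula in Lemma 1.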
3. Convergence Analysis
In this section, we are going to analyse the convergence of the proposed algorithm. The following assumptions are needed.
Assumption 1. (i) The level set $\Omega = \{x \in \mathbb{R}^n : f(x) \le f(x_0)\}$ is bounded. (ii) $f$ is twice continuously differentiable and bounded below, and the gradient function is Lipschitz continuous; that is, there exists a constant $L > 0$ such that

$\|g(x) - g(y)\| \le L \|x - y\|$ for all $x, y \in \mathbb{R}^n$. (12)
Lemma 1. Let the search direction $d_k$ be generated by formula (11). Then, the following relations hold:

$g_k^T d_k \le -c_1 \|g_k\|^2,$ (13)

$\|d_k\| \le c_2 \|g_k\|,$ (14)

where $c_1, c_2 > 0$ are constants.
Proof. According to the definition of $d_{k+1}$ in (11), expand $g_{k+1}^T d_{k+1}$; bounding the parameter terms of (11) shows that $g_{k+1}^T d_{k+1} \le -c_1 \|g_{k+1}\|^2$ for a suitable constant $c_1 > 0$, so relation (13) holds.
Using the definition of the parameters in (11), we analyse the value of $\|d_{k+1}\|$ in two cases:
Case 1. Similar to Lemma 1 of [20], the relevant quotient is bounded by a scalar; then $\|d_{k+1}\| \le c \|g_{k+1}\|$ follows for some constant $c > 0$.
Case 2. A direct estimate of the remaining terms gives a bound of the same form, $\|d_{k+1}\| \le c' \|g_{k+1}\|$ for some constant $c' > 0$. Letting $c_2 = \max\{c, c'\}$, relation (14) holds. The proof is complete.
Remark 1. The relation (14) shows that the optimization algorithm possesses the trust region feature.
The following theorem is obtained for the global convergence of Algorithm 1.

Theorem 1. Assume that $x_k$ and $d_k$ are generated by Algorithm 1 and that Lemma 1 holds. Then,

$\lim_{k \to \infty} \|g_k\| = 0.$
Proof. From the line search (7) and the sufficient descent property (13), each accepted step decreases $f$ by at least a positive multiple of $\alpha_k \|g_k\|^2$. From the curvature condition (8) and the Lipschitz continuity in (ii) of Assumption 1, the steplength $\alpha_k$ is bounded below in terms of $\|g_k\|^2 / \|d_k\|^2$; using Lemma 1, this lower bound is a positive constant. Summing the descent inequalities from $k = 0$ to $\infty$ and noting from Assumption 1 that $f$ is bounded below, the series $\sum_k \alpha_k \|g_k\|^2$ converges. Then $\|g_k\| \to 0$. The proof is complete.
4. Numerical Results
This section reports numerical experiments on some classical optimization problems, the nonlinear Muskingum model, and an application to image restoration problems. All tests are coded in MATLAB R2014a and run on a PC with a 2.50 GHz CPU and 4.00 GB of memory under the Windows 10 operating system.
4.1. Normal Unconstrained Optimization Problems
In this subsection, numerical experiments are performed on test problems from [20]; all test problems are listed in Table 1. We compare Algorithm 1 with the classical CD conjugate gradient method (called Algorithm 2) and the classical PRP conjugate gradient method (called Algorithm 3). The detailed experimental data are listed in Table 2. Figures 1–3 show the performance of these three algorithms with respect to CPU, NI, and NFG. The columns of Tables 1 and 2 and the labels of Figures 1–3 have the following meanings:(i)No: the serial number of the problem.(ii)Dim: the dimension of the variable $x$.(iii)NI: the number of iterations.(iv)NFG: the total number of function and gradient evaluations.(v)CPU: the calculation time in seconds.(vi)Dimension: the dimensions are 3000, 9000, and 15000.(vii)Initialization: the line search parameters $\delta$, $\delta_1$, and $\sigma$ are fixed for all runs, and the initial search direction is $d_0 = -g_0$.(viii)Stop rules: the following Himmelblau stop rule [21] is used: if $|f(x_k)| > e_1$, let $stop1 = |f(x_k) - f(x_{k+1})| / |f(x_k)|$; otherwise, let $stop1 = |f(x_k) - f(x_{k+1})|$. For every problem, if $\|g(x_k)\| < \varepsilon$ or $stop1 < e_2$ is satisfied, then the program is stopped. The program is also stopped when the number of iterations exceeds one thousand.
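The Himmelblau stop rule in item (viii) can be sketched as follows. The tolerance values `e1`, `e2`, and `eps` are illustrative placeholders, since the paper's exact settings did not survive conversion:

```python
def himmelblau_stop(f_k, f_next, g_norm, eps=1e-6, e1=1e-5, e2=1e-5):
    """Himmelblau-type stopping rule [21]: use the relative objective change
    when |f(x_k)| is not tiny, the absolute change otherwise; also stop
    when the gradient norm is small.  Tolerances are assumed values."""
    if abs(f_k) > e1:
        stop1 = abs(f_k - f_next) / abs(f_k)
    else:
        stop1 = abs(f_k - f_next)
    return g_norm < eps or stop1 < e2
```

In the experiments, the run is additionally aborted once the iteration count exceeds one thousand.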


From the detailed experimental data in Table 2, it is clear that most of the problems can be solved quickly, and for most of them the proposed algorithm needs less CPU time. Progress has also been made in NI and NFG. Overall, the proposed method is promising compared with the other algorithms. To present the numerical results more directly, the performance profiles of Dolan and Moré [22] are used. In Figure 1, the curve of Algorithm 1 is always above those of the other algorithms. In Figure 2, the curves show the same trend: Algorithm 1 solves the largest fraction of the test problems within any given performance ratio, ahead of both Algorithm 2 and Algorithm 3. Figure 3 shows a trend similar to Figure 2. All these figures show that the modified CD conjugate gradient algorithm is more robust and effective than the normal CD and PRP methods. In summary, Algorithm 1 is more competitive than the others.
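The performance profiles in Figures 1–3 follow Dolan and Moré [22]: for each solver, one plots the fraction of problems it solves within a factor $\tau$ of the best solver on each problem. A minimal sketch (illustrative, not the paper's plotting script):

```python
import numpy as np

def performance_profile(costs):
    """Dolan-More performance profile [22].
    costs: (problems x solvers) array of a cost measure (CPU, NI or NFG),
    with np.inf marking a failed run.  Returns (taus, rho), where rho[i, s]
    is the fraction of problems solver s solves within a factor taus[i]
    of the best solver on that problem."""
    ratios = costs / costs.min(axis=1, keepdims=True)   # performance ratios
    taus = np.unique(ratios[np.isfinite(ratios)])       # jump points of the profile
    rho = np.array([[np.mean(ratios[:, s] <= t) for s in range(costs.shape[1])]
                    for t in taus])
    return taus, rho
```

A solver whose curve lies above the others for all $\tau$, as Algorithm 1's does in Figure 1, dominates on this cost measure.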
4.2. The Muskingum Model
It is generally known that parameter estimation is a significant task in engineering applications. The nonlinear Muskingum model will be discussed as a common example of such an application in this subsection.
The nonlinear Muskingum model [23] is defined by the storage equation

$S_i = x_1 \left[ x_2 I_i + (1 - x_2) O_i \right]^{x_3},$

whose parameters are estimated by minimizing the squared deviations between this storage and the storage implied by the water balance between inflow and outflow.
Some of the variables have the following meanings:(i)$n$: the total number of time points.(ii)$x_1$: the storage time constant.(iii)$I_i$: the observed inflow discharge.(iv)$x_2$: the weighting factor.(v)$O_i$: the observed outflow discharge.(vi)$x_3$: an additional parameter.(vii)$\Delta t$: the time step at time $t_i$.
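The variable list above is consistent with a least-squares fit of the storage equation against the discrete water balance. The sketch below is a reconstruction under that assumption (the trapezoidal balance, the default time step, and the name `muskingum_objective` are all illustrative, not the paper's code):

```python
import numpy as np

def muskingum_objective(x, inflow, outflow, dt=12.0):
    """Assumed least-squares objective for the nonlinear Muskingum model.
    x = (x1, x2, x3): storage time constant, weighting factor, extra parameter.
    Storage: S_i = x1 * (x2*I_i + (1-x2)*O_i)**x3.  The residuals enforce
    the water balance S_{i+1} - S_i = dt * (mean inflow - mean outflow)."""
    x1, x2, x3 = x
    s = x1 * (x2 * inflow + (1.0 - x2) * outflow) ** x3
    balance = dt * 0.5 * ((inflow[:-1] + inflow[1:]) - (outflow[:-1] + outflow[1:]))
    r = (s[1:] - s[:-1]) - balance
    return float(r @ r)
```

Minimizing such an objective over $(x_1, x_2, x_3)$ is the parameter estimation task that the three algorithms address in Table 3.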
In the experiment, the observed data of the flood runoff process from Chenggouwan and Linqing of Nanyunhe River in the Haihe Basin, Tianjin, China, are used. The same initial point and time step are chosen for all algorithms; detailed data on the inflow and outflow for the years 1960, 1961, and 1964 can be found in [24]. The results of these three algorithms are listed in Table 3. The performance of the presented algorithm is shown in Figures 4–6.
Some conclusions are obtained from this experiment: (1) from Figures 4–6, we conclude that approximations of the flood outflows can be calculated with Algorithm 1, so Algorithm 1 is effective for the nonlinear Muskingum model; (2) the final points ($x_1$, $x_2$, and $x_3$) of these three algorithms are interesting and competitive with those of similar algorithms; and (3) the final points of Algorithm 1 differ from those of the BFGS method and the HIWO method, which implies that the nonlinear Muskingum model has different optimal approximate solutions.
4.3. Image Restoration Problems
In this subsection, the above algorithms are applied to image restoration problems, in which the objects are original images corrupted by impulse noise. Such problems are regarded as among the most difficult in the optimization field. The parameter settings are similar to those of the above subsections, and the program is stopped when either of two tolerance conditions on the objective value holds. The following three images are selected as processing objects: Baboon (512×512), Barbara (512×512), and Lena (512×512). The detailed performance is shown in Figures 7–9. The CPU time taken to process the images is listed in Table 4.
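The corruption model here is salt-and-pepper impulse noise: a given fraction of pixels is overwritten with the extreme values 0 or 255. A minimal sketch of generating such corrupted test images (illustrative only; the restoration itself is the optimization task solved by the algorithms):

```python
import numpy as np

def add_salt_pepper(img, ratio, seed=None):
    """Corrupt a grayscale image (2-D uint8 array) with salt-and-pepper
    impulse noise: roughly `ratio` of the pixels become 0 or 255."""
    rng = np.random.default_rng(seed)
    noisy = img.copy()
    mask = rng.random(img.shape) < ratio           # pixels to corrupt
    noisy[mask] = rng.choice(np.array([0, 255], dtype=np.uint8),
                             size=int(mask.sum()))  # salt or pepper
    return noisy
```

A restoration algorithm then estimates the clean pixel values, typically by minimizing a smooth objective over the detected noise candidates.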

It is easy to see that all the algorithms succeed on the image restoration problems. The results in Table 4 reveal that the CPU time of Algorithm 1 is less than that of the other algorithms, for the 30% noise problems as well as for the higher noise levels.
5. Conclusion
In this paper, a modified three-term type CD conjugate gradient algorithm is presented. It has the following good features: (i) the sufficient descent property holds; (ii) the trust region feature also holds; (iii) the algorithm is globally convergent for general functions with the MWWP line search technique and the projection technique; and (iv) numerical results reveal that the new algorithm is more competitive than the normal CD and PRP algorithms.
In recent years, there has been considerable research on other types of CG methods, while the study of the CD method remains insufficient and should not be ignored. Much future work remains: whether this method is suitable for other line search techniques (such as the Armijo line search or nonmonotone line searches), and whether there exist better modifications that further improve the numerical results of the CD method. All of these questions are worth studying in future work.
Data Availability
The data used to support the findings of this study are included within the article.
Conflicts of Interest
The authors declare that there are no conflicts of interest regarding the publication of this paper.
Acknowledgments
This work was supported by the National Natural Science Foundation of China (Grant no. 11661009), the High Level Innovation Teams and Excellent Scholars Program in Guangxi Institutions of Higher Education (Grant no. [2019]52), and the Guangxi Natural Science Key Fund (no. 2017GXNSFDA198046). The authors would like to thank the editor and the referees for their valuable comments, which greatly improved this paper.
References
[1] M. Al-Baali, "Descent property and global convergence of the Fletcher-Reeves method with inexact line search," IMA Journal of Numerical Analysis, vol. 5, no. 1, pp. 121–124, 1985.
[2] Y. Dai and Y. Yuan, "A nonlinear conjugate gradient method with a strong global convergence property," SIAM Journal on Optimization, vol. 10, no. 1, pp. 177–182, 2000.
[3] J. W. Daniel, "The conjugate gradient method for linear and nonlinear operator equations," SIAM Journal on Numerical Analysis, vol. 4, no. 1, pp. 10–26, 1967.
[4] R. Fletcher, Practical Methods of Optimization, Wiley, New York, NY, USA, 2nd edition, 1987.
[5] R. Fletcher and C. M. Reeves, "Function minimization by conjugate gradients," The Computer Journal, vol. 7, no. 2, pp. 149–154, 1964.
[6] M. R. Hestenes and E. Stiefel, "Methods of conjugate gradients for solving linear systems," Journal of Research of the National Bureau of Standards, vol. 49, no. 6, pp. 409–436, 1952.
[7] Y. Liu and C. Storey, "Efficient generalized conjugate gradient algorithms, part 1: theory," Journal of Optimization Theory and Applications, vol. 69, no. 1, pp. 129–137, 1991.
[8] E. Polak and G. Ribière, "Note sur la convergence de méthodes de directions conjuguées," Revue Française d'Informatique et de Recherche Opérationnelle, Série Rouge, vol. 3, no. 16, pp. 35–43, 1969.
[9] B. T. Polyak, "The conjugate gradient method in extremal problems," USSR Computational Mathematics and Mathematical Physics, vol. 9, no. 4, pp. 94–112, 1969.
[10] G. Yuan, X. Wang, and Z. Sheng, "Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions," Numerical Algorithms, vol. 84, no. 3, pp. 935–956, 2020.
[11] S.-Y. Liu, Y.-Y. Huang, and H.-W. Jiao, "Sufficient descent conjugate gradient methods for solving convex constrained nonlinear monotone equations," Abstract and Applied Analysis, vol. 2014, pp. 1–12, 2014.
[12] X. Y. Wang, S. J. Li, and X. P. Kou, "A self-adaptive three-term conjugate gradient method for monotone nonlinear equations with convex constraints," Calcolo, vol. 53, no. 2, pp. 133–145, 2016.
[13] G. Yuan and W. Hu, "A conjugate gradient algorithm for large-scale unconstrained optimization problems and nonlinear equations," Journal of Inequalities and Applications, vol. 2018, article 113, 2018.
[14] G. Yuan, T. Li, and W. Hu, "A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems," Applied Numerical Mathematics, vol. 147, pp. 129–141, 2020.
[15] G. Yuan, J. Lu, and Z. Wang, "The PRP conjugate gradient algorithm with a modified WWP line search and its application in the image restoration problems," Applied Numerical Mathematics, vol. 152, pp. 1–11, 2020.
[16] G. Yuan, Z. Wei, and Y. Yang, "The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions," Journal of Computational and Applied Mathematics, vol. 362, pp. 262–275, 2019.
[17] L. Zhang, W. Zhou, and D.-H. Li, "A descent modified Polak-Ribière-Polyak conjugate gradient method and its global convergence," IMA Journal of Numerical Analysis, vol. 26, no. 4, pp. 629–640, 2006.
[18] G. Yuan, W. Hu, and Z. Sheng, "A conjugate gradient algorithm with Yuan-Wei-Lu line search," in International Conference on Cloud Computing and Security, Springer, Cham, Switzerland, 2017.
[19] G. Yuan, Z. Wei, and X. Lu, "Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search," Applied Mathematical Modelling, vol. 47, pp. 811–825, 2017.
[20] G. Yuan, Z. Sheng, B. Wang, W. Hu, and C. Li, "The global convergence of a modified BFGS method for nonconvex functions," Journal of Computational and Applied Mathematics, vol. 327, pp. 274–294, 2018.
[21] Y. Yuan and W. Sun, Theory and Methods of Optimization, Science Press of China, Beijing, China, 1999.
[22] E. D. Dolan and J. J. Moré, "Benchmarking optimization software with performance profiles," Mathematical Programming, vol. 91, no. 2, pp. 201–213, 2002.
[23] A. Ouyang, L.-B. Liu, Z. Sheng, and F. Wu, "A class of parameter estimation methods for nonlinear Muskingum model using hybrid invasive weed optimization algorithm," Mathematical Problems in Engineering, vol. 2015, pp. 1–15, 2015.
[24] A. Ouyang, Z. Tang, K. Li, A. Sallam, and E. Sha, "Estimating parameters of Muskingum model using an adaptive hybrid PSO algorithm," International Journal of Pattern Recognition and Artificial Intelligence, vol. 28, pp. 1–29, 2014.
[25] Z. W. Geem, "Parameter estimation for the nonlinear Muskingum model using the BFGS technique," Journal of Irrigation and Drainage Engineering, vol. 132, no. 5, pp. 474–478, 2006.
Copyright
Copyright © 2020 Zhan Wang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.