## Journal of Applied Mathematics

### A Relax Inexact Accelerated Proximal Gradient Method for the Constrained Minimization Problem of Maximum Eigenvalue Functions

#### Abstract

For the constrained minimization problem of maximum eigenvalue functions, the objective function is nonsmooth, so the approximate inexact accelerated proximal gradient (AIAPG) method (Wang et al., 2013) can be used to solve its smooth approximation minimization problem. We consider the problem $\min\{\lambda_{\max}(X)+g(X):X\in S^{n}\}$, where $\lambda_{\max}(X)$ is the maximum eigenvalue function and $g(X)$ is a proper lower semicontinuous convex function (possibly nonsmooth), taking $g(X)=\delta_{\Omega}(X)$ with $\Omega:=\{X\in S^{n}:\mathcal{F}(X)=b,\,X\succeq 0\}$, where $\delta_{\Omega}(X)$ denotes the indicator function of $\Omega$. The AIAPG method requires the approximate minimizer it generates to lie in $\Omega$; otherwise the method is invalid. In this paper, we consider the case in which the approximate minimizer cannot be guaranteed to lie in $\Omega$, and we propose two strategies: constructing a feasible solution, and designing a new method, the relaxed inexact accelerated proximal gradient (RIAPG) method. Compared with the former, the latter strategy has the advantage of overcoming the drawback that the required conditions are too strict. Furthermore, the RIAPG method inherits the global iteration complexity and attractive computational advantages of the AIAPG method.
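The abstract builds on the smooth approximation of $\lambda_{\max}$ referenced in the citations (Chen et al., 2004; Nesterov, 2005): $f_{\mu}(X)=\mu\log\sum_{i}\exp(\lambda_{i}(X)/\mu)$, whose gradient is $1/\mu$-Lipschitz. A minimal NumPy sketch of an accelerated proximal gradient iteration on $f_{\mu}+\delta_{\Omega}$ is given below, under a simplifying assumption: $\Omega$ is taken as the PSD cone alone, dropping the linear constraint $\mathcal{F}(X)=b$, so the proximal step reduces to a spectral projection. This is an illustration of the general scheme, not the paper's RIAPG method; the function names and parameters are ours.

```python
import numpy as np

def smoothed_lambda_max(X, mu):
    """Nesterov-type smoothing f_mu(X) = mu*log(sum_i exp(lambda_i/mu))
    and its gradient V diag(softmax(lambda/mu)) V^T."""
    lam, V = np.linalg.eigh(X)
    m = lam.max()                      # shift for numerical stability
    w = np.exp((lam - m) / mu)
    f = m + mu * np.log(w.sum())
    grad = (V * (w / w.sum())) @ V.T   # V diag(p) V^T with p = softmax
    return f, grad

def proj_psd(X):
    """Projection onto the PSD cone (a simplified stand-in for Omega:
    the paper's Omega also carries the linear constraint F(X) = b)."""
    lam, V = np.linalg.eigh((X + X.T) / 2)
    return (V * np.clip(lam, 0.0, None)) @ V.T

def apg(X0, mu=0.1, iters=200):
    """FISTA-style accelerated proximal gradient on f_mu + indicator of
    the PSD cone; step size 1/L with L = 1/mu (Lipschitz constant of
    grad f_mu)."""
    L = 1.0 / mu
    X = Y = X0.copy()
    t = 1.0
    for _ in range(iters):
        _, g = smoothed_lambda_max(Y, mu)
        X_new = proj_psd(Y - g / L)            # proximal (projection) step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        Y = X_new + ((t - 1.0) / t_new) * (X_new - X)  # momentum extrapolation
        X, t = X_new, t_new
    return X
```

With this simplified feasible set the minimizer of $\lambda_{\max}$ over the PSD cone is $X=0$, so starting from, say, `np.diag([2.0, 1.0, 0.5])`, the iterates shrink the top eigenvalue toward zero while every iterate stays feasible.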

#### Article information

Source
J. Appl. Math., Volume 2014 (2014), Article ID 749475, 7 pages.

Dates
First available in Project Euclid: 2 March 2015

https://projecteuclid.org/euclid.jam/1425305890

Digital Object Identifier
doi:10.1155/2014/749475

Mathematical Reviews number (MathSciNet)
MR3232929

#### Citation

Wang, Wei; Li, Shanghua; Gao, Jingjing. A Relax Inexact Accelerated Proximal Gradient Method for the Constrained Minimization Problem of Maximum Eigenvalue Functions. J. Appl. Math. 2014 (2014), Article ID 749475, 7 pages. doi:10.1155/2014/749475. https://projecteuclid.org/euclid.jam/1425305890

#### References

• E. S. Mistakidis and G. E. Stavroulakis, Nonconvex Optimization in Mechanics. Smooth and Nonsmooth Algorithms, Heuristics and Engineering Applications, Kluwer Academic Publishers, Dordrecht, The Netherlands, 1998.
• P. L. Combettes and V. R. Wajs, “Signal recovery by proximal forward-backward splitting,” Multiscale Modeling & Simulation, vol. 4, no. 4, pp. 1168–1200, 2005.
• C. A. Micchelli, L. Shen, and Y. Xu, “Proximity algorithms for image models: denoising,” Inverse Problems, vol. 27, no. 4, Article ID 045009, 30 pages, 2011.
• Y. Nesterov, “Gradient methods for minimizing composite functions,” Mathematical Programming, vol. 140, no. 1, pp. 125–161, 2013.
• W. Wang, J. J. Gao, and S. H. Li, “Approximate inexact accelerated proximal gradient method for the minimization problem of a class of maximum eigenvalue functions,” Journal of Liaoning Normal University, vol. 36, no. 3, pp. 314–317, 2013.
• X. Chen, H. Qi, L. Qi, and K.-L. Teo, “Smooth convex approximation to the maximum eigenvalue function,” Journal of Global Optimization, vol. 30, no. 2-3, pp. 253–270, 2004.
• Y. Nesterov, “Smooth minimization of non-smooth functions,” Mathematical Programming, vol. 103, no. 1, pp. 127–152, 2005.
• K. Jiang, D. Sun, and K. Toh, “An inexact accelerated proximal gradient method for large scale linearly constrained convex SDP,” SIAM Journal on Optimization, vol. 22, no. 3, pp. 1042–1064, 2012.