Journal of Applied Mathematics

A Relax Inexact Accelerated Proximal Gradient Method for the Constrained Minimization Problem of Maximum Eigenvalue Functions

Wei Wang, Shanghua Li, and Jingjing Gao



For the constrained minimization problem of maximum eigenvalue functions, the objective function is nonsmooth, so the approximate inexact accelerated proximal gradient (AIAPG) method (Wang et al., 2013) can be applied to a smooth approximation of the problem min{λ_max(X) + g(X) : X ∈ S^n}, where λ_max(X) is the maximum eigenvalue function and g(X) is a proper lower semicontinuous convex (possibly nonsmooth) function. Here we take g(X) = δ_Ω(X), the indicator function of the set Ω = {X ∈ S^n : F(X) = b, X ⪰ 0}. The AIAPG method is valid, however, only when the approximate minimizers it generates lie in Ω. In this paper we consider the case in which the approximate minimizer cannot be guaranteed to lie in Ω, and we propose two strategies: constructing a feasible solution, and designing a new method, the relax inexact accelerated proximal gradient (RIAPG) method. The latter strategy has the advantage of avoiding the overly strict conditions required by the former. Moreover, the RIAPG method inherits the global iteration complexity and attractive computational cost of the AIAPG method.
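The smooth approximation referred to in the abstract can be realized, for example, by the log-sum-exp smoothing of the maximum eigenvalue function studied in the cited works of Chen et al. (2004) and Nesterov (2005): f_μ(X) = μ log Σ_i exp(λ_i(X)/μ) satisfies λ_max(X) ≤ f_μ(X) ≤ λ_max(X) + μ log n. The sketch below is an illustrative implementation of this standard smoothing, not code from the paper itself; the function name and the choice of μ are ours.

```python
import numpy as np

def lambda_max_smooth(X, mu=1e-2):
    """Log-sum-exp smoothing of the maximum eigenvalue of a symmetric matrix X.

    Satisfies lambda_max(X) <= f_mu(X) <= lambda_max(X) + mu * log(n).
    Smaller mu gives a tighter but less smooth approximation.
    """
    lam = np.linalg.eigvalsh(X)          # eigenvalues of the symmetric matrix
    m = lam.max()                        # shift for numerical stability
    return m + mu * np.log(np.sum(np.exp((lam - m) / mu)))

# Example: for X = diag(1, 2, 3), lambda_max = 3 and the smoothed value
# lies between 3 and 3 + mu * log(3).
X = np.diag([1.0, 2.0, 3.0])
val = lambda_max_smooth(X, mu=1e-2)
```

Because f_μ is smooth with Lipschitz-continuous gradient (of constant O(1/μ)), it can be minimized over Ω with accelerated proximal gradient schemes of the kind discussed in the paper.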

Article information

J. Appl. Math., Volume 2014 (2014), Article ID 749475, 7 pages.

First available in Project Euclid: 2 March 2015



Wang, Wei; Li, Shanghua; Gao, Jingjing. A Relax Inexact Accelerated Proximal Gradient Method for the Constrained Minimization Problem of Maximum Eigenvalue Functions. J. Appl. Math. 2014 (2014), Article ID 749475, 7 pages. doi:10.1155/2014/749475.



  • E. S. Mistakidis and G. E. Stavroulakis, Nonconvex Optimization in Mechanics: Smooth and Nonsmooth Algorithms, Heuristics and Engineering Applications, Kluwer Academic Publishers, Dordrecht, The Netherlands, 1998.
  • P. L. Combettes and V. R. Wajs, “Signal recovery by proximal forward-backward splitting,” Multiscale Modeling & Simulation, vol. 4, no. 4, pp. 1168–1200, 2005.
  • C. A. Micchelli, L. Shen, and Y. Xu, “Proximity algorithms for image models: denoising,” Inverse Problems, vol. 27, no. 4, Article ID 045009, 30 pages, 2011.
  • Y. Nesterov, “Gradient methods for minimizing composite functions,” Mathematical Programming, vol. 140, no. 1, pp. 125–161, 2013.
  • W. Wang, J. J. Gao, and S. H. Li, “Approximate inexact accelerated proximal gradient method for the minimization problem of a class of maximum eigenvalue functions,” Journal of Liaoning Normal University, vol. 36, no. 3, pp. 314–317, 2013.
  • X. Chen, H. Qi, L. Qi, and K.-L. Teo, “Smooth convex approximation to the maximum eigenvalue function,” Journal of Global Optimization, vol. 30, no. 2-3, pp. 253–270, 2004.
  • Y. Nesterov, “Smooth minimization of non-smooth functions,” Mathematical Programming, vol. 103, no. 1, pp. 127–152, 2005.
  • K. Jiang, D. Sun, and K. Toh, “An inexact accelerated proximal gradient method for large scale linearly constrained convex SDP,” SIAM Journal on Optimization, vol. 22, no. 3, pp. 1042–1064, 2012.