Abstract
In order to minimize a closed convex function that is approximated by a sequence of better-behaved functions, we investigate the global convergence of a general hybrid iterative algorithm, which consists of an inexact relaxed proximal point step followed by a suitable orthogonal projection onto a hyperplane. The projection step makes it possible to use a fixed relative error criterion for the proximal step. We provide various sets of conditions ensuring the global convergence of this algorithm. The analysis is valid for nonsmooth data in infinite-dimensional Hilbert spaces. Some examples are presented, focusing on penalty/barrier methods in convex programming. We also show that some of the results can be adapted to the problem of finding a zero of a maximal monotone operator.
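For orientation, the following is a minimal sketch of one generic hybrid proximal-projection iteration of the kind described above. The approximating functions $f_k$, the step sizes $\lambda_k$, the tolerance $\sigma$, and the exact form of the relative error criterion are assumptions for illustration only; the relaxation and the precise criterion analyzed in the paper may differ.

```latex
% A generic hybrid (inexact) proximal-projection step -- a sketch under the
% assumptions stated above, not the paper's exact scheme.
% f_k is the k-th approximation of the objective f, and lambda_k > 0.
\begin{align*}
  &\textbf{Inexact proximal step: } \text{find } y^k \text{ and }
     v^k \in \partial f_k(y^k) \text{ such that} \\
  &\qquad \|\lambda_k v^k + y^k - x^k\| \le \sigma\,\|y^k - x^k\|,
     \qquad \sigma \in [0,1) \ \text{(fixed relative error)}, \\[4pt]
  &\textbf{Projection step: } \text{project } x^k \text{ onto the hyperplane }
     H_k = \{x : \langle v^k,\, x - y^k\rangle = 0\}, \\
  &\qquad x^{k+1} = x^k
     - \frac{\langle v^k,\, x^k - y^k\rangle}{\|v^k\|^2}\, v^k
     \qquad (\text{if } v^k \neq 0;\ \text{otherwise } x^{k+1} = y^k).
\end{align*}
```

In this kind of template, the projection onto the separating hyperplane $H_k$ is what allows the proximal subproblem to be solved only up to the fixed relative tolerance $\sigma$, rather than with errors that must be summable along the iterations, which is the point made in the abstract about the fixed relative error criterion.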
| Field | Value |
|---|---|
| Original language | English |
| Pages (from-to) | 966-984 |
| Number of pages | 19 |
| Journal | Mathematics of Operations Research |
| Volume | 30 |
| Issue number | 4 |
| DOIs | |
| State | Published - Nov 2005 |
| Externally published | Yes |
Keywords
- Diagonal iteration
- Global convergence
- Hybrid method
- Parametric approximation
- Proximal point