Article ID: iaor20002365
Country: United States
Volume: 10
Issue: 4
Start Page Number: 587
End Page Number: 637
Publication Date: Apr 1999
Journal: Optimization Methods & Software
Authors: Grippo L., Sciandrone M.
In this paper we define new classes of globally convergent block-coordinate techniques for the unconstrained minimization of a continuously differentiable function. More specifically, we first describe conceptual models of decomposition algorithms based on the interconnection of elementary operations performed on the block components of the variable vector. We then characterize these elementary operations, which are defined through either a suitable line search or a global minimization in a component subspace. Using these models, we establish new results on the convergence of the nonlinear Gauss–Seidel method, and we prove that with a two-block decomposition this method is globally convergent towards stationary points, even in the absence of convexity or uniqueness assumptions. For the general case of a nonconvex objective function and an arbitrary decomposition, we define new globally convergent line-search-based schemes that may also include partial global minimizations with respect to some of the components. Computational aspects are discussed and, in particular, an application to a learning problem in a Radial Basis Function neural network is illustrated.
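To make the block-coordinate idea concrete, the following is a minimal Python sketch of a two-block nonlinear Gauss–Seidel iteration, in which each block of variables is minimized in turn while the other block is held fixed. The test objective, the block split, and the use of scipy.optimize.minimize for the inner subproblems are illustrative assumptions, not the authors' algorithms, which rely on specific line-search and global-minimization conditions to guarantee convergence.

```python
# Minimal sketch of a two-block nonlinear Gauss-Seidel iteration.
# The objective f, the block split, and the inner solver are illustrative
# assumptions; the paper's schemes impose particular line-search conditions.
import numpy as np
from scipy.optimize import minimize

def f(x, y):
    # Smooth test objective with coupling between the two blocks.
    return (x[0] - 1.0) ** 2 + 10.0 * (y[0] - x[0] ** 2) ** 2 + y[1] ** 2

def gauss_seidel(x, y, max_iter=50, tol=1e-8):
    for _ in range(max_iter):
        x_old, y_old = x.copy(), y.copy()
        # Minimize over the first block with the second block held fixed.
        x = minimize(lambda v: f(v, y), x).x
        # Minimize over the second block with the updated first block fixed.
        y = minimize(lambda v: f(x, v), y).x
        # Stop when neither block moves appreciably.
        if np.linalg.norm(x - x_old) + np.linalg.norm(y - y_old) < tol:
            break
    return x, y

x_star, y_star = gauss_seidel(np.zeros(1), np.zeros(2))
print(x_star, y_star, f(x_star, y_star))
```

Each inner step here performs a (local) minimization over one component subspace; the paper's two-block convergence result is notable precisely because such alternating schemes can fail to reach stationary points with three or more blocks unless convexity or uniqueness assumptions hold.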