# A space decomposition scheme for maximum eigenvalue functions and its applications

In this paper, we study nonlinear optimization problems involving eigenvalues of symmetric matrices. A key difficulty in solving these problems is that eigenvalue functions are not differentiable at points where the eigenvalue has multiplicity greater than one. We apply the $\mathcal{U}$-Lagrangian theory to analyze the largest eigenvalue function of a convex matrix-valued mapping, which extends the corresponding results for linear mappings in the literature. We also provide formulas for the first- and second-order derivatives of the $\mathcal{U}$-Lagrangian under mild assumptions. These theoretical results provide new second-order information about the largest eigenvalue function along a suitable smooth manifold, and lead to a new algorithmic framework for analyzing the underlying optimization problem.
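The nonsmoothness mentioned above can be seen in a minimal numerical sketch (an illustrative example, not from the paper): for $A(t) = \mathrm{diag}(1+t,\ 1-t)$, the largest eigenvalue is $\lambda_{\max}(A(t)) = 1 + |t|$, so at $t = 0$, where the eigenvalue $1$ has multiplicity two, the one-sided slopes disagree and the function is not differentiable.

```python
import numpy as np

def lam_max(t):
    # Largest eigenvalue of the symmetric matrix A(t) = diag(1 + t, 1 - t).
    A = np.diag([1.0 + t, 1.0 - t])
    return np.linalg.eigvalsh(A)[-1]  # eigvalsh returns eigenvalues in ascending order

# At t = 0 the eigenvalue 1 has multiplicity two and lam_max(t) = 1 + |t|;
# the one-sided difference quotients reveal a kink (slopes +1 and -1).
h = 1e-6
right_slope = (lam_max(h) - lam_max(0.0)) / h    # approaches +1
left_slope = (lam_max(0.0) - lam_max(-h)) / h    # approaches -1
print(right_slope, left_slope)
```

Away from such coalescence points the largest eigenvalue is smooth; the $\mathcal{U}$-Lagrangian approach recovers second-order information along the smooth manifold on which the multiplicity stays constant.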