Solution of nonlinear least-squares problems

Abstract: This dissertation addresses the nonlinear least-squares problem of minimizing the sum of squares of the components of f(x), where x is an n-vector of variables and f(x) is an m-vector whose components are smooth nonlinear functions. The problem arises most often in data-fitting applications. Much research has focused on the development of specialized algorithms that attempt to exploit the structure of the nonlinear least-squares objective. We assume that n and m are relatively small, so that limited storage and sparsity in the derivatives of f need not be taken into account in formulating algorithms. We first discuss existing numerical algorithms for nonlinear least squares, nearly all of which involve the iterative minimization of quadratic functions. Methods for general unconstrained optimization, Gauss-Newton methods, Levenberg-Marquardt methods, and special quasi-Newton methods are among the algorithms surveyed. Our emphasis is on those methods that form the basis of widely distributed software, and numerical results are given for a large set of test problems. The main contribution of this research is to propose new algorithms that make use of more general quadratic programming subproblems. Options are investigated that are based on the convergence properties of sequential quadratic programming methods for constrained optimization, and on geometric considerations in nonlinear least squares. Numerical results are given, demonstrating that the new methods may be useful in practice.
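
For concreteness, the problem referred to in the abstract is the standard nonlinear least-squares formulation (this restatement is supplied here for the reader; the notation \phi for the objective is an assumption, not taken from the dissertation):

    \min_{x \in \mathbb{R}^n} \; \phi(x) \;=\; \tfrac{1}{2}\, f(x)^{\mathsf T} f(x) \;=\; \tfrac{1}{2} \sum_{i=1}^{m} f_i(x)^2, \qquad f : \mathbb{R}^n \to \mathbb{R}^m \ \text{smooth}.

The sketch below illustrates the general class of methods surveyed (a damped Gauss-Newton, i.e. Levenberg-Marquardt, iteration): at each step a quadratic model of the objective is minimized by solving a linear system built from the Jacobian of f. It is a minimal illustrative sketch assuming a simple multiplicative update of the damping parameter; it is not the dissertation's proposed algorithm, and the function and parameter names are hypothetical.

    import numpy as np

    def levenberg_marquardt(f, jac, x0, mu=1e-3, tol=1e-8, max_iter=100):
        """Minimal Levenberg-Marquardt sketch for minimizing 0.5*||f(x)||^2.

        f   : callable returning the m-vector of residuals at x
        jac : callable returning the m-by-n Jacobian of f at x
        x0  : starting n-vector
        mu  : initial damping parameter (assumed update rule: halve/double)
        """
        x = np.asarray(x0, dtype=float)
        n = x.size
        for _ in range(max_iter):
            r = f(x)
            J = jac(x)
            g = J.T @ r                      # gradient of 0.5*||f(x)||^2
            if np.linalg.norm(g) < tol:
                break
            # Damped Gauss-Newton step: solve (J^T J + mu I) p = -J^T r
            p = np.linalg.solve(J.T @ J + mu * np.eye(n), -g)
            if 0.5 * f(x + p) @ f(x + p) < 0.5 * r @ r:
                x = x + p                    # step reduces the objective: accept, relax damping
                mu *= 0.5
            else:
                mu *= 2.0                    # step rejected: increase damping
        return x

    # Usage example: fit the model a*exp(b*t) to synthetic data (illustrative only)
    t = np.linspace(0.0, 1.0, 20)
    y = 2.0 * np.exp(-1.3 * t)
    f = lambda x: x[0] * np.exp(x[1] * t) - y
    jac = lambda x: np.column_stack([np.exp(x[1] * t),
                                     x[0] * t * np.exp(x[1] * t)])
    print(levenberg_marquardt(f, jac, np.array([1.0, 0.0])))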