Solve system of nonlinear equations

Nonlinear system solver

Solves a problem specified by

*F*(*x*) = 0

for *x*, where *F*(*x*)
is a function that returns a vector value.

*x* is a vector or a matrix; see Matrix Arguments.

`x = fsolve(fun,x0,options)` solves
the equations with the optimization options specified in `options`.
Use `optimoptions` to set these
options.
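As an illustrative sketch of this calling form (the two-equation system and the starting point below are example choices, not part of this reference):

```matlab
% Solve the square system
%   2*x1 - x2 - exp(-x1) = 0
%  -x1 + 2*x2 - exp(-x2) = 0
% starting from x0 = [-5; -5], with iterative display enabled.
fun = @(x) [ 2*x(1) - x(2) - exp(-x(1));
            -x(1) + 2*x(2) - exp(-x(2))];
x0 = [-5; -5];
options = optimoptions('fsolve', 'Display', 'iter');
[x, fval] = fsolve(fun, x0, options);
% By symmetry the root satisfies x = exp(-x), so both components
% converge to about 0.5671, and fval is near [0; 0].
```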

`x = fsolve(problem)` solves `problem`,
where `problem` is a structure described in Input Arguments. Create the `problem` structure
by exporting a problem from the Optimization app, as described in Exporting Your Work.
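A `problem` structure can also be built directly in code; the following minimal sketch assumes the usual required fields (`objective`, `x0`, `solver`, and `options`) and reuses an example system:

```matlab
% Build the problem structure by hand instead of exporting it.
problem.objective = @(x) [ 2*x(1) - x(2) - exp(-x(1));
                          -x(1) + 2*x(2) - exp(-x(2))];
problem.x0      = [-5; -5];        % starting point (example choice)
problem.solver  = 'fsolve';        % identifies the intended solver
problem.options = optimoptions('fsolve');  % default options
x = fsolve(problem);
```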

The function to be solved must be continuous.

When successful, `fsolve` gives only one root.

The default trust-region dogleg method can be used only when the system of equations is square, that is, when the number of equations equals the number of unknowns. For the Levenberg-Marquardt method, the system of equations need not be square.

For large problems, meaning those with thousands of variables or more, save memory (and possibly save time) by setting the `Algorithm` option to `'trust-region'` and the `SubproblemAlgorithm` option to `'cg'`.
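A sketch of this large-problem configuration; the objective function `bigFun` named here is hypothetical, standing in for any function that returns an n-by-1 vector for large n:

```matlab
% Options for a large problem: trust-region algorithm with a
% preconditioned-conjugate-gradient subproblem solver.
options = optimoptions('fsolve', ...
    'Algorithm',           'trust-region', ...
    'SubproblemAlgorithm', 'cg');
% x = fsolve(@bigFun, x0, options);  % bigFun and x0 are placeholders
```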

The Levenberg-Marquardt and trust-region methods are based on
the nonlinear least-squares algorithms also used in `lsqnonlin`. Use one of these methods if
the system may not have a zero; the algorithm then returns a point
where the residual is small. However, if the Jacobian of the system
is singular, the algorithm might converge to a point that is not a
solution of the system of equations (see Limitations).
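To illustrate the small-residual behavior, consider this sketch of an overdetermined system with no exact zero (the system is a contrived example):

```matlab
% Two equations in one unknown: x - 1 = 0 and x - 2 = 0 cannot
% both hold, so no root exists.
fun = @(x) [x - 1; x - 2];
options = optimoptions('fsolve', 'Algorithm', 'levenberg-marquardt');
x = fsolve(fun, 0, options);
% The returned x should be near 1.5, the minimizer of the residual
% sum of squares (x-1)^2 + (x-2)^2; fsolve reports that no root
% was found, only a point of (locally) minimal residual.
```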

By default, `fsolve` chooses the trust-region dogleg algorithm. The algorithm is a variant of the Powell dogleg method described in [8]. It is similar in nature to the algorithm implemented in [7]. See Trust-Region-Dogleg Algorithm.

The trust-region algorithm is a subspace trust-region method based on the interior-reflective Newton method described in [1] and [2]. Each iteration involves the approximate solution of a large linear system using the method of preconditioned conjugate gradients (PCG). See Trust-Region Algorithm.

The Levenberg-Marquardt method is described in references [4], [5], and [6]. See Levenberg-Marquardt Method.

[1] Coleman, T.F. and Y. Li, “An Interior,
Trust Region Approach for Nonlinear Minimization Subject to Bounds,” *SIAM
Journal on Optimization*, Vol. 6, pp. 418-445, 1996.

[2] Coleman, T.F. and Y. Li, “On the
Convergence of Reflective Newton Methods for Large-Scale Nonlinear
Minimization Subject to Bounds,” *Mathematical Programming*,
Vol. 67, Number 2, pp. 189-224, 1994.

[3] Dennis, J. E. Jr., “Nonlinear Least-Squares,” *State
of the Art in Numerical Analysis*, ed. D. Jacobs, Academic
Press, pp. 269-312.

[4] Levenberg, K., “A Method for the
Solution of Certain Problems in Least-Squares,” *Quarterly
Applied Mathematics 2*, pp. 164-168, 1944.

[5] Marquardt, D., “An Algorithm for
Least-squares Estimation of Nonlinear Parameters,” *SIAM
Journal Applied Mathematics*, Vol. 11, pp. 431-441, 1963.

[6] Moré, J. J., “The Levenberg-Marquardt
Algorithm: Implementation and Theory,” *Numerical
Analysis*, ed. G. A. Watson, Lecture Notes in Mathematics
630, Springer Verlag, pp. 105-116, 1977.

[7] Moré, J. J., B. S. Garbow, and K.
E. Hillstrom, *User Guide for MINPACK 1*, Argonne
National Laboratory, Rept. ANL-80-74, 1980.

[8] Powell, M. J. D., “A Fortran Subroutine
for Solving Systems of Nonlinear Algebraic Equations,” *Numerical
Methods for Nonlinear Algebraic Equations*, P. Rabinowitz,
ed., Ch. 7, 1970.