Several optimization solvers accept nonlinear constraints, including `fmincon`, `fseminf`, `fgoalattain`, `fminimax`, and the Global Optimization Toolbox solvers `ga`, `gamultiobj`, `patternsearch`, `paretosearch`, `GlobalSearch`, and `MultiStart`. Nonlinear constraints allow you to restrict the solution to any region that can be described in terms of smooth functions.

Nonlinear inequality constraints have the form *c*(*x*) ≤ 0, where *c* is
a vector of constraints, one component for each constraint. Similarly,
nonlinear equality constraints are of the form *ceq*(*x*) = 0.
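
If a constraint arrives in ≥ form, rearrange it into ≤ 0 form before coding it. For instance (an illustrative rearrangement, not part of the original problem statement):

$$g(x)\ge b\text{\hspace{1em}becomes\hspace{1em}}c(x)=b-g(x)\le 0.$$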

Nonlinear constraint functions must return both `c` and `ceq`, the inequality and equality constraint values, even if they do not both exist. Return an empty entry `[]` for a nonexistent constraint.
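
Conversely, a problem with only a nonlinear equality, say *x*_{1}^{2} + *x*_{2}^{2} = 1, returns `[]` for the inequalities (a minimal sketch; the function name `unitcircle` is illustrative):

```
function [c,ceq] = unitcircle(x)
c = [];                       % no nonlinear inequalities
ceq = x(1)^2 + x(2)^2 - 1;    % x1^2 + x2^2 = 1
end
```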

For example, suppose that you have the following inequalities as constraints:

$$\begin{array}{c}\frac{{x}_{1}^{2}}{9}+\frac{{x}_{2}^{2}}{4}\le 1,\\ {x}_{2}\ge {x}_{1}^{2}-1.\end{array}$$

Write these constraints in a function file as follows:

```
function [c,ceq] = ellipseparabola(x)
c(1) = (x(1)^2)/9 + (x(2)^2)/4 - 1;
c(2) = x(1)^2 - x(2) - 1;
ceq = [];
end
```

`ellipseparabola` returns an empty entry `[]` for `ceq`, the nonlinear equality function. Also, both inequalities were put into ≤ 0 form.

Minimize the function `exp(x(1) + 2*x(2))` subject to the `ellipseparabola` constraints.

```
fun = @(x)exp(x(1) + 2*x(2));
nonlcon = @ellipseparabola;
x0 = [0 0];
A = []; % No other constraints
b = [];
Aeq = [];
beq = [];
lb = [];
ub = [];
x = fmincon(fun,x0,A,b,Aeq,beq,lb,ub,nonlcon)
```

```
Local minimum found that satisfies the constraints.

Optimization completed because the objective function is non-decreasing in 
feasible directions, to within the value of the optimality tolerance, and 
constraints are satisfied to within the value of the constraint tolerance.

x =

   -0.2500   -0.9375
```
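
To confirm feasibility, you can evaluate the constraint function at the returned point (a quick check added here, not part of the original example):

```
[c,ceq] = ellipseparabola(x)
% c(1) is about -0.7733 (inactive) and c(2) is essentially zero (active),
% so both inequalities hold at the solution.
```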

If you provide gradients for *c* and *ceq*, your solver can run faster and give more reliable results.

Providing a gradient has another advantage. A solver can reach a point `x` such that `x` is feasible, but finite differences around `x` always lead to an infeasible point. In this case, a solver can fail or halt prematurely. Providing a gradient allows a solver to proceed.

To include gradient information, write a conditionalized function as follows:

```
function [c,ceq,gradc,gradceq] = ellipseparabola(x)
c(1) = x(1)^2/9 + x(2)^2/4 - 1;
c(2) = x(1)^2 - x(2) - 1;
ceq = [];

if nargout > 2
    % gradc(i,j) = derivative of c(j) with respect to x(i)
    gradc = [2*x(1)/9, 2*x(1); ...
             x(2)/2,   -1];
    gradceq = [];
end
```

See Writing Scalar Objective Functions for information on conditionalized functions. The gradient matrix has the form

$$\mathrm{gradc}_{i,j}=\frac{\partial c(j)}{\partial {x}_{i}}.$$

The first column of the gradient matrix is associated with `c(1)`, and the second column is associated with `c(2)`. This is the transpose of the form of Jacobians.
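
One way to verify both the derivatives and this layout is to compare the analytic gradient with central finite differences (a hedged sketch; the test point and step size are arbitrary choices, not toolbox requirements):

```
x = [0.5; 0.25];                    % arbitrary test point
h = 1e-6;                           % central-difference step
gradFD = zeros(2,2);                % gradFD(i,j) approximates dc(j)/dx(i)
for i = 1:2
    e = zeros(2,1); e(i) = h;
    cp = ellipseparabola(x + e);
    cm = ellipseparabola(x - e);
    gradFD(i,:) = (cp - cm)/(2*h);  % row i: derivatives w.r.t. x(i)
end
[~,~,gradc] = ellipseparabola(x);   % analytic gradient
max(abs(gradFD(:) - gradc(:)))      % should be close to zero
```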

To have a solver use gradients of nonlinear constraints, indicate that they exist by using `optimoptions`:

```
options = optimoptions(@fmincon,'SpecifyConstraintGradient',true);
```

Make sure to pass the options structure to your solver:

```
[x,fval] = fmincon(@myobj,x0,A,b,Aeq,beq,lb,ub, ...
    @ellipseparabola,options)
```
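
Putting the pieces together for the earlier example (a minimal sketch, assuming the gradient-enabled `ellipseparabola.m` from above is on your path):

```
fun = @(x)exp(x(1) + 2*x(2));       % objective from the earlier example
options = optimoptions(@fmincon,'SpecifyConstraintGradient',true);
x = fmincon(fun,[0 0],[],[],[],[],[],[],@ellipseparabola,options)
```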

If you have a Symbolic Math Toolbox™ license, you can calculate gradients and Hessians automatically, as described in Symbolic Math Toolbox Calculates Gradients and Hessians.

For information on anonymous objective functions, see Anonymous Function Objectives.

Nonlinear constraint functions must return two outputs. The first output corresponds to nonlinear inequalities, and the second corresponds to nonlinear equalities.

Anonymous functions return just one output. So how can you write an anonymous function as a nonlinear constraint?

The `deal` function distributes multiple outputs. For example, suppose that your nonlinear inequalities are

$$\begin{array}{c}\frac{{x}_{1}^{2}}{9}+\frac{{x}_{2}^{2}}{4}\le 1,\\ {x}_{2}\ge {x}_{1}^{2}-1.\end{array}$$

Suppose that your nonlinear equality is

*x*_{2} = tanh(*x*_{1}).

Write a nonlinear constraint function as follows:

```
c = @(x)[x(1)^2/9 + x(2)^2/4 - 1;
         x(1)^2 - x(2) - 1];
ceq = @(x)tanh(x(1)) - x(2);
nonlinfcn = @(x)deal(c(x),ceq(x));
```
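
You can verify that `nonlinfcn` behaves like a two-output constraint function by evaluating it at a test point (an illustrative check, not part of the original example):

```
[c0,ceq0] = nonlinfcn([0 0])
% c0   = [-1; -1], so both inequalities are strictly satisfied at the origin
% ceq0 = 0, because tanh(0) - 0 = 0
```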

To minimize the function cosh(*x*_{1}) + sinh(*x*_{2}) subject to the constraints in `nonlinfcn`, use `fmincon`:

```
obj = @(x)cosh(x(1))+sinh(x(2));
opts = optimoptions(@fmincon,'Algorithm','sqp');
z = fmincon(obj,[0;0],[],[],[],[],[],[],nonlinfcn,opts)
```

```
Local minimum found that satisfies the constraints.

Optimization completed because the objective function is non-decreasing in 
feasible directions, to within the default value of the function tolerance, and 
constraints are satisfied to within the default value of the constraint tolerance.

z =

   -0.6530
   -0.5737
```

To check how well the resulting point `z` satisfies the constraints, use `nonlinfcn`:

```
[cout,ceqout] = nonlinfcn(z)
```

```
cout =

   -0.8704
         0

ceqout =

     0
```

`z` indeed satisfies all the constraints to within the default value of the `ConstraintTolerance` constraint tolerance, `1e-6`.
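
If your problem needs a tighter or looser feasibility requirement, you can change this tolerance through `optimoptions` (a brief sketch; the value `1e-10` is only an example):

```
opts = optimoptions(@fmincon,'Algorithm','sqp','ConstraintTolerance',1e-10);
```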

`GlobalSearch` | `MultiStart` | `fgoalattain` | `fmincon` | `ga` | `patternsearch`