- Global search and multistart solvers for finding single or multiple global optima
- Genetic algorithm for linear, nonlinear, bound, and integer constraints with customization by defining parent selection, crossover, and mutation functions
- Multiobjective genetic algorithm with Pareto-front identification, including linear, nonlinear, and bound constraints
- Pattern search solver for linear, nonlinear, and bound constraints with customization by defining polling, searching, and other functions
- Simulated annealing solver for bound constraints, with options for defining annealing process, temperature schedule, and acceptance criteria
- Particle swarm solver for bound constraints with options for defining the initial particles and swarm behavior

Global Optimization Toolbox provides functions that you can access from the command line and from the Optimization app in Optimization Toolbox™. Both the command line and app let you:

- Select a solver and define an optimization problem
- Set and inspect optimization options
- Run optimization problems and visualize intermediate and final results
- Use Optimization Toolbox solvers to refine genetic algorithm, simulated annealing, and pattern search results
- Import and export optimization problems and results to your MATLAB® workspace
- Capture and reuse work performed in the Optimization app using MATLAB code generation

You can also customize the solvers by providing your own algorithm options and custom functions. Multistart and global search solvers are accessible only from the command line.

The toolbox includes a number of plotting functions for visualizing an optimization. These visualizations give you live feedback about optimization progress, enabling you to modify solver options or stop the solver mid-run. The toolbox provides custom plotting functions for both the genetic algorithm and pattern search solvers. Available plots include the objective function value, constraint violation, score histogram, genealogy, mesh size, and number of function evaluations. You can show multiple plots together, open specific plots in a new window for closer examination, or add your own plotting functions.
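As a minimal sketch, built-in plot functions such as `gaplotbestf` and `gaplotstopping` can be attached to a genetic algorithm run through `optimoptions` (the objective here, `rastriginsfcn`, ships with the toolbox):

```matlab
% Monitor a genetic algorithm run with two built-in plot functions:
% best fitness per generation and progress toward stopping criteria.
options = optimoptions('ga', ...
    'PlotFcn', {@gaplotbestf, @gaplotstopping});
[x, fval] = ga(@rastriginsfcn, 2, [], [], [], [], [], [], [], options);
```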

Using the output function, you can write results to files, create your own stopping criteria, and write your own apps to run toolbox solvers. When working from the Optimization app, you can export the problem and algorithm options to the MATLAB workspace, save your work and reuse it at a later time, or generate MATLAB code that captures the work you’ve done.
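A hedged sketch of a custom genetic algorithm output function: the function name and the stopping threshold below are illustrative, but the `[state, options, optchanged] = fcn(options, state, flag)` signature and the `state.Generation`, `state.Score`, and `state.StopFlag` fields are the documented interface for `ga` output functions.

```matlab
% Illustrative output function: logs the best score each generation
% and imposes a custom stopping criterion.
function [state, options, optchanged] = logOutputFcn(options, state, flag)
    optchanged = false;
    if strcmp(flag, 'iter')
        fprintf('Generation %d: best score = %g\n', ...
            state.Generation, min(state.Score));
        if min(state.Score) < 1e-4   % custom stopping criterion
            state.StopFlag = 'y';
        end
    end
end
```

You would register it with `optimoptions('ga', 'OutputFcn', @logOutputFcn)` before calling the solver.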

While an optimization is running, you can change some options to refine the solution and update performance results in genetic algorithm, multiobjective genetic algorithm, simulated annealing, and pattern search solvers. For example, you can enable or disable plot functions, output functions, and command-line iterative display during run time to view intermediate results and query solution progress, without the need to stop and restart the solver. You can also modify stopping conditions to refine the solution progression or reduce the number of iterations required to achieve a desired tolerance based upon run-time performance feedback.


The global search and multistart solvers use gradient-based methods to return local and global minima. Both solvers start a local solver (in Optimization Toolbox) from multiple starting points and store local and global solutions found during the search process.

The global search solver:

- Uses a scatter-search algorithm to generate multiple starting points
- Filters nonpromising start points based upon objective and constraint function values and local minima already found
- Runs a constrained nonlinear optimization solver to search for a local minimum from the remaining start points

The multistart solver uses either uniformly distributed start points within predefined bounds or user-defined start points to find multiple local minima, including a single global minimum if one exists. The multistart solver runs the local solver from all starting points and can be run in serial or in parallel (using Parallel Computing Toolbox™). The multistart solver also provides flexibility in choosing different local nonlinear solvers. The available local solvers include unconstrained nonlinear, constrained nonlinear, nonlinear least-squares, and nonlinear least-squares curve fitting.
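A comparable sketch for the multistart solver, again with an illustrative objective; `run(ms, problem, k)` launches the local solver from `k` start points sampled within the bounds, and the fifth output collects every distinct local minimum found:

```matlab
% MultiStart: run fmincon from 50 uniformly sampled start points.
% Set 'UseParallel' to true to distribute runs across workers
% (requires Parallel Computing Toolbox).
problem = createOptimProblem('fmincon', ...
    'objective', @(x) x(1)^2 + x(2)^2 + 4*sin(5*x(1)) + 2*cos(3*x(2)), ...
    'x0', [1, 2], 'lb', [-5, -5], 'ub', [5, 5]);
ms = MultiStart('UseParallel', false);
[xBest, fBest, exitflag, output, solutions] = run(ms, problem, 50);
```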

The genetic algorithm solves optimization problems by mimicking the principles of biological evolution, repeatedly modifying a population of individual points using rules modeled on gene combinations in biological reproduction. Due to its random nature, the genetic algorithm improves your chances of finding a global solution. It enables you to solve unconstrained, bound-constrained, and general optimization problems, and it does not require the functions to be differentiable or continuous.

The following table shows the standard genetic algorithm options provided by Global Optimization Toolbox.

| Step | Genetic Algorithm Option |
| --- | --- |
| Creation | Uniform, feasible |
| Fitness scaling | Rank-based, proportional, top (truncation), shift linear |
| Selection | Roulette, stochastic uniform selection (SUS), tournament, uniform, remainder |
| Crossover | Arithmetic, heuristic, intermediate, scattered, single-point, two-point |
| Mutation | Adaptive feasible, Gaussian, uniform |
| Plotting | Best fitness, best individual, distance among individuals, diversity of population, expectation of individuals, max constraint, range, selection index, stopping conditions |

Global Optimization Toolbox also lets you specify:

- Population size
- Number of elite children
- Crossover fraction
- Migration among subpopulations (using ring topology)
- Bounds, linear, and nonlinear constraints for an optimization problem

You can customize these algorithm options by providing user-defined functions and represent the problem in a variety of data formats, for example by defining variables that are integer, mixed-integer, categorical, or complex.

You can base the stopping criteria for the algorithm on time, stalling, fitness limit, or number of generations. And you can vectorize your fitness function to improve execution speed or execute the objective and constraint functions in parallel (using Parallel Computing Toolbox).
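The options above map onto `optimoptions` name-value pairs; a minimal sketch with an illustrative vectorized fitness function (a vectorized function accepts a matrix whose rows are individuals and returns one score per row):

```matlab
% GA run with common options set explicitly.
nvars = 2;
opts = optimoptions('ga', ...
    'PopulationSize', 100, ...
    'EliteCount', 5, ...
    'CrossoverFraction', 0.8, ...
    'MaxGenerations', 200, ...
    'UseVectorized', true);
vecFitness = @(pop) sum(pop.^2, 2);  % illustrative vectorized sphere function
[x, fval] = ga(vecFitness, nvars, [], [], [], [], ...
    [-10, -10], [10, 10], [], opts);
```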

Multiobjective optimization is concerned with the minimization of multiple objective functions that are subject to a set of constraints. The multiobjective genetic algorithm solver is used to solve multiobjective optimization problems by identifying the Pareto front—the set of evenly distributed nondominated optimal solutions. You can use this solver to solve smooth or nonsmooth optimization problems with or without bound and linear constraints. The multiobjective genetic algorithm does not require the functions to be differentiable or continuous.

The following table shows the standard multiobjective genetic algorithm options provided by Global Optimization Toolbox.

| Step | Multiobjective Genetic Algorithm Option |
| --- | --- |
| Creation | Uniform, feasible |
| Fitness scaling | Rank-based, proportional, top (truncation), linear scaling, shift |
| Selection | Tournament |
| Crossover | Arithmetic, heuristic, intermediate, scattered, single-point, two-point |
| Mutation | Adaptive feasible, Gaussian, uniform |
| Plotting | Average Pareto distance, average Pareto spread, distance among individuals, diversity of population, expectation of individuals, Pareto front, rank histogram, selection index, stopping conditions |

Global Optimization Toolbox also lets you specify:

- Population size
- Crossover fraction
- Pareto fraction
- Distance measure across individuals
- Migration among subpopulations (using ring topology)
- Linear and bound constraints for an optimization problem

You can customize these algorithm options by providing user-defined functions and represent the problem in a variety of data formats, for example by defining variables that are integer, mixed-integer, categorical, or complex.

You can base the stopping criteria for the algorithm on time, fitness limit, or number of generations. And you can vectorize your fitness function to improve execution speed or execute the objective functions in parallel (using Parallel Computing Toolbox).
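A hedged sketch of a two-objective run with `gamultiobj`: the pair of quadratic objectives is an illustrative assumption, and the returned `fvals` rows are points on the computed Pareto front.

```matlab
% Two illustrative competing objectives; the fitness function returns
% a row vector with one entry per objective.
fitness = @(x) [x(1)^2 + x(2)^2, (x(1) - 2)^2 + (x(2) - 1)^2];
opts = optimoptions('gamultiobj', ...
    'ParetoFraction', 0.5, ...
    'PlotFcn', @gaplotpareto);   % draw the Pareto front as it evolves
[x, fvals] = gamultiobj(fitness, 2, [], [], [], [], ...
    [-5, -5], [5, 5], opts);
```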

Global Optimization Toolbox contains three direct search algorithms: generalized pattern search (GPS), generating set search (GSS), and mesh adaptive direct search (MADS). While more traditional optimization algorithms use exact or approximate information about the gradient or higher derivatives to search for an optimal point, these algorithms use a pattern search method that implements a minimal and maximal positive basis pattern. The pattern search method handles optimization problems with nonlinear, linear, and bound constraints, and does not require functions to be differentiable or continuous.

The following table shows the pattern search algorithm options provided by Global Optimization Toolbox. You can change any of the options from the command line or the Optimization app.

| Pattern Search Option | Description |
| --- | --- |
| Polling methods | Decide how to generate and evaluate the points in a pattern and the maximum number of points generated at each step. You can also control the polling order of the points to improve efficiency. |
| Search methods | Choose an optional search step that may be more efficient than a poll step. You can perform a search in a pattern or in the entire search space. Global search methods, like the genetic algorithm, can be used to obtain a good starting point. |
| Mesh | Control how the pattern changes over iterations and adjusts the mesh for problems that vary in scale across dimensions. You can choose the initial mesh size, mesh refining factor, or mesh contraction factor. The mesh accelerator speeds up convergence when the current point is near a minimum. |
| Cache | Store points evaluated during optimization of computationally expensive objective functions. You can specify the size and tolerance of the cache that the pattern search algorithm uses and vary the cache tolerance as the algorithm proceeds, improving optimization speed and efficiency. |
| Nonlinear constraint algorithm settings | Specify a penalty parameter for the nonlinear constraints as well as a penalty update factor. |
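Several of the mesh, cache, and polling options in the table correspond directly to `optimoptions('patternsearch', ...)` name-value pairs; the objective below is an illustrative assumption:

```matlab
% Pattern search with explicit mesh settings, caching enabled, and
% successful poll directions evaluated first.
opts = optimoptions('patternsearch', ...
    'InitialMeshSize', 1, ...
    'MeshExpansionFactor', 2, ...
    'MeshContractionFactor', 0.5, ...
    'Cache', 'on', ...
    'PollOrderAlgorithm', 'Success');
[x, fval] = patternsearch(@(x) norm(x - [3, 1])^2, [0, 0], ...
    [], [], [], [], [-10, -10], [10, 10], [], opts);
```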

Simulated annealing solves optimization problems using a probabilistic search algorithm that mimics the physical process of annealing, in which a material is heated and then the temperature is slowly lowered to decrease defects, thus minimizing the system energy. By analogy, each iteration of a simulated annealing algorithm seeks to improve the current minimum by slowly reducing the extent of the search.

The simulated annealing algorithm accepts all new points that lower the objective, but also, with a certain probability, points that raise the objective. By accepting points that raise the objective, the algorithm avoids being trapped in local minima in early iterations and is able to explore globally for better solutions.

Simulated annealing allows you to solve unconstrained or bound-constrained optimization problems and does not require that the functions be differentiable or continuous. From the command line or Optimization app you can use toolbox functions to:

- Solve problems using adaptive simulated annealing, Boltzmann annealing, or fast annealing algorithms
- Create custom functions to define the annealing process, acceptance criteria, temperature schedule, plotting functions, simulation output, or custom data types
- Perform hybrid optimization by specifying another optimization method to run at defined intervals or at normal solver termination
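The points above can be sketched with `simulannelbnd`'s documented options: a Boltzmann temperature schedule via the built-in `temperatureboltz` function and a hybrid `fmincon` polish at termination. The test objective `dejong5fcn` ships with the toolbox; the initial temperature value is an illustrative choice.

```matlab
% Simulated annealing on a bound-constrained problem with a Boltzmann
% temperature schedule and an fmincon refinement when the solver stops.
opts = optimoptions('simulannealbnd', ...
    'InitialTemperature', 50, ...
    'TemperatureFcn', @temperatureboltz, ...
    'HybridFcn', @fmincon);
[x, fval] = simulannealbnd(@dejong5fcn, [0, 0], ...
    [-64, -64], [64, 64], opts);
```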

You can use Global Optimization Toolbox in conjunction with Parallel Computing Toolbox to solve problems that benefit from parallel computation. By using built-in parallel computing capabilities or defining a custom parallel computing implementation of your optimization problem, you can decrease the time to solution.

Built-in support for parallel computing accelerates the objective and constraint function evaluation in genetic algorithm, multiobjective genetic algorithm, and pattern search solvers. You can accelerate the multistart solver by distributing the multiple local solver calls across multiple MATLAB workers or by enabling the parallel gradient estimation in the local solvers.

A custom parallel computing implementation involves explicitly defining the optimization problem to use parallel computing functionality. You can define either your objective function or constraint function to use parallel computing, letting you decrease the time required to evaluate the objective or constraint.
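The built-in route reduces to a single option; a minimal sketch, assuming Parallel Computing Toolbox is installed and a parallel pool is available:

```matlab
% Built-in parallel evaluation: fitness values for the population are
% computed across the workers of the current parallel pool.
opts = optimoptions('ga', 'UseParallel', true);
[x, fval] = ga(@rastriginsfcn, 2, [], [], [], [], ...
    [-5, -5], [5, 5], [], opts);
```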