E05 Class

Global optimization involves finding the absolute maximum or minimum value of a function (the objective function) of several variables, possibly subject to restrictions (defined by a set of bounds or constraint functions) on the values of the variables. Such problems can be much harder to solve than local optimization problems (which are discussed in the E04 class) because it is difficult to determine whether a potential optimum found is global, and because of the nonlocal methods required to avoid becoming trapped near local optima. Most optimization methods in the nAG Library are concerned with function minimization only, since the problem of maximizing a given objective function F is equivalent to minimizing -F. In this chapter, you may specify whether you are solving a minimization or maximization problem; in the latter case, the required transformation of the objective function will be carried out automatically. In what follows we refer exclusively to minimization problems.
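For illustration only, the following hypothetical C# sketch (the MaxToMin wrapper and the example objective are invented here, not part of the library) shows the transformation in question: handing -F to any minimizer solves the maximization problem for F.

C#
using System;

class MaxToMin
{
    static void Main()
    {
        // Objective to be maximized: F(x) = 5 - (x - 2)^2, with its maximum at x = 2.
        Func<double, double> F = x => 5.0 - (x - 2.0) * (x - 2.0);

        // Minimizing -F is equivalent to maximizing F: a minimizer applied to
        // negF locates the maximizer of F.
        Func<double, double> negF = x => -F(x);

        // The minimum of -F equals minus the maximum of F.
        Console.WriteLine(negF(2.0)); // prints -5
    }
}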
This introduction is a brief guide to the subject of global optimization, designed for the casual user. For further details you may find it beneficial to consult a more detailed text, such as Neumaier (2004). Furthermore, much of the material in the E04 class is relevant in this context also. In particular, it is strongly recommended that you read [Section Scaling] in the E04 class Chapter Introduction.

Syntax

C#
public static class E05
Visual Basic (Declaration)
Public NotInheritable Class E05
Visual C++
public ref class E05 abstract sealed
F#
[<AbstractClassAttribute>]
[<SealedAttribute>]
type E05 =  class end

Background to the Problems

Problem Formulation

Terminology

Complete Methods

A global optimization algorithm is called asymptotically complete if
(i) assuming indefinitely long run-time and exact computations, a global minimum will be found with certainty (probability one), but
(ii) the algorithm has no way of knowing when a global minimizer has been found.
In comparison, a complete method satisfies (i) as well as
(ii(a)) the algorithm is able to recognize a global minimum (to prescribed tolerances) within a finite amount of time.
It is important to appreciate that bounds on the amount of work needed to find a solution exactly may be very pessimistic. What complete methods guarantee is the absence of any deficiency that would prevent a global minimum from eventually being found. To achieve termination with certainty in a finite amount of time, the algorithm requires access to global information about the problem. Where only function values are available, as in e05jb, heuristic stopping criteria are used instead, because such a class of method can only terminate with certainty by performing an expensive dense search.
In contrast, incomplete methods have intuitive heuristics for searching but offer no guarantee against getting stuck near nonglobal local minima. To make incomplete methods efficient, expert knowledge of the particular problem class to be solved is often required. Examples of incomplete methods are genetic algorithms and simulated annealing.
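To make the distinction concrete, here is a minimal, hypothetical C# sketch of simulated annealing on a one-variable objective (all names and constants are illustrative, and nothing here is a method of this chapter). It has plausible search heuristics yet no certificate of global optimality: it may finish near a nonglobal local minimum.

C#
using System;

class AnnealSketch
{
    static void Main()
    {
        var rng = new Random(0);
        // A one-variable objective with many local minima.
        Func<double, double> f = x => Math.Sin(3.0 * x) + 0.1 * x * x;

        double x = 5.0, fx = f(x), temp = 1.0;
        for (int k = 0; k < 10000; k++)
        {
            double xNew = x + (rng.NextDouble() - 0.5);  // random neighbour
            double fNew = f(xNew);
            // Always accept downhill moves; accept uphill moves with
            // probability exp(-(fNew - fx) / temp).
            if (fNew < fx || rng.NextDouble() < Math.Exp((fx - fNew) / temp))
            {
                x = xNew;
                fx = fNew;
            }
            temp *= 0.999;  // cool; no guarantee the global minimum is reached
        }
        Console.WriteLine($"x = {x}, f(x) = {fx}");
    }
}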

Branching

Methods of Global Optimization

Multi-level Coordinate Search

The method e05jb searches for a global minimizer using branching to recursively split the search space in a nonuniform manner. It divides, or splits, the root box of the search into smaller sub-boxes. Each sub-box contains a distinguished basepoint at which the objective function is sampled. We shall sometimes say ‘the function value of the (sub)box’ as shorthand for ‘the function value of the basepoint of the (sub)box’. The splitting procedure biases the search in favour of those sub-boxes where low function values are expected.
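As a minimal sketch of the bookkeeping just described (the SubBox type and its field names are invented for illustration; they are not the library's internal representation), a sub-box can be modelled as its bounds together with its basepoint, sampled function value and level:

C#
using System;

// Hypothetical representation of an MCS-style sub-box; all names are illustrative.
class SubBox
{
    public double[] Lower;      // lower bounds of the box
    public double[] Upper;      // upper bounds of the box
    public double[] Basepoint;  // distinguished point at which f has been sampled
    public double FValue;       // 'the function value of the box' = f(Basepoint)
    public int Level;           // 0 = already split; smax = will be split no further

    public SubBox(double[] lower, double[] upper, double[] basepoint,
                  double fValue, int level)
    {
        Lower = lower; Upper = upper; Basepoint = basepoint;
        FValue = fValue; Level = level;
    }
}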
The global part of the algorithm entails splitting sub-boxes that enclose large unexplored territory, while the local part of the algorithm entails splitting sub-boxes that have good function values. A balance between the global and local part is achieved using a multi-level approach, where every sub-box is assigned a level s ∈ {0, 1, …, smax}. You can control the value of smax using the optional parameter Splits Limit. Whenever a sub-box of intermediate level 0 < s < smax is split, each descendant will be given the level s+1 or min(s+2, smax), and the original sub-box's level is set to 0: a sub-box with level 0 has already been split; a sub-box with level smax will be split no further. This entire process is described in more detail in [Section Initialization and Sweeps] in e05jb, where the initialization procedure used to produce an initial set of sub-boxes is outlined, and the method by which the algorithm sweeps through levels is discussed. Each sweep starts with the sub-boxes at the lowest level, a process thus forming the global part of the algorithm. At each level the sub-box with the best function value is selected for splitting; this forms the local part of the algorithm.
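The level rules above can be sketched as follows, reusing the hypothetical SubBox type from the previous sketch (the constant Smax stands in for the value of the Splits Limit optional parameter; the value 5 is arbitrary):

C#
using System;
using System.Collections.Generic;
using System.Linq;

class LevelSketch
{
    const int Smax = 5;  // stands in for Splits Limit; illustrative value

    // After splitting a parent of intermediate level s (0 < s < Smax), each
    // descendant receives level s + 1 or min(s + 2, Smax).
    static int ChildLevel(int s, bool promoteTwice) =>
        promoteTwice ? Math.Min(s + 2, Smax) : s + 1;

    // The local part of a sweep: at a given level, select the sub-box with
    // the best (lowest) function value for splitting.
    static SubBox BestAtLevel(IEnumerable<SubBox> boxes, int level) =>
        boxes.Where(b => b.Level == level)
             .OrderBy(b => b.FValue)
             .FirstOrDefault();
}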
The process by which sub-boxes are split is explained in [Section Splitting] in e05jb. It is a variant of the standard coordinate search method: the solver splits along a single coordinate at a time, at adaptively chosen points. In most cases one new function evaluation is needed to split a sub-box into two or three children. Each child is given a basepoint chosen to differ from the basepoint of the parent in at most one coordinate, and safeguards are present to ensure a degree of symmetry in the splits.
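A simplified sketch of one such split, again using the hypothetical SubBox type (e05jb chooses the coordinate i and the split point t adaptively, which is not modelled here; only the two-child case is shown):

C#
using System;

class SplitSketch
{
    // Split 'parent' along coordinate i at t, with Lower[i] < t < Upper[i].
    static (SubBox left, SubBox right) SplitAt(SubBox parent, int i, double t,
                                               Func<double[], double> f)
    {
        var upLeft = (double[])parent.Upper.Clone();  upLeft[i] = t;
        var loRight = (double[])parent.Lower.Clone(); loRight[i] = t;

        // One child inherits the parent's basepoint (and its known function
        // value); the other gets a basepoint differing only in coordinate i,
        // so the split costs one new function evaluation.
        bool parentOnLeft = parent.Basepoint[i] <= t;
        var bpNew = (double[])parent.Basepoint.Clone();
        bpNew[i] = parentOnLeft ? 0.5 * (t + parent.Upper[i])
                                : 0.5 * (parent.Lower[i] + t);
        double fNew = f(bpNew);

        // For simplicity both children get level s + 1 here; as noted above,
        // e05jb may assign min(s + 2, smax) to a descendant instead.
        int childLevel = parent.Level + 1;
        var left = parentOnLeft
            ? new SubBox(parent.Lower, upLeft, parent.Basepoint, parent.FValue, childLevel)
            : new SubBox(parent.Lower, upLeft, bpNew, fNew, childLevel);
        var right = parentOnLeft
            ? new SubBox(loRight, parent.Upper, bpNew, fNew, childLevel)
            : new SubBox(loRight, parent.Upper, parent.Basepoint, parent.FValue, childLevel);

        parent.Level = 0;  // level 0 marks an already-split box
        return (left, right);
    }
}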
If you set the optional parameter Local Searches to ‘OFF’, the basepoints and function values of sub-boxes of maximum level smax are put into a ‘shopping basket’ of candidate minima. Turning Local Searches ‘ON’ (the default setting) enables local searches to be started from these basepoints before they go into the shopping basket. The local search goes ahead provided that the basepoint is not likely to be in the basin of attraction of a previously found local minimum. The search itself uses a trust-region approach and is explained in [Section Local Search] in e05jb: local quadratic models are built by a triple search, then a linesearch is made along the direction obtained by minimizing the quadratic on a region where it is a good approximation to the objective function. The quadratic need not be positive definite, so the general nonlinear optimizer e04vh is used to minimize the model.
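The shopping-basket logic might be sketched like this (the basin-of-attraction test below is a crude distance-based placeholder, not the test e05jb actually uses, and the localSearch delegate stands in for the trust-region search described above):

C#
using System;
using System.Collections.Generic;
using System.Linq;

class BasketSketch
{
    static readonly List<double[]> Basket = new List<double[]>();  // candidate minima

    static void Process(SubBox box, bool localSearchesOn,
                        Func<double[], double[]> localSearch, double basinRadius)
    {
        var point = box.Basepoint;
        // With Local Searches 'ON', refine the basepoint first, unless it
        // appears to lie in the basin of a minimum found previously.
        if (localSearchesOn && !NearKnownMinimum(point, basinRadius))
            point = localSearch(point);
        Basket.Add(point);
    }

    // Placeholder basin test: Euclidean distance to existing candidates.
    static bool NearKnownMinimum(double[] x, double radius) =>
        Basket.Any(m => Math.Sqrt(m.Zip(x, (a, b) => (a - b) * (a - b)).Sum()) < radius);
}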

Recommendations on Choice and Use of Available Methods

e05jb is based on the multi-level coordinate search (MCS) method of Huyer and Neumaier (1999). It is an asymptotically complete method for bound constrained problems based on local information (function values) only, employing branching and local searches to accelerate convergence.
Currently there is no method in this chapter for handling constraints that are not bound constraints.

References

Huyer W and Neumaier A (1999) Global optimization by multilevel coordinate search Journal of Global Optimization 14 331–355
Neumaier A (2004) Complete search in continuous global optimization and constraint satisfaction Acta Numerica 13 271–369

Inheritance Hierarchy

System.Object
  NagLibrary.E05

See Also