Generic minimisation function for easy access to all of the optimisation algorithms.
This file is part of the minfx optimisation library.
Variables:
    SA_flag = True
Imports: match, search, bfgs, cauchy_point, coordinate_descent, dogleg, MinfxError, exact_trust_region, fletcher_reeves, hestenes_stiefel, levenberg_marquardt, log_barrier_function, method_of_multipliers, ncg, newton, polak_ribiere, polak_ribiere_plus, simplex, steepest_descent, steihaug, anneal
Generic minimisation function. This is a generic function which can be used to access all minimisers using the same set of function arguments. These arguments are the function tolerance value for convergence tests, the maximum number of iterations, a flag specifying which data structures should be returned, and a flag specifying the amount of detail to print to screen.

Minimisation output

The value of the 'full_output' flag determines which data structures are returned, in tuple form, by the function.
Minimisation algorithms

A minimisation function is selected if the minimisation algorithm argument, which should be a string, matches a certain pattern. Because the Python regular expression 'match' function is used, various strings can be supplied to select the same minimisation algorithm. Below is a list of the minimisation algorithms available together with the corresponding patterns. For more information on regular expressions, see the regular expression syntax section of the Python Library Reference. Some of the regular expression syntax used in this function is:

    '[]' - a set of characters, any one of which matches a single character. For example, '[Nn]ewton' will match both 'Newton' and 'newton'.
    '^' - match the start of the string.
    '$' - match the end of the string.
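As an illustration of how this pattern matching behaves, the following sketch uses only the standard library 're' module with a handful of the patterns from the tables below. It is illustrative only, not minfx's actual selection code, but the matching principle (testing the algorithm string with 'match') is the same:

```python
import re

# A few of the algorithm patterns documented on this page.  The first
# pattern that matches the supplied string selects the algorithm.
ALGORITHMS = [
    (r'^[Cc][Dd]$', 'coordinate descent'),
    (r'^[Cc]oordinate[ _-][Dd]escent$', 'coordinate descent'),
    (r'^[Bb][Ff][Gg][Ss]$', 'BFGS'),
    (r'^[Nn]ewton$', 'Newton'),
    (r'^[Ss][Aa]$', 'simulated annealing'),
    (r'^[Ss]imulated [Aa]nnealing$', 'simulated annealing'),
]

def select_algorithm(min_algor):
    """Return the name of the first algorithm whose pattern matches."""
    for pattern, name in ALGORITHMS:
        if re.match(pattern, min_algor):
            return name
    raise ValueError("Unknown minimisation algorithm: %r" % (min_algor,))
```

Here 'CD', 'Coordinate descent', and 'coordinate_descent' all select the same minimiser, since '[Cc]' matches either letter case and '[ _-]' matches a space, underscore, or hyphen.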
To select a minimisation algorithm, set the argument to a string which matches the given pattern.

Unconstrained line search methods:

    Minimisation algorithm              Patterns
    ----------------------------------  ----------------------------------------------------
    Back-and-forth coordinate descent   '^[Cc][Dd]$' or '^[Cc]oordinate[ _-][Dd]escent$'
    Steepest descent                    '^[Ss][Dd]$' or '^[Ss]teepest[ _-][Dd]escent$'
    Quasi-Newton BFGS                   '^[Bb][Ff][Gg][Ss]$'
    Newton                              '^[Nn]ewton$'
    Newton-CG                           '^[Nn]ewton[ _-][Cc][Gg]$' or '^[Nn][Cc][Gg]$'

Unconstrained trust-region methods:

    Minimisation algorithm              Patterns
    ----------------------------------  ----------------------------------------------------
    Cauchy point                        '^[Cc]auchy'
    Dogleg                              '^[Dd]ogleg'
    CG-Steihaug                         '^[Cc][Gg][-_ ][Ss]teihaug' or '^[Ss]teihaug'
    Exact trust region                  '^[Ee]xact'

Unconstrained conjugate gradient methods:

    Minimisation algorithm              Patterns
    ----------------------------------  ----------------------------------------------------
    Fletcher-Reeves                     '^[Ff][Rr]$' or '^[Ff]letcher[-_ ][Rr]eeves$'
    Polak-Ribiere                       '^[Pp][Rr]$' or '^[Pp]olak[-_ ][Rr]ibiere$'
    Polak-Ribiere +                     '^[Pp][Rr]\+$' or '^[Pp]olak[-_ ][Rr]ibiere\+$'
    Hestenes-Stiefel                    '^[Hh][Ss]$' or '^[Hh]estenes[-_ ][Ss]tiefel$'

Miscellaneous unconstrained methods:

    Minimisation algorithm              Patterns
    ----------------------------------  ----------------------------------------------------
    Simplex                             '^[Ss]implex$'
    Levenberg-Marquardt                 '^[Ll][Mm]$' or '^[Ll]evenburg-[Mm]arquardt$'

Constrained methods:

    Minimisation algorithm              Patterns
    ----------------------------------  ----------------------------------------------------
    Method of Multipliers               '^[Mm][Oo][Mm]$' or '[Mm]ethod of [Mm]ultipliers$'
    Logarithmic barrier function        'Log barrier'

Global minimisation methods:

    Minimisation algorithm              Patterns
    ----------------------------------  ----------------------------------------------------
    Simulated Annealing                 '^[Ss][Aa]$' or '^[Ss]imulated [Aa]nnealing$'

Minimisation options

The minimisation options can be given in any order.

Line search algorithms. These are used in the line search methods and the conjugate gradient methods. The default is the Backtracking line search.
The algorithms are:

    Line search algorithm                                      Patterns
    ---------------------------------------------------------  ------------------------------------------------
    Backtracking line search                                   '^[Bb]ack'
    Nocedal and Wright interpolation based line search         '^[Nn][Ww][Ii]' or '^[Nn]ocedal[ _][Ww]right[ _][Ii]nt'
    Nocedal and Wright line search for the Wolfe conditions    '^[Nn][Ww][Ww]' or '^[Nn]ocedal[ _][Ww]right[ _][Ww]olfe'
    More and Thuente line search                               '^[Mm][Tt]' or '^[Mm]ore[ _][Tt]huente$'
    No line search                                             '^[Nn]o [Ll]ine [Ss]earch$'

Hessian modifications. These are used in the Newton, Dogleg, and Exact trust region algorithms:

    Hessian modification                                       Patterns
    ---------------------------------------------------------  ------------------------------------------------
    Unmodified Hessian                                         '^[Nn]o [Hh]essian [Mm]od'
    Eigenvalue modification                                    '^[Ee]igen'
    Cholesky with added multiple of the identity               '^[Cc]hol'
    The Gill, Murray, and Wright modified Cholesky algorithm   '^[Gg][Mm][Ww]$'
    The Schnabel and Eskow 1999 algorithm                      '^[Ss][Ee]99'

Hessian type. These are used in a few of the trust region methods, including the Dogleg and Exact trust region algorithms. In these cases, when the Hessian type is set to Newton, a Hessian modification can also be supplied as above.
The default Hessian type is Newton, and the default Hessian modification when Newton is selected is the GMW algorithm:

    Hessian type                        Patterns
    ----------------------------------  ----------------------------------------------------
    Quasi-Newton BFGS                   '^[Bb][Ff][Gg][Ss]$'
    Newton                              '^[Nn]ewton$'

For Newton minimisation, the default line search algorithm is the More and Thuente line search, while the default Hessian modification is the GMW algorithm.
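To make the argument conventions concrete, here is a minimal sketch of a target prepared for such a minimiser: the two-dimensional Rosenbrock function with its analytic gradient and Hessian, each taking the parameter vector as the first argument. The commented-out call at the end shows one plausible shape of the invocation; the argument names there are assumptions based on this page's description and are not verified against the minfx API, so check the generic_minimise signature of your minfx version before use.

```python
# The 2D Rosenbrock function f(x) = 100*(x1 - x0^2)^2 + (1 - x0)^2,
# with its analytic gradient and Hessian.  Its minimum is at x = (1, 1).
def func(x):
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def dfunc(x):
    return [-400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
            200.0 * (x[1] - x[0]**2)]

def d2func(x):
    return [[1200.0 * x[0]**2 - 400.0 * x[1] + 2.0, -400.0 * x[0]],
            [-400.0 * x[0], 200.0]]

# Hedged sketch of the call (argument names are assumptions, not verified
# against the minfx API):
#
#   from minfx.generic import generic_minimise
#   results = generic_minimise(func=func, dfunc=dfunc, d2func=d2func,
#                              x0=[-1.2, 1.0], min_algor='Newton',
#                              min_options=('More Thuente', 'GMW'),
#                              func_tol=1e-25, maxiter=10000,
#                              full_output=1)
```

Note that the two strings in min_options select a line search algorithm and a Hessian modification via the patterns tabulated above, and per this page they could be given in either order.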
Generated by Epydoc 3.0.1 on Wed Apr 10 13:30:30 2013. http://epydoc.sourceforge.net