Method of Multipliers algorithm

One of the most powerful approaches is the Method of Multipliers (Nocedal and Wright, 1999), also known as the Augmented Lagrangian.
Instead of a single optimisation, the algorithm is iterative, with each iteration consisting of an independent unconstrained minimisation on a sequentially modified space.
When the parameter vector is inside the limits the function value is unchanged, but when outside a penalty, proportional to the distance outside the limit, is added to the function value.
This penalty, which is based on the Lagrange multipliers, is smooth and hence the gradient and Hessian are continuous at and beyond the constraints.
At each iteration of the Method of Multipliers the penalty is increased until it becomes impossible for the parameter vector to violate the limits.
This approach allows the parameter vector *θ* to move outside the limits, yet successive iterations ensure that the final result will not violate the constraints.

For inequality constraints, each iteration of the Method of Multipliers attempts to solve the quadratic sub-problem

min_{θ} L_{A}(θ, λ^{k}; μ_{k}) = f(θ) + Σ_{i∈I} Ψ(c_{i}(θ), λ_{i}^{k}; μ_{k}),   (14.11)

where the function *Ψ* is defined as

Ψ(c_{i}(θ), λ_{i}^{k}; μ_{k}) =
    -λ_{i}^{k}c_{i}(θ) + c_{i}(θ)²/(2μ_{k}),   if c_{i}(θ) ≤ μ_{k}λ_{i}^{k},
    -μ_{k}(λ_{i}^{k})²/2,                      otherwise.   (14.12)

In (14.11), *θ* is the parameter vector; L_{A}(θ, λ^{k}; μ_{k}) is the Augmented Lagrangian function; *k* is the current iteration of the Method of Multipliers; *λ*^{k} are the Lagrange multipliers, which are positive factors such that, at the minimum θ̂, ∇f(θ̂) = Σ_{i∈I} λ_{i}∇c_{i}(θ̂); *μ*_{k} > 0 is the penalty parameter which decreases to zero as *k*→∞; I is the set of inequality constraints; and *c*_{i}(*θ*) is an individual constraint value.
The Lagrange multipliers are updated using the formula

λ_{i}^{k+1} = max(λ_{i}^{k} - c_{i}(θ)/μ_{k}, 0),   for all i ∈ I.   (14.13)
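As a sketch, the two-branch penalty function *Ψ* of (14.12) and the multiplier update (14.13) could be written as follows. The function names `psi` and `update_multipliers` are illustrative only and are not part of relax:

```python
import numpy as np

def psi(c_i, lambda_i, mu):
    """Penalty term (14.12) for a single inequality constraint c_i(theta) >= 0."""
    if c_i <= mu * lambda_i:
        # Constraint active or violated: smooth quadratic penalty.
        return -lambda_i * c_i + c_i**2 / (2.0 * mu)
    # Constraint safely satisfied: constant term, contributing no gradient.
    return -mu * lambda_i**2 / 2.0

def update_multipliers(c, lambdas, mu):
    """Update all Lagrange multipliers via (14.13)."""
    return np.maximum(lambdas - c / mu, 0.0)
```

Because both branches of `psi` and their first derivatives agree at `c_i = mu * lambda_i`, the penalty is smooth across the constraint boundary, as the text notes.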

The gradient of the Augmented Lagrangian is

∇L_{A}(θ, λ^{k}; μ_{k}) = ∇f(θ) - Σ_{i∈I : c_{i}(θ) ≤ μ_{k}λ_{i}^{k}} (λ_{i}^{k} - c_{i}(θ)/μ_{k}) ∇c_{i}(θ),   (14.14)

and the Hessian is

∇²L_{A}(θ, λ^{k}; μ_{k}) = ∇²f(θ) + Σ_{i∈I : c_{i}(θ) ≤ μ_{k}λ_{i}^{k}} [ (1/μ_{k}) ∇c_{i}(θ)∇c_{i}(θ)^{T} - (λ_{i}^{k} - c_{i}(θ)/μ_{k}) ∇²c_{i}(θ) ].   (14.15)
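A direct numpy translation of the gradient (14.14) and Hessian (14.15) might look like the sketch below, assuming the objective derivatives `df`, `d2f` and the constraint functions `c`, `dc`, `d2c` follow the interfaces described later in this section (all names are illustrative, not relax's own API):

```python
import numpy as np

def augmented_gradient(theta, lambdas, mu, df, c, dc):
    """Gradient of the Augmented Lagrangian, equation (14.14)."""
    grad = df(theta).copy()
    cv, dcv = c(theta), dc(theta)          # dcv[i] is the gradient of c_i
    for i in range(len(cv)):
        if cv[i] <= mu * lambdas[i]:       # only active terms contribute
            grad -= (lambdas[i] - cv[i] / mu) * dcv[i]
    return grad

def augmented_hessian(theta, lambdas, mu, d2f, c, dc, d2c):
    """Hessian of the Augmented Lagrangian, equation (14.15)."""
    hess = d2f(theta).copy()
    cv, dcv, d2cv = c(theta), dc(theta), d2c(theta)
    for i in range(len(cv)):
        if cv[i] <= mu * lambdas[i]:
            hess += np.outer(dcv[i], dcv[i]) / mu \
                    - (lambdas[i] - cv[i] / mu) * d2cv[i]
    return hess
```

The condition `cv[i] <= mu * lambdas[i]` selects the quadratic branch of *Ψ*; constraints that are comfortably satisfied contribute nothing to either derivative.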

The Augmented Lagrangian algorithm can accept an arbitrary set of three constraint functions: *c*(*θ*),
∇*c*(*θ*), and
∇²*c*(*θ*).
When given the current parameter values, *c*(*θ*) returns a vector of constraint values in which each position corresponds to one of the constraints.
The constraints are defined as
*c*_{i}(*θ*) ≥ 0.
The function
∇*c*(*θ*) returns the matrix of constraint gradients, and
∇²*c*(*θ*) is the constraint Hessian function which should return the 3D matrix of constraint Hessians.
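For instance, a hand-written constraint triple matching this interface, for the two constraints θ₁ ≥ 0 and 1 - θ₁² - θ₂² ≥ 0 (constraints invented purely for illustration), might be:

```python
import numpy as np

def c(theta):
    """Vector of constraint values, one element per constraint."""
    return np.array([theta[0], 1.0 - theta[0]**2 - theta[1]**2])

def dc(theta):
    """Matrix of constraint gradients: one row per constraint,
    one column per parameter."""
    return np.array([[1.0, 0.0],
                     [-2.0 * theta[0], -2.0 * theta[1]]])

def d2c(theta):
    """3D matrix of constraint Hessians: one n x n Hessian per constraint."""
    return np.array([np.zeros((2, 2)),
                     -2.0 * np.eye(2)])
```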

A more specific set of constraints accepted by the Method of Multipliers are bound constraints. These are defined by the function

l ≤ θ ≤ u,   (14.16)

where *l* and *u* are the vectors of lower and upper bounds respectively and *θ* is the parameter vector.
For example, for model-free model *m4*, to place lower and upper bounds on the order parameter and lower bounds on the correlation time and chemical exchange parameters, the vectors are

[0, 0, 0]^{T} ≤ [S², τ_{e}, R_{ex}]^{T} ≤ [1, ∞, ∞]^{T}.   (14.17)

The default setting in the program relax is to use linear constraints which are defined as

A⋅θ ≥ b,   (14.18)
where *A* is an
*m*×*n* matrix where the rows are the transposed vectors *a*_{i} of length *n*; the elements of *a*_{i} are the coefficients of the model parameters; *θ* is the vector of model parameters of dimension *n*; *b* is the vector of scalars of dimension *m*; *m* is the number of constraints; and *n* is the number of model parameters.

Rearranging (14.18), the linear constraint function *c*(*θ*) returns the vector
*A*⋅*θ* - *b*.
Because of the linearity of the constraints the gradient and Hessian are greatly simplified.
The gradient
∇*c*(*θ*) is simply the matrix *A* and the Hessian
∇^{2}*c*(*θ*) is zero.
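To make this concrete, the linear constraint triple can be sketched as below, with *A* and *b* chosen here to encode the *m4* bounds of (14.17); only the finite bounds generate rows, since the infinite upper bounds on τ_{e} and R_{ex} impose no constraint:

```python
import numpy as np

# Rows of A and elements of b encode the m4 bounds as A.theta >= b,
# with theta = [S2, te, Rex].
A = np.array([[ 1.0, 0.0, 0.0],   #  S2  >= 0
              [-1.0, 0.0, 0.0],   # -S2  >= -1, i.e. S2 <= 1
              [ 0.0, 1.0, 0.0],   #  te  >= 0
              [ 0.0, 0.0, 1.0]])  #  Rex >= 0
b = np.array([0.0, -1.0, 0.0, 0.0])

def c(theta):
    """Linear constraint values: A.theta - b."""
    return A @ theta - b

def dc(theta):
    """The gradient of a linear constraint is simply the constant matrix A."""
    return A

def d2c(theta):
    """The Hessian of each linear constraint is the zero matrix."""
    return np.zeros((A.shape[0], len(theta), len(theta)))
```

The simplification the text describes is visible directly: `dc` ignores *θ* entirely, and `d2c` returns all zeros, so the constraint terms add nothing to the Hessian (14.15) beyond the rank-one ∇c ∇c^{T}/μ contributions.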

The relax user manual (PDF), created 2019-03-08.