## The minfx package

To minimise target functions within relax, the minfx optimisation library is used (https://sourceforge.net/projects/minfx/). This Python package is bundled with the official relax distribution archives. If you are using a version of relax checked out directly from the source code repository, you will need to install minfx manually as a standard Python package.

The minfx library originated as one of relax's packages, but it has been spun off as its own project for the benefit of other scientific, analytical, or numerical projects. Minfx is complete, very stable, and well tested. Numerous optimisation algorithms are supported; these can be grouped into three major categories: the line search methods, the trust-region methods, and the conjugate gradient methods.

The supported line search methods include:

• Steepest descent,
• Back-and-forth coordinate descent,
• Quasi-Newton BFGS,
• Newton,
• Newton-CG.
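
To illustrate the general shape of a line search method, here is a minimal, self-contained sketch of the simplest one, steepest descent, paired with a backtracking line search. This is an illustration only, not the minfx implementation, and the test function is an arbitrary 2D quadratic chosen for the example:

```python
def f(x):
    """Quadratic test function with its minimum at (1.0, -2.0)."""
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

def grad(x):
    """Analytic gradient of f."""
    return [2.0 * (x[0] - 1.0), 2.0 * (x[1] + 2.0)]

def steepest_descent(x, tol=1e-8, max_iter=1000):
    for _ in range(max_iter):
        g = grad(x)
        # Terminate once the gradient norm is small.
        if sum(gi * gi for gi in g) ** 0.5 < tol:
            break
        # Search direction: the negative gradient.
        d = [-gi for gi in g]
        # Backtracking line search enforcing the Armijo (sufficient
        # decrease) condition.
        alpha, rho, c = 1.0, 0.5, 1e-4
        gd = sum(gi * di for gi, di in zip(g, d))
        while f([xi + alpha * di for xi, di in zip(x, d)]) > f(x) + c * alpha * gd:
            alpha *= rho
        x = [xi + alpha * di for xi, di in zip(x, d)]
    return x

x_min = steepest_descent([0.0, 0.0])
```

The more sophisticated methods in the list above (BFGS, Newton, Newton-CG) follow the same loop structure but replace the negative gradient with a better search direction.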

The supported trust-region methods include:

• Cauchy point,
• Dogleg,
• CG-Steihaug,
• Exact trust region.
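
Trust-region methods minimise a local quadratic model of the function within a region of radius delta around the current point. The Cauchy point, the simplest of these, steps along the negative gradient to the model minimiser inside the region. The following is a simplified 2D sketch of that step computation (an illustration only, not the minfx code):

```python
def cauchy_point(g, B, delta):
    """Cauchy point for gradient g, 2x2 model Hessian B and trust
    radius delta: the minimiser of the quadratic model along -g,
    restricted to the trust region."""
    gnorm = (g[0] ** 2 + g[1] ** 2) ** 0.5
    Bg = [B[0][0] * g[0] + B[0][1] * g[1],
          B[1][0] * g[0] + B[1][1] * g[1]]
    gBg = g[0] * Bg[0] + g[1] * Bg[1]
    if gBg <= 0.0:
        # Model has non-positive curvature along -g: step to the boundary.
        tau = 1.0
    else:
        tau = min(gnorm ** 3 / (delta * gBg), 1.0)
    scale = -tau * delta / gnorm
    return [scale * g[0], scale * g[1]]

# With an identity Hessian, gradient [4, 0] and radius 1, the full
# boundary step [-1, 0] is taken.
p = cauchy_point([4.0, 0.0], [[1.0, 0.0], [0.0, 1.0]], 1.0)
```

The dogleg, CG-Steihaug, and exact methods improve on this step by also using curvature information away from the steepest descent direction.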

The supported conjugate gradient methods include:

• Fletcher-Reeves,
• Polak-Ribière,
• Polak-Ribière+,
• Hestenes-Stiefel.
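
The conjugate gradient variants differ only in how the scalar beta, which mixes the previous search direction into the new one, is computed. As a self-contained sketch (not the minfx implementation), here is the Fletcher-Reeves update applied to a quadratic objective with an exact line search:

```python
def fr_cg(A, b, x, tol=1e-10, max_iter=50):
    """Fletcher-Reeves conjugate gradient for f(x) = 0.5 x^T A x - b^T x,
    with A symmetric positive definite."""
    def matvec(M, v):
        return [sum(M[i][j] * v[j] for j in range(len(v)))
                for i in range(len(M))]
    g = [gi - bi for gi, bi in zip(matvec(A, x), b)]  # gradient A x - b
    d = [-gi for gi in g]
    for _ in range(max_iter):
        if sum(gi * gi for gi in g) < tol ** 2:
            break
        Ad = matvec(A, d)
        # Exact line search step for a quadratic.
        alpha = -sum(gi * di for gi, di in zip(g, d)) / \
                sum(di * adi for di, adi in zip(d, Ad))
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = [gi + alpha * adi for gi, adi in zip(g, Ad)]
        # Fletcher-Reeves: beta = |g_new|^2 / |g|^2.
        beta = sum(gi * gi for gi in g_new) / sum(gi * gi for gi in g)
        d = [-gi + beta * di for gi, di in zip(g_new, d)]
        g = g_new
    return x

# Solves the 2x2 system A x = b in at most two iterations.
x_cg = fr_cg([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0], [0.0, 0.0])
```

The Polak-Ribière, Polak-Ribière+, and Hestenes-Stiefel methods substitute different formulas for beta, which behave better on general non-quadratic functions.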

In addition, the following miscellaneous algorithms are implemented:

• Grid search,
• Levenberg-Marquardt.
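
A grid search simply evaluates the target function at every point of a regular grid and keeps the best, which makes it robust for finding starting positions but expensive in many dimensions. A minimal sketch (not the minfx code; the bounds and grid sizes here are arbitrary):

```python
import itertools

def grid_search(func, lower, upper, n_points):
    """Evaluate func on a regular grid defined by per-dimension lower
    bounds, upper bounds and point counts; return the best point."""
    axes = []
    for lo, hi, n in zip(lower, upper, n_points):
        axes.append([lo + i * (hi - lo) / (n - 1) for i in range(n)])
    best_x, best_f = None, float("inf")
    for point in itertools.product(*axes):
        fx = func(point)
        if fx < best_f:
            best_x, best_f = list(point), fx
    return best_x, best_f

# A 7x7 grid over [-3, 3]^2 lands exactly on the minimum at (1, -2).
best_x, best_f = grid_search(
    lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2,
    [-3.0, -3.0], [3.0, 3.0], [7, 7])
```

In practice a coarse grid search of this kind is often used to seed one of the gradient-based methods above.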

The step selection subalgorithms include:

• Backtracking line search,
• Nocedal and Wright interpolation based line search,
• Nocedal and Wright line search for the Wolfe conditions,
• Moré and Thuente line search,
• No line search.
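
Several of these subalgorithms aim to return a step length satisfying the Wolfe conditions: sufficient decrease (Armijo) plus a curvature condition that rules out overly short steps. As an illustration only (not the minfx code), a checker for the weak Wolfe conditions:

```python
def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def wolfe(f, df, x, d, alpha, c1=1e-4, c2=0.9):
    """Check the weak Wolfe conditions for step length alpha along
    search direction d from point x."""
    x_new = [xi + alpha * di for xi, di in zip(x, d)]
    dphi0 = dot(df(x), d)  # initial directional derivative (negative)
    # Sufficient decrease (Armijo) condition.
    sufficient = f(x_new) <= f(x) + c1 * alpha * dphi0
    # Curvature condition: the slope must have flattened enough.
    curvature = dot(df(x_new), d) >= c2 * dphi0
    return sufficient and curvature

# For f(x) = x^2 from x = 1 along d = -1: the full step to the minimum
# satisfies both conditions, while a tiny step fails the curvature one.
ok_full = wolfe(lambda x: x[0] ** 2, lambda x: [2.0 * x[0]], [1.0], [-1.0], 1.0)
ok_tiny = wolfe(lambda x: x[0] ** 2, lambda x: [2.0 * x[0]], [1.0], [-1.0], 1e-3)
```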

The Hessian modification subalgorithms include:

• Unmodified Hessian,
• Eigenvalue modification,
• Cholesky with added multiple of the identity,
• The Gill, Murray, and Wright modified Cholesky algorithm,
• The Schnabel and Eskow 1999 algorithm.
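
Hessian modification is needed by the Newton-type methods when the Hessian is not positive definite, since an unmodified step might then not be a descent direction. The "added multiple of the identity" approach can be sketched as follows, here hard-coded to 2x2 matrices for brevity (a simplified illustration; minfx's implementations are more careful about choosing the shift):

```python
def cholesky_2x2(A):
    """Cholesky factor of a 2x2 matrix, or None if not positive definite."""
    if A[0][0] <= 0.0:
        return None
    l11 = A[0][0] ** 0.5
    l21 = A[1][0] / l11
    rem = A[1][1] - l21 * l21
    if rem <= 0.0:
        return None
    return [[l11, 0.0], [l21, rem ** 0.5]]

def add_multiple_of_identity(A, beta=1e-3, max_tries=60):
    """Try tau = 0, beta, 2*beta, 4*beta, ... until A + tau*I admits a
    Cholesky factorisation; return the shifted matrix and tau."""
    tau = 0.0
    for _ in range(max_tries):
        shifted = [[A[0][0] + tau, A[0][1]],
                   [A[1][0], A[1][1] + tau]]
        if cholesky_2x2(shifted) is not None:
            return shifted, tau
        tau = beta if tau == 0.0 else 2.0 * tau
    raise ValueError("no positive definite shift found")

# An indefinite Hessian (eigenvalues 1 and -1) is shifted until it
# becomes positive definite.
H_pd, tau = add_multiple_of_identity([[1.0, 0.0], [0.0, -1.0]])
```

The eigenvalue and modified Cholesky algorithms in the list above achieve the same goal with smaller, better targeted perturbations of the Hessian.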

All methods can be constrained by:

• The Method of Multipliers (also known as the Augmented Lagrangian),
• The logarithmic barrier function.
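
The logarithmic barrier approach converts a constrained problem into a sequence of unconstrained ones by subtracting mu * log(c(x)) from the objective, which diverges at the constraint boundary, and driving mu towards zero. A 1D illustration (not the minfx implementation; the inner ternary-search solver and the test problem are choices made for this sketch):

```python
import math

def barrier_minimise_1d(f, c, x_lo, x_hi, mu0=1.0, shrink=0.1, n_outer=8):
    """Minimise f(x) subject to c(x) > 0 on (x_lo, x_hi) using a
    logarithmic barrier with a decreasing barrier parameter mu.
    The inner solver is a ternary search (assumes phi is unimodal)."""
    x = None
    mu = mu0
    for _ in range(n_outer):
        def phi(t):
            return f(t) - mu * math.log(c(t))
        lo, hi = x_lo, x_hi
        for _ in range(200):
            m1 = lo + (hi - lo) / 3.0
            m2 = hi - (hi - lo) / 3.0
            if phi(m1) < phi(m2):
                hi = m2
            else:
                lo = m1
        x = 0.5 * (lo + hi)
        mu *= shrink  # tighten the barrier
    return x

# Minimise (x - 3)^2 subject to x <= 1, i.e. c(x) = 1 - x > 0.
# The constrained minimum sits on the boundary at x = 1.
x_barrier = barrier_minimise_1d(lambda x: (x - 3.0) ** 2,
                                lambda x: 1.0 - x,
                                -10.0, 1.0 - 1e-9)
```

Note that the barrier keeps every iterate strictly feasible, whereas the Method of Multipliers permits infeasible intermediate iterates and only enforces the constraints in the limit.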

These lists may be out of date, so please see the minfx website for additional information.

The relax user manual (PDF), created 2020-08-26.