24 """Linear inequality constraint functions and gradients.
25
26 The constraints are in the form::
27
28 A.x >= b
29
30 This file is part of the U{minfx optimisation library<https://sourceforge.net/projects/minfx>}.
31 """


from numpy import dot


39 """Class for the creation of linear inequality constraint functions and gradients.

        The constraints are in the form::

            A.x >= b

        where:

            - A is an m*n matrix where the rows are the transposed vectors, ai, of length n.  The elements of ai are the coefficients of the model parameters.
            - x is the vector of model parameters of dimension n.
            - b is the vector of scalars of dimension m.
            - m is the number of constraints.
            - n is the number of model parameters.

        E.g. if 0 <= q <= 1, q >= 1 - 2r, and 0 <= r, then::

            | 1  0 |            |  0 |
            |      |            |    |
            |-1  0 |   | q |    | -1 |
            |      | . |   | >= |    |
            | 1  2 |   | r |    |  1 |
            |      |            |    |
            | 0  1 |            |  0 |
        """


        self.A = A
        self.b = b


70 """The constraint function.
71
72 A vector containing the constraint values is returned.
73 """
74
75 return dot(self.A, x) - self.b


    def dfunc(self, x):
        """The constraint gradient.

        As the inequality constraints are linear, the gradient matrix is constant and equal to the coefficient matrix A.  Therefore this function simply returns the matrix A.
        """

        return self.A
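

# Illustrative usage sketch, not part of the minfx API: build the constraint
# object for the docstring example (0 <= q <= 1, q >= 1 - 2r, 0 <= r) and
# evaluate the constraint values and gradient at a test point (q, r) =
# (0.5, 0.25).  The test point and the variable names A, b, and x are
# arbitrary choices for this demonstration only.
if __name__ == '__main__':
    from numpy import array

    # The coefficient matrix A and vector b from the docstring example.
    A = array([[ 1.0,  0.0],
               [-1.0,  0.0],
               [ 1.0,  2.0],
               [ 0.0,  1.0]])
    b = array([0.0, -1.0, 1.0, 0.0])

    # All constraints are satisfied when every element of A.x - b is >= 0.
    constraint = Constraint_linear(A, b)
    x = array([0.5, 0.25])
    print("Constraint values:  %s" % constraint.func(x))
    print("Constraint gradient:\n%s" % constraint.dfunc(x))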