Re: r25491 - /trunk/specific_analyses/relax_disp/data.py



Posted by Edward d'Auvergne on September 01, 2014 - 12:43:
On 1 September 2014 12:34, Troels Emtekær Linnet <tlinnet@xxxxxxxxxxxxx> wrote:

> Anyway, before minfx can handle constraints in, for example, BFGS,
> this is just a waste of time.

Minfx can do this :)  The log-barrier constraint algorithm works with
all optimisation techniques in minfx, well, apart from the grid search
(https://en.wikipedia.org/wiki/Barrier_function#Logarithmic_barrier_function).
And if gradients are supplied, the more powerful Method of Multipliers
algorithm can also be used in combination with all optimisation
techniques (https://en.wikipedia.org/wiki/Augmented_Lagrangian_method).
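
To sketch the log-barrier idea itself (this is deliberately not the
minfx interface, just the technique; the toy objective, constraint, and
barrier schedule below are invented for illustration):

    import numpy as np

    # Toy objective: minimise (x0 - 2)^2 + (x1 - 1)^2.
    def func(x):
        return (x[0] - 2.0)**2 + (x[1] - 1.0)**2

    def dfunc(x):
        return np.array([2.0*(x[0] - 2.0), 2.0*(x[1] - 1.0)])

    # Toy inequality constraint, feasible iff constraint(x) > 0.
    def constraint(x):
        return 2.0 - x[0] - x[1]

    def dconstraint(x):
        return np.array([-1.0, -1.0])

    # Barrier-augmented objective: func(x) - mu*log(c(x)), infinite
    # outside the feasible region.
    def barrier(x, mu):
        c = constraint(x)
        return np.inf if c <= 0.0 else func(x) - mu*np.log(c)

    def barrier_grad(x, mu):
        return dfunc(x) - mu/constraint(x) * dconstraint(x)

    # Gradient descent with backtracking; infeasible trial points give
    # inf and are simply rejected by the Armijo test.
    def descend(x, mu, steps=200):
        for _ in range(steps):
            g = barrier_grad(x, mu)
            t = 1.0
            while barrier(x - t*g, mu) > barrier(x, mu) - 0.5*t*(g @ g):
                t *= 0.5
            x = x - t*g
        return x

    # Outer loop: shrink the barrier weight, warm-starting each solve.
    x = np.array([0.0, 0.0])
    for mu in [1.0, 0.1, 0.01, 1e-3]:
        x = descend(x, mu)
    print(x)  # Approaches the constrained optimum (1.5, 0.5).

The same barrier trick is what lets a purely unconstrained optimiser
such as BFGS respect the constraints.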


> I think there will be a 10x speed up, just for the Jacobian.

For the analytic models, you could have a 10x speed up if symbolic
gradients and Hessians are implemented.  I'm guessing that's what you
mean.
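
For example, a sketch of generating the derivatives symbolically
(using a toy exponential rather than a real dispersion model):

    import sympy as sp

    # Toy model: I(t) = I0 * exp(-R*t), with parameters I0 and R.
    I0, R, t = sp.symbols('I0 R t', positive=True)
    model = I0 * sp.exp(-R*t)

    params = (I0, R)
    grad = [sp.diff(model, p) for p in params]              # dI/dI0, dI/dR
    hess = [[sp.diff(g, p) for p in params] for g in grad]  # 2x2 Hessian

    # Compile once to fast numpy functions for repeated evaluation.
    grad_fn = sp.lambdify((I0, R, t), grad, 'numpy')
    hess_fn = sp.lambdify((I0, R, t), hess, 'numpy')

    print(grad)                    # [exp(-R*t), -I0*t*exp(-R*t)]
    print(grad_fn(1.0, 2.0, 0.5))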


> And when you have the Jacobian, estimating the errors is trivial:
>
> std(q) = sqrt( (dq/dx * std(x))^2 + (dq/dz * std(z))^2 )

:S  I'm not sure about this estimate.  It looks rather too linear.  I
wish errors were so simple.


> where q is the function, and x and z are R1 and R1rho_prime.

> So, until then, implementing the Jacobian is only useful for testing
> the error estimation against Monte Carlo simulations.
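
To make the quoted estimate concrete, here is a sketch with numerical
partial derivatives; the function q and all of the values are invented
stand-ins for R1 and R1rho_prime:

    import numpy as np

    # Invented function q(x, z) and invented values and errors.
    def q(x, z):
        return x*np.cos(0.5)**2 + z*np.sin(0.5)**2

    x, z = 1.5, 12.0          # Parameter values.
    std_x, std_z = 0.05, 0.4  # Parameter errors.

    # Central-difference partial derivatives.
    h = 1e-6
    dq_dx = (q(x + h, z) - q(x - h, z)) / (2*h)
    dq_dz = (q(x, z + h) - q(x, z - h)) / (2*h)

    # First-order (linear) error propagation, as in the quoted formula.
    std_q = np.sqrt((dq_dx*std_x)**2 + (dq_dz*std_z)**2)
    print(std_q)

Note that this assumes uncorrelated parameters and local linearity,
which is exactly the reservation above; Monte Carlo simulations need
neither assumption.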

If you do add the equations, the lib.dispersion.dpl94 module would be
the natural place to put them, with the interface as dfunc_DPL94(),
d2func_DPL94(), and jacobian_DPL94().
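
As a purely hypothetical sketch of what jacobian_DPL94() might look
like (the signature and parameter ordering are assumptions, R1 is
treated as fixed, and the partials are taken from the DPL94 equation
R1rho = R1*cos(theta)^2 + (R1rho' + phi_ex*kex/(kex^2 + omega_e^2))*sin(theta)^2):

    import numpy as np

    def jacobian_DPL94(r1rho_prime, phi_ex, kex, theta, omega_e):
        """Hypothetical DPL94 Jacobian sketch, not the relax API.

        theta is the rotating-frame tilt angle and omega_e the
        effective field, both arrays over the dispersion points.  The
        parameter ordering is assumed to be (R1rho', phi_ex, kex).
        """
        sin2 = np.sin(theta)**2
        denom = kex**2 + omega_e**2

        d_r1rho_prime = sin2
        d_phi_ex = sin2 * kex / denom
        d_kex = sin2 * phi_ex * (omega_e**2 - kex**2) / denom**2

        # One row per dispersion point, one column per parameter.
        return np.stack([d_r1rho_prime, d_phi_ex, d_kex], axis=-1)

The derivatives could equally be generated with sympy, as sketched
above, which avoids hand-derivation slips.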

Regards,

Edward


