mailr25285 - /trunk/specific_analyses/relax_disp/estimate_r2eff.py



Posted by tlinnet on August 26, 2014 - 13:12:
Author: tlinnet
Date: Tue Aug 26 13:12:14 2014
New Revision: 25285

URL: http://svn.gna.org/viewcvs/relax?rev=25285&view=rev
Log:
Got the method of 'Steepest descent' to work properly by specifying the Jacobian correctly.

The Jacobian was derived according to the chi2 function.

The key point was to evaluate the two partial derivative arrays over all time points, and then to sum each of the two arrays, before constructing the Jacobian.
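
For illustration, a minimal numpy sketch of that summation (the function name chi2_grad and the bare variable names are hypothetical stand-ins for the class attributes used in estimate_r2eff.py):

    # Sketch of the chi2 gradient for I(t) = i0*exp(-r2eff*t), where chi2 is
    # the sum over time points of ((back_calc - value)/error)**2.
    from numpy import array, exp, sum

    def chi2_grad(params, times, values, errors):
        """Return the summed gradient [d_chi2/d_r2eff, d_chi2/d_i0]."""
        r2eff, i0 = params

        # Per-time-point derivative arrays, one element per time point.
        d_chi2_d_r2eff = 2.0 * i0 * times * (-i0 * exp(-r2eff * times) + values) * exp(-r2eff * times) / errors**2
        d_chi2_d_i0 = -2.0 * (-i0 * exp(-r2eff * times) + values) * exp(-r2eff * times) / errors**2

        # Sum each array over all time points before building the gradient vector.
        return array([sum(d_chi2_d_r2eff), sum(d_chi2_d_i0)])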

This clearly shows the difference between minfx and scipy.optimize.leastsq.

scipy.optimize.leastsq takes as input a function F(x0), which should return the array of weighted differences between the function values and the measured values:
" 1. / self.errors * (self.calc_exp(self.times, *params) - self.values) "

This is an array whose number of elements 'i' corresponds to the number of time points. scipy.optimize.leastsq then internally evaluates the sum of squares -> sum[ (O - E)**2 ], and minimises this. This is the chi2.
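
As a minimal, self-contained sketch of that leastsq call pattern (the data values, starting estimate and the residual function name are made up for illustration only):

    from numpy import array, exp
    from scipy.optimize import leastsq

    def weighted_residuals(params, times, values, errors):
        # One element per time point: (back-calculated - measured) / error.
        r2eff, i0 = params
        return 1. / errors * (i0 * exp(-r2eff * times) - values)

    # Illustrative data (roughly r2eff=2, i0=10) and starting estimate [r2eff, i0].
    times = array([0.0, 0.1, 0.2, 0.4, 0.8])
    values = array([10.0, 8.2, 6.7, 4.5, 2.0])
    errors = array([0.3, 0.3, 0.3, 0.3, 0.3])
    x0 = array([5.0, 10.0])

    # leastsq internally minimises sum(weighted_residuals(...)**2), i.e. the chi2.
    params, ier = leastsq(weighted_residuals, x0, args=(times, values, errors))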

Minfx requires the function to minimise beforehand. So, the "func" should be chi2.

The dfunc and d2func should then be the derivatives of chi2, but all elements in each derivative array must still be summed together.
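
A rough sketch of the corresponding minfx call, reusing the hypothetical chi2_grad() and data arrays from the sketches above (the generic_minimise keyword names below are written from memory and should be taken as an assumption, not as the definitive minfx interface):

    from numpy import exp
    from numpy import sum as npsum
    from minfx.generic import generic_minimise

    def chi2(params, times, values, errors):
        # Scalar chi2: the sum of the squared weighted residuals.
        r2eff, i0 = params
        return npsum(((i0 * exp(-r2eff * times) - values) / errors)**2)

    # minfx minimises the scalar func directly, so dfunc must return the summed
    # gradient (one number per parameter), not a per-time-point array.
    # Without full_output, generic_minimise() is assumed to return just the
    # fitted parameter vector.
    params = generic_minimise(func=chi2, dfunc=chi2_grad, args=(times, values, errors), x0=x0, min_algor='Steepest descent', print_flag=0)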

task #7822 (https://gna.org/task/index.php?7822): Implement user function to estimate R2eff and associated errors for exponential curve fitting.

Modified:
    trunk/specific_analyses/relax_disp/estimate_r2eff.py

Modified: trunk/specific_analyses/relax_disp/estimate_r2eff.py
URL: http://svn.gna.org/viewcvs/relax/trunk/specific_analyses/relax_disp/estimate_r2eff.py?rev=25285&r1=25284&r2=25285&view=diff
==============================================================================
--- trunk/specific_analyses/relax_disp/estimate_r2eff.py        (original)
+++ trunk/specific_analyses/relax_disp/estimate_r2eff.py        Tue Aug 26 13:12:14 2014
@@ -134,18 +134,6 @@
         # Set algorithm.
         self.min_algor = min_algor
 
-        # Newton does not work.
-        #self.min_algor = 'newton'
-
-        # Newton-CG does not work.
-        #self.min_algor = 'Newton-CG'
-
-        # Also not work.
-        #self.min_algor = 'Steepest descent'
-
-        # Also not work.#
-        #self.min_algor = 'Fletcher-Reeves'
-
         # Define if constraints should be used.
         self.constraints = constraints
 
@@ -164,7 +152,6 @@
             self.b = array([   0., -200.,    0.])
 
         else:
-            self.min_algor = 'simplex'
             self.min_options = ()
             self.A = None
             self.b = None
@@ -305,16 +292,15 @@
 
         # Make partial derivative, with respect to r2eff.
         # d_chi2_d_r2eff = 2.0*i0*times*(-i0*exp(-r2eff*times) + values)*exp(-r2eff*times)/errors**2
-        d_chi2_d_r2eff = 2.0 * i0 * self.times * ( -i0 * exp( -r2eff * self.times) + self.values) * exp( -r2eff * self.times ) / self.errors**2
+        d_chi2_d_r2eff = sum( 2.0 * i0 * self.times * ( -i0 * exp( -r2eff * self.times) + self.values) * exp( -r2eff * self.times ) / self.errors**2 )
 
         # Make partial derivative, with respect to i0.
         # d_chi2_d_i0 = -2.0*(-i0*exp(-r2eff*times) + values)*exp(-r2eff*times)/errors**2
-        d_chi2_d_i0 = - 2.0 * ( -i0 * exp( -r2eff * self.times) + self.values) * exp( -r2eff * self.times) / self.errors**2
+        d_chi2_d_i0 = sum ( - 2.0 * ( -i0 * exp( -r2eff * self.times) + self.values) * exp( -r2eff * self.times) / self.errors**2 )
 
         # Define Jacobian as m rows with function derivatives and n columns of parameters.
-        jacobian_matrix = transpose(array( [d_chi2_d_r2eff , d_chi2_d_i0] ) )
-
-        #print jacobian_matrix
+        #jacobian_matrix = transpose(array( [d_chi2_d_r2eff , d_chi2_d_i0] ) )
+        jacobian_matrix = array( [d_chi2_d_r2eff , d_chi2_d_i0] ) 
 
         # Return Jacobian matrix.
         return jacobian_matrix
@@ -439,7 +425,7 @@
 # 'minfx'
 # 'scipy.optimize.leastsq'
 # 'scipy.optimize.fmin_cg'
-def estimate_r2eff(spin_id=None, ftol=1e-15, xtol=1e-15, maxfev=10000000, factor=100.0, method='scipy.optimize.leastsq', verbosity=1):
+def estimate_r2eff(spin_id=None, ftol=1e-15, xtol=1e-15, maxfev=10000000, factor=100.0, method='minfx', verbosity=1):
     """Estimate r2eff and errors by exponential curve fitting with scipy.optimize.leastsq.
 
     scipy.optimize.leastsq is a wrapper around MINPACK's lmdif and lmder algorithms.
@@ -782,7 +768,24 @@
     x0 = asarray( E.estimate_x0_exp() )
 
     # Set the min_algor.
-    E.set_settings_minfx(min_algor='simplex')
+    #min_algor='simplex'
+
+    # Steepest descent uses the gradient.
+    min_algor = 'Steepest descent'
+    max_iterations = 1000
+
+    # Newton does not work.
+    # min_algor = 'newton'
+
+    # Newton-CG does not work.
+    # min_algor = 'Newton-CG'
+
+
+
+    # Also not work.#
+    # min_algor = 'Fletcher-Reeves'
+
+    E.set_settings_minfx(min_algor=min_algor, max_iterations=max_iterations)
 
     # Define function to minimise for minfx.
     if match('^[Ss]implex$', E.min_algor):
@@ -791,7 +794,7 @@
         dfunc = None
         d2func = None
     else:
-        func = E.func_exp_chi2
+        func = E.func_exp
         dfunc = E.func_exp_grad
         d2func = E.func_exp_hess
 



