The need for speeding things up in dispersion analysis




Posted by Troels Emtekær Linnet on March 27, 2014 - 17:43:
Dear Edward,

I am working on a systematic investigation of dynamic parameters for
hundreds of datasets.

For one example, a CPMG analysis is set up with:
17 variations of tau_cpmg
50 Monte Carlo (MC) simulations
82 spins, all of which are clustered

Only the TSMFK01 model is used, and there is no repeated grid searching.
I do one grid search at the start, minimise, copy the parameters over
and take the median, run a clustered analysis, and then repeat the last
step 60 times.
This again needs to be repeated 5-8 times for other datasets with
variations, and then for other proteins. (Sigh..)
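
In relax script form, the protocol looks roughly like this (a schematic
sketch only, with the data loading omitted; the pipe name, spin ID range
and the parameter/median copying step are placeholders, not exact user
function calls):

# Schematic sketch of the protocol above, run inside relax.
# The parameter copy/median step is only indicated as a comment.
pipe.create(pipe_name='TSMFK01', pipe_type='relax_disp')

# One grid search at the start, then minimise.
grid_search(inc=11)
minimise(min_algor='simplex', func_tol=OPT_FUNC_TOL,
         max_iter=OPT_MAX_ITERATIONS, constraints=True, scaling=True,
         verbosity=1)

# Copy the fitted parameters over and take the median (placeholder).

# Cluster all 82 spins, then repeat the clustered minimisation 60 times.
relax_disp.cluster(cluster_id='global', spin_id=':1-82')
for i in range(60):
    minimise(min_algor='simplex', func_tol=OPT_FUNC_TOL,
             max_iter=OPT_MAX_ITERATIONS, constraints=True, scaling=True,
             verbosity=1)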

I have set up relax to use 20 processors on our server, and a
dispersion analysis takes between 2 and 6 hours.
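
For reference, the multi-processor mode is launched via OpenMPI along
these lines (the relax path and the exact processor count, including
whether the master process is counted, are site-specific details):

mpirun -np 20 /usr/local/bin/relax --multi='mpi4py'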

That is a reasonable timeframe for a normal analysis of this type.

But I have to squeeze hundreds of these analyses through relax to get
the variation of the dynamic parameters.

Our old Igor Pro scripts could do a global fit in 10 minutes, though
that does not include MC simulations.

But I wonder if I could speed relax up by changing the function
tolerance and the maximum number of iterations:
minimise(min_algor='simplex', line_search=None, hessian_mod=None,
         hessian_type=None, func_tol=OPT_FUNC_TOL, grad_tol=None,
         max_iter=OPT_MAX_ITERATIONS, constraints=True, scaling=True,
         verbosity=1)

where the standard values are:
OPT_FUNC_TOL = 1e-25
OPT_MAX_ITERATIONS = 10000000
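
If that is sensible, I would loosen them along these lines (a sketch
only; I am assuming these really are module-level constants in
auto_analyses/relax_disp.py that can be overridden before the protocol
runs):

# Sketch: loosen the optimisation settings before the auto-analysis
# runs. Assumes OPT_FUNC_TOL and OPT_MAX_ITERATIONS live in
# auto_analyses.relax_disp and are read at run time.
from auto_analyses import relax_disp
relax_disp.OPT_FUNC_TOL = 1e-10         # much looser than the 1e-25 default
relax_disp.OPT_MAX_ITERATIONS = 100000  # far fewer than 10000000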

Could you advise whether this strategy is feasible?

What I hope for is that an analysis comes down to 10-20 minutes.
Maybe I could also cut away the MC simulations, since I am mostly
interested in the fitted dynamic parameters and not so much in their
errors?
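
For example (a sketch; I am assuming the mc_sim_num argument of the
Relax_disp auto-analysis class controls the simulation count, as in the
sample dispersion scripts):

# Sketch: run the dispersion auto-analysis with a minimal number of MC
# simulations, accepting that the parameter errors will be meaningless.
from auto_analyses.relax_disp import Relax_disp
Relax_disp(pipe_name='disp', pipe_bundle='disp bundle',
           results_dir='results', models=['TSMFK01'], grid_inc=11,
           mc_sim_num=3)  # 3 instead of 50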

Thank you in advance!

Best
Troels




