Hi, please see below.

On 1 September 2014 10:20, Troels Emtekær Linnet <tlinnet@xxxxxxxxxxxxx> wrote:
Yes. That was a seriously hard bug to find. Especially when you consider the MC simulations as the "Golden Standard". And then the "Golden Standard" is wrong... Ouch.
True! But there are bugs everywhere and you should never assume that parts of relax, or any software in general, are bug free. Never trust black boxes! This is a good lesson ;) All software has bugs, and this will not be the last one in relax.
Should we set the GUI to have exp_sim = -1? There is no guarantee that 100,000 simulations of exponential fits are better than the covariance matrix approach.
In case someone has peak intensity data that is much harder to optimise, for example if the data is extremely noisy or not exactly mono-exponential, this is not a good idea. It is better to spend time obtaining the best result than to obtain a quick result which works in most cases but is known to fail in theory. You don't want to be in that theoretical failure group and not know about it. So the user can set this themselves, but they should compare the result to Monte Carlo simulations anyway to be sure.

Note that the curvature of the optimisation space for the dispersion models is far more complicated than that of the 2-parameter exponential curves. For the dispersion models, the covariance matrix approach will not be anywhere near as good. For these models, most big names in the field would never consider the covariance matrix approach, and many people are wary of the edge-case failures of this technique. For the best results you would always pick the best technique which, by statistical theory, is by far Monte Carlo simulations.
Btw, can we check the Monte Carlo simulations for the dispersion models?
That's a great idea! This is probably also untested in the test suite. The covariance matrix approach is perfect for checking that the Monte Carlo results are reasonable. However, you do require the Jacobian matrix, which has not been derived for any dispersion model. No gradients have been derived either, though the Jacobian could be computed numerically in the test_suite/shared_data/dispersion directories using the very useful Numdifftools package (https://pypi.python.org/pypi/Numdifftools).

An even better way would be to create an error_analysis.covariance_matrix user function which, like the pipe_control.monte_carlo module, uses the specific analysis API to obtain the Jacobian and weighting matrix via one new method, calls lib.statistics.multifit_covar() to create the covariance matrix, and then calls the API again to set the errors via the already existing api.set_error() method. Then you could use the covariance matrix approach for all the dispersion models. Due to the licensing of Numdifftools, we could even bundle it with relax in the extern package and use numerical Jacobian estimation so that even the numeric dispersion models can have a covariance matrix.
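To make the idea concrete, here is a rough sketch of the chain described above: a numerical Jacobian of a back-calculated curve (hand-rolled central differences here, which is what Numdifftools would automate), a diagonal weighting matrix from the measured errors, and the standard least-squares covariance formula (J^T W J)^-1 that multifit_covar()-style functions implement. The exponential model, parameter values, and errors are invented purely for illustration, not taken from relax.

```python
import numpy as np

def exp_decay(params, times):
    """Two-parameter exponential: I(t) = I0 * exp(-R * t)."""
    i0, r = params
    return i0 * np.exp(-r * times)

def numerical_jacobian(func, params, times, h=1e-6):
    """Central-difference Jacobian of the back-calculated curve with
    respect to the parameters (Numdifftools would do this for us)."""
    jac = np.zeros((len(times), len(params)))
    for j in range(len(params)):
        step = np.zeros(len(params))
        step[j] = h * max(1.0, abs(params[j]))
        jac[:, j] = (func(params + step, times)
                     - func(params - step, times)) / (2.0 * step[j])
    return jac

# Hypothetical fitted parameters and peak intensity errors.
times = np.array([0.0, 0.1, 0.2, 0.4, 0.8])
params = np.array([1000.0, 8.0])        # I0, R
errors = np.full_like(times, 20.0)      # one sigma per intensity

jac = numerical_jacobian(exp_decay, params, times)

# Weighting matrix: diagonal of 1/sigma^2.
weights = np.diag(1.0 / errors**2)

# Covariance matrix (J^T W J)^-1; parameter errors are the sqrt of
# its diagonal, which is what api.set_error() would then store.
covar = np.linalg.inv(jac.T @ weights @ jac)
param_errors = np.sqrt(np.diag(covar))
```

The same recipe carries over to the dispersion models: only the model function and parameter vector change, which is why a single user function driven through the analysis API could serve them all.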
Where is that performed?
The specific analysis API. See the functions in the pipe_control.monte_carlo module. The API object is obtained as:

    # The specific analysis API object.
    api = return_api()

Then you can see the methods called in specific_analyses.relax_disp.api. For example, in the pipe_control.monte_carlo module:

    # Create the Monte Carlo data.
    if method == 'back_calc':
        data = api.create_mc_data(data_index)

You will therefore find the create_mc_data() method in the dispersion API module. If you search for all of the api.*() calls, you'll find all those methods in the API object (or the default with a RelaxImplementError in the API base class). It's rather simple.
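The base-class-with-default-error pattern mentioned above can be sketched in a few lines. This is a simplified stand-in, not relax's actual class hierarchy: the real return_api() looks up the API object for the current data pipe, and the real methods do far more work.

```python
class RelaxImplementError(NotImplementedError):
    """Raised when a specific analysis has not implemented an API method."""

class APIBase:
    """Base class: every API method fails loudly unless overridden."""
    def create_mc_data(self, data_index):
        raise RelaxImplementError("create_mc_data() is not implemented.")

class RelaxDispAPI(APIBase):
    """Dispersion analysis: overrides the methods it supports."""
    def create_mc_data(self, data_index):
        # Back-calculate data for Monte Carlo simulations (dummy values).
        return [1.0, 2.0, 3.0]

def return_api():
    # Simplified stand-in for the pipe-dependent API lookup.
    return RelaxDispAPI()

api = return_api()
data = api.create_mc_data(0)   # dispatched to the dispersion override
```

Calling an unimplemented method on a bare APIBase raises RelaxImplementError, which is how missing per-analysis support is surfaced.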
Do you randomize only R1rho', or do you also randomize R1?
This is performed in the pipe_control.monte_carlo.create_data() function. See if you can trace the API method calls back and find the source of this! It would be good if you could check this as well. Cheers, Edward