Greetings,
I have moved on to running the full_analysis.py script for the sphere model and am encountering an error in the size of the grid search for the diffusion tensor (see the traceback and output logs below). I am using my own experimental input data (~246 residues), formatted to match the test files I have used previously. The only alteration to the script after a successful local_tm run was to specify sphere as the diffusion model. This error does not occur when I run the test files from /test_suite/shared_data/model_free/S2 0.149, so I am not sure what is going on. The reference PDB I am using is the output of the Palmer r2r1_diffusion script.
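For reference, the only edit was in the variable block at the top of the script; from memory of the 1.3 full_analysis.py header (so the exact variable name may differ slightly in your copy), it amounted to:

    # The diffusion model (was 'local_tm' for the earlier, successful run).
    DIFF_MODEL = 'sphere'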
As always, I appreciate any help or suggestions you might have to offer. Enjoy a great morning!
Best regards,
Chris
Log File (edited)
relax> pipe.create(pipe_name='sphere', pipe_type='mf')
relax> results.read(file='results', dir='local_tm/aic')
Opening the file 'local_tm/aic/results' for reading.
relax> model_free.remove_tm(spin_id=None)
relax> structure.read_pdb(file='~/relax/relax-1.3.2/relax-1.3/sample_scripts/data/1fgu_mfdiffusion.pdb', dir=None, model=None, parser='scientific')
Scientific Python PDB parser.
Loading all structures from the PDB file.
Structure('/home/broseyca/relax/relax-1.3.2/relax-1.3/sample_scripts/data/1fgu_mfdiffusion.pdb'):
Peptide chain of length 246
relax> structure.vectors(attached='H', spin_id=None, struct_index=None, verbosity=1, ave=True, unit=True)
Extracting vectors from the single structure.
Calculating the unit vectors.
The attached atom is a proton.
Extracted N-H vectors for ':187@N'.
Extracted N-H vectors for ':190@N'.
Extracted N-H vectors for ':193@N'.
Extracted N-H vectors for ':194@N'.
relax> diffusion_tensor.init(params=1e-08, time_scale=1.0, d_scale=1.0, angle_units='deg', param_types=0, spheroid_type=None, fixed=False)
relax> fix(element='all_spins', fixed=True)
relax> grid_search(lower=None, upper=None, inc=11, constraints=True, verbosity=1)
The diffusion tensor parameters together with the model-free parameters for all spins will be used.
Unconstrained grid search size: 103535780163953945786013051631026572323548694215165037063459636004169686022181124545004627721606706571L (constraints may decrease this size).
relax> state.save(state='relax_state_20081311_102400', dir_name=None, force=False, compress_type=1)
Opening the file 'relax_state_20081311_102400.bz2' for writing.
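If it helps with debugging, I believe the state saved just before the crash can be reloaded (assuming state.load accepts the same base name that state.save was given):

    relax> state.load(state='relax_state_20081311_102400')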
Traceback Log
sweet:/home/broseyca/relax/relax-1.3.2/relax-1.3%
Traceback (most recent call last):
  File "sample_scripts/full_analysis2.py", line 665, in <module>
    Main(self.relax)
  File "sample_scripts/full_analysis2.py", line 276, in __init__
    grid_search(inc=inc)
  File "/home/broseyca/relax/relax-1.3.2/relax-1.3/prompt/minimisation.py", line 156, in grid_search
    minimise.grid_search(lower=lower, upper=upper, inc=inc, constraints=constraints, verbosity=verbosity)
  File "/home/broseyca/relax/relax-1.3.2/relax-1.3/generic_fns/minimise.py", line 185, in grid_search
    grid_search(lower=lower, upper=upper, inc=inc, constraints=constraints, verbosity=verbosity)
  File "/home/broseyca/relax/relax-1.3.2/relax-1.3/specific_fns/model_free/mf_minimise.py", line 479, in grid_search
    self.minimise(min_algor='grid', lower=lower, upper=upper, inc=inc, constraints=constraints, verbosity=verbosity, sim_index=sim_index)
  File "/home/broseyca/relax/relax-1.3.2/relax-1.3/specific_fns/model_free/mf_minimise.py", line 921, in minimise
    min_options = self.grid_search_config(num_params, spin=spin, lower=lower, upper=upper, inc=inc, scaling_matrix=scaling_matrix)
  File "/home/broseyca/relax/relax-1.3.2/relax-1.3/specific_fns/model_free/mf_minimise.py", line 551, in grid_search_config
    self.test_grid_size(min_options, verbosity=verbosity)
  File "/home/broseyca/relax/relax-1.3.2/relax-1.3/specific_fns/model_free/mf_minimise.py", line 1321, in test_grid_size
    raise RelaxError, "A grid search of size " + `grid_size` + " is too large."
RelaxError: RelaxError: A grid search of size 103535780163953945786013051631026572323548694215165037063459636004169686022181124545004627721606706571L is too large.
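P.S. One detail that may help: with inc=11, the logged grid size appears to be exactly 11**97, i.e. a full grid over 97 parameters. For the sphere model I would have expected only a handful of dimensions after the fix('all_spins', fixed=True) call, yet the log above says the diffusion tensor parameters together with the model-free parameters for all spins will be used. A quick sanity check of the arithmetic (illustrative Python only, not relax code; the logged value is pasted in as `logged`):

    # Recover the grid dimensionality n from the logged size, assuming a
    # full inc**n grid with inc = 11 points per dimension.
    inc = 11
    logged = 103535780163953945786013051631026572323548694215165037063459636004169686022181124545004627721606706571
    n, size = 0, 1
    while size < logged:
        size *= inc
        n += 1
    print n, size == logged    # expect: 97 True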