> > Oh, and one last thought. It would be nice to automatically scale
> > the tensor size for the size of the protein, so we don't have to
> > do trial-and-error adjustments of the scaling factor. The current
> > fixed default scaling factor means larger proteins will have
> > smaller tensors and vice-versa. This is something I could have a
> > go at implementing if others think it's a good idea?
>
> The reason I have used a scaling factor is twofold. Firstly, for
> comparing two states or systems, you need to have exactly the same
> scaling in both analyses. Secondly, the scaling factor should
> probably be given with the figure. The value is important - it is
> the diffusion rate per Angstrom within the figure. I should add
> this to the user function docstring.
What I had in mind was a default auto-scaling that can be overridden as required, i.e.
    def pdb.create_tensor_pdb(run, file, scaling):
        if scaling == None:
            scaling = autoScale()
        ...
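As a rough sketch of this override pattern, the following runnable Python keeps the current fixed default (1.8e6 s^-1 per Angstrom) available, falls back to auto-scaling only when no value is passed, and reports the chosen scale. The names `create_tensor_pdb`, `auto_scale`, `max_radius` and `target_radius` are illustrative placeholders, not the actual relax API, and the heuristic inside `auto_scale` is an assumption:

```python
DEFAULT_SCALE = 1.8e6  # current fixed default: diffusion rate per Angstrom (s^-1)

def auto_scale(max_radius, target_radius=30.0):
    """Hypothetical heuristic: scale so the tensor spans roughly the
    same size on screen regardless of protein size."""
    return DEFAULT_SCALE * max_radius / target_radius

def create_tensor_pdb(run, file, scaling=None, max_radius=30.0):
    """Sketch only - the real user function would write the PDB file."""
    if scaling is None:
        # No explicit value given: fall back to auto-scaling, and report
        # the value so the figure can still be labelled correctly.
        scaling = auto_scale(max_radius)
        print("Auto-scaling: %.3e s^-1 per Angstrom" % scaling)
    return scaling
```

An explicit `scaling` argument always wins, so quantitative comparisons between two analyses remain possible by passing the same value to both.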
In our experience of using this sort of functionality with tensors, its most common use is a quick look to check that the tensor looks reasonable, rather than quantitative comparison. Given this, a default auto-scaling would give the most commonly desired behaviour with minimal effort, yet could easily be overridden when quantitation is required.
How about accepting the scale argument value 'auto'? I prefer it not to be the default so that the user knows they will get a diffusion rate of 1.8e6 s^-1 per Angstrom. And I would recommend that all figures of the diffusion tensor be labelled with the diffusion rate per Angstrom. Hence the auto-scaling should report the scale value in an obvious way.
An important question is how you would define 'auto'. A number of algorithms would be required. Would you loop over all atoms of all residues and pick the atom furthest from the centre of mass? How would you handle multi-domain systems? If there are multiple tensors, then they will all have to be scaled by the same amount. And what about complexes?
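One of the algorithms raised above - furthest atom from the centre of mass, with a single shared scale across all tensors - could be sketched as below. This is only an illustration of the idea, not relax code; the unweighted centroid, the `base_scale` of 1.8e6 and the `target_radius` are all assumptions:

```python
import math

def centre_of_mass(coords):
    """Unweighted centroid of a list of (x, y, z) coordinates (Angstrom).
    A real implementation would mass-weight the average."""
    n = float(len(coords))
    return tuple(sum(c[i] for c in coords) / n for i in range(3))

def max_radius(coords):
    """Distance from the centroid to the furthest atom."""
    com = centre_of_mass(coords)
    return max(math.sqrt(sum((c[i] - com[i]) ** 2 for i in range(3)))
               for c in coords)

def shared_auto_scale(domain_coords, base_scale=1.8e6, target_radius=30.0):
    """One scale for every tensor: take the largest radius over all
    domains so a multi-domain system or complex is scaled uniformly."""
    radius = max(max_radius(coords) for coords in domain_coords)
    return base_scale * radius / target_radius
```

Using the largest domain's radius keeps all tensors comparable within one figure, at the cost of small domains rendering with small tensors - which is arguably the honest behaviour.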
Edward