RE: using relax on supercomputer


Posted by Lora Picton on June 17, 2015 - 17:50:
I was able to get the model-free analysis running on the Cray computer with a 
bunch of help from the staff there. We basically called in the correct 
versions of python, scipy, numpy and mpi4py and also set up a virtual 
environment to run the relax program in. I'm not an expert at all in these 
things, but I believe that the virtualenv is necessary because of the 
architecture of the Cray, and might not be needed for a small cluster.

Below are the commands I used to do this, and the submission script that I 
have to use for our queue. The paths to the modules will of course be 
specific to your system.

I run the local_tm script first, then sphere, prolate, oblate and ellipsoid 
in parallel. When they are done, I run the final script. As long as the data, 
the script and the submission instructions are all in the same directory, I 
haven't had any problems.

Hope this is helpful,
Lora

Installation
#loaded version of python and libraries I want to use:

module load python/2.7.3
export PATH=/soft/python/2.7/2.7.3/python/bin/:$PATH
export LD_LIBRARY_PATH=/soft/python/2.7/2.7.3/python/lib:$LD_LIBRARY_PATH
export PYTHONPATH=${PYTHONPATH}:/soft/python/2.7/2.7.3/modules/numpy/1.7.0/lib/python2.7/site-packages
export PYTHONPATH=${PYTHONPATH}:/soft/python/2.7/2.7.3/modules/scipy/0.12.0/lib/python2.7/site-packages
export PYTHONPATH=${PYTHONPATH}:/soft/python/2.7/2.7.3/modules/mpi4py/1.3/lib/python2.7/site-packages
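
A quick optional check, not something relax itself requires, just to confirm 
that the paths above resolve to the intended library versions:

python -c "import numpy, scipy, mpi4py; print(numpy.__version__); print(scipy.__version__)"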

#install virtualenv

wget https://pypi.python.org/packages/source/v/virtualenv/virtualenv-12.1.1.tar.gz#md5=901ecbf302f5de9fdb31d843290b7217
tar -xvf virtualenv-12.1.1.tar.gz
cd virtualenv-12.1.1
python virtualenv.py myVE
virtualenv ENV
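
The commands above only create the environments and don't show one being 
activated. If activation is needed on your system, it would normally look 
something like this (using the myVE environment created above):

source myVE/bin/activate
which python    #should now point inside myVE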

#inside virtualenv dir I installed relax

wget http://download.gna.org/relax/relax-3.3.8.src.tar.bz2
tar xvjf relax-3.3.8.src.tar.bz2
cd relax-3.3.8
python -m compileall .

#after that was completed
export LD_LIBRARY_PATH=/lustre/beagle2/ams/python_ins/virtualenv-12.1.1/relax-3.3.8/lib:$LD_LIBRARY_PATH
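
At this point a quick optional check on the login node, assuming relax's 
standard --help option, is something like:

python /lustre/beagle2/ams/python_ins/virtualenv-12.1.1/relax-3.3.8/relax --help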

Submission script
#After that I tried to run it on the compute nodes via a PBS script:

#!/bin/bash
#PBS -N test
#PBS -q batch
#PBS -l walltime=00:30:00
#PBS -l mppwidth=64
#PBS -j oe

. /opt/modules/default/init/bash
module swap PrgEnv-cray PrgEnv-gnu
module load python/2.7.3
export PATH=/soft/python/2.7/2.7.3/python/bin/:$PATH
export PYTHONPATH=${PYTHONPATH}:/soft/python/2.7/2.7.3/modules/numpy/1.7.0/lib/python2.7/site-packages
export PYTHONPATH=${PYTHONPATH}:/soft/python/2.7/2.7.3/modules/scipy/0.12.0/lib/python2.7/site-packages
export PYTHONPATH=${PYTHONPATH}:/soft/python/2.7/2.7.3/modules/mpi4py/1.3/lib/python2.7/site-packages
export LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:/lustre/beagle2/ams/python_ins/virtualenv-12.1.1/relax-3.3.8/lib

cd $PBS_O_WORKDIR

aprun -n 2 python \
    /lustre/beagle2/ams/python_ins/virtualenv-12.1.1/relax-3.3.8/relax \
    --multi='mpi4py' --tee log.localtm \
    /lustre/beagle2/ams/python_ins/SR1_localtm.py
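
To get the run order described above (local_tm first, the four diffusion 
models in parallel, then final), the jobs can be chained in the queue. This 
is only a sketch with made-up script names, using standard PBS job 
dependencies:

#submit local_tm first and keep its job id
JOB1=$(qsub run_localtm.pbs)

#the four diffusion models start once local_tm has finished cleanly
JOB2=$(qsub -W depend=afterok:$JOB1 run_sphere.pbs)
JOB3=$(qsub -W depend=afterok:$JOB1 run_prolate.pbs)
JOB4=$(qsub -W depend=afterok:$JOB1 run_oblate.pbs)
JOB5=$(qsub -W depend=afterok:$JOB1 run_ellipsoid.pbs)

#the final run waits for all four diffusion models
qsub -W depend=afterok:$JOB2:$JOB3:$JOB4:$JOB5 run_final.pbs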

________________________________________
From: edward.dauvergne@xxxxxxxxx [edward.dauvergne@xxxxxxxxx] on behalf of 
Edward d'Auvergne [edward@xxxxxxxxxxxxx]
Sent: Friday, May 15, 2015 12:59 PM
To: Lora Picton
Cc: relax-users@xxxxxxx
Subject: Re: using relax on supercomputer

Hi Lora,

Please see below:

> I have two questions about doing this.
> 1. Can I use the GUI to set everything up and then instead of starting it,
> save the scripts so that I can direct them to be started in the queue, or
> do I have to modify each script with a text editor?

Yes, just save the state and then create a basic script that loads
that state and executes the auto-analysis.  You just need a
pipe.create user function call and the last line of the
dauvergne_protocol.py sample script (and an import from the top).  It
can all be done in a 3-line script, if you wish.
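
As a rough illustration only (the pipe and bundle names are placeholders, and 
the full argument list is in the dauvergne_protocol.py sample script and the 
API documentation linked below), such a script might look like:

from auto_analyses.dauvergne_protocol import dAuvergne_protocol

# state.load('my_saved_state')   # a state saved from the GUI could be loaded here
pipe.create(pipe_name='mf', pipe_type='mf', bundle='mf bundle')   # placeholder names
dAuvergne_protocol(pipe_name='mf', pipe_bundle='mf bundle', diff_model='local_tm')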


> 2. When starting the full analysis with a script UI mode (which I need to
> do to submit the job), the manual says you need 6 scripts, one for each
> diffusion model. Does this mean that I will need to submit multiple jobs to
> a queue, one for each model? If yes, would the first 5 need to be done
> before the "final" is used? Or would it be possible to direct all of them
> to occur in one large script?

This is optional for the auto-analysis; see the diff_model argument:

http://www.nmr-relax.com/api/3.3/auto_analyses.dauvergne_protocol.dAuvergne_protocol-class.html#__init__

You should use this documentation to help set up the auto-analysis in
script mode.
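
For example (again only a sketch), the same short script could be reused for 
each queue submission, changing only the diff_model value: 'local_tm', 
'sphere', 'prolate', 'oblate', 'ellipsoid', and then 'final' once the others 
have finished:

dAuvergne_protocol(pipe_name='mf', pipe_bundle='mf bundle', diff_model='sphere')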

Regards,

Edward


