Raven is the new (late 2012) ARCCA cluster. It has 2000 Sandy Bridge cores with fast interconnect, plus a further 800 Westmere cores which are set up for serial jobs.
If you have an account, you can log on with:
ssh ravenlogin.arcca.cf.ac.uk
There is a lot of software available. You can see what there is by typing
module avail
You load a module by typing, e.g.
module load python/2.6.7
and can see what you currently have loaded with
module list
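Under the hood, `module load` mostly works by editing environment variables such as PATH. A minimal sketch of the effect (the package prefix below is hypothetical, just to illustrate the mechanism):

```shell
# Hypothetical install prefix -- stands in for whatever bin directory
# a real module would provide.
PKG_BIN=/home/spxsf2/opt/example/bin

# 'module load' effectively prepends the package's bin directory to PATH:
PATH="$PKG_BIN:$PATH"

# The first PATH entry is now the package's bin directory:
echo "$PATH" | cut -d: -f1
```

This is why `module list` and plain `echo $PATH` usually tell the same story about what is currently loaded.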
There is a set of LSC software that you need to have installed (before even thinking about installing your analysis software) so that you can run. It is detailed at this DASWG page. Working through the list:
module load gsl
to pick this up. The current version is 1.15, which is more recent than the 1.12 listed on the DASWG page.
module load git
. /home/spxsf2/opt/lscsoft/lscsoft-user-env.sh
I think that the majority of this is unnecessary. For now, I'll just source it, but if we do try installing our software as modules, then I think we should look at what's really needed here.
The LALSuite software contains much of the LSC's analysis code, including the CBC analysis. Instructions for getting it are at https://www.lsc-group.phys.uwm.edu/daswg/docs/howto/lal-install.html. The key steps are:
git clone albert.einstein@ligo-vcs.phys.uwm.edu:/usr/local/git/lalsuite.git
./00boot
mkdir build_master
cd build_master
../configure --prefix=/home/spxsf2/opt/lalsuite/master
make
make install
. /home/spxsf2/opt/lalsuite/master/etc/lscsoftsrc
I have successfully run a piece of LAL code (lalapps_tmpltbank to be precise), and it seems to have worked!
Instructions to install the LIGO Data Grid Client from source are taken from here.
wget http://www.globus.org/ftppub/gt5/5.2/5.2.0/installers/src/gt5.2.0-all-source-installer.tar.gz
tar xf gt5.2.0-all-source-installer.tar.gz
mkdir gt5.2.0-all
export GLOBUS_LOCATION=~/gt5.2.0-all/
export PATH=/bin:/usr/bin
export FLAVOUR=gcc64dbg
cd gt5.2.0-all-source-installer
./configure --prefix=$GLOBUS_LOCATION --with-flavor=$FLAVOUR
make gsi-openssh
make postinstall
. $GLOBUS_LOCATION/etc/globus-user-env.sh
The VDT Certificate Bundle can be installed using the instructions on the same page (note that the link to the archive is out of date). I also found it necessary to update the certificates:
wget http://software.grid.iu.edu/pacman/cadist/1.32/osg-certificates-1.32.tar.gz
tar xf osg-certificates-1.32.tar.gz -C $GLOBUS_LOCATION/share
globus-update-certificate-dir
Now copy your Grid certificates into the .globus folder in your home directory and make sure the permissions are correct:
chmod 600 ~/.globus/usercert.pem
chmod 400 ~/.globus/userkey.pem
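As a sanity check, the octal modes can be inspected with `stat`. This sketch uses a throwaway directory so it is safe to run anywhere; on Raven you would point it at ~/.globus instead:

```shell
# Throwaway stand-in for ~/.globus, so this can be run anywhere.
DEMO=$(mktemp -d)
touch "$DEMO/usercert.pem" "$DEMO/userkey.pem"

chmod 600 "$DEMO/usercert.pem"   # certificate: owner read/write only
chmod 400 "$DEMO/userkey.pem"    # private key: owner read only

# stat -c '%a' (GNU coreutils) prints the octal permission bits:
certmode=$(stat -c '%a' "$DEMO/usercert.pem")
keymode=$(stat -c '%a' "$DEMO/userkey.pem")
echo "cert: $certmode  key: $keymode"
```

Grid tools will refuse to use a key whose permissions are more open than this, so it is worth checking before debugging anything else.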
To source the install, you need:
. ~spxph/gt5.2.0-all/etc/globus-user-env.sh
Data is stored under /scratch/LIGO/LDR/ and can be found using, for example:
ligo_data_find --observatory L --url-type file --gps-start-time 832326736 --gps-end-time 832328926 --output L-L1_RDS_C03_L2_CACHE-832326736-2190.lcf --lal-cache --type L1_RDS_C03_L2 --match localhost --server=ldr-arcca.phys.uwm.edu
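Each line of the resulting LAL cache (.lcf) file lists observatory, frame type, GPS start time, duration, and a URL, so individual fields are easy to pull out with standard tools. The cache line below is made up for illustration:

```shell
# A made-up LAL cache line: observatory, frame type, GPS start,
# duration (seconds), and file URL.
line="L L1_RDS_C03_L2 832326736 2190 file://localhost/scratch/LIGO/LDR/L-L1_RDS_C03_L2-832326736-2190.gwf"

# Extract the GPS start time and duration with awk:
startdur=$(echo "$line" | awk '{print $3, $4}')
echo "$startdur"
```

This is handy for checking that the cache actually covers the GPS interval you asked ligo_data_find for.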
module load intel/intel
module load bullxmpi/bullxmpi-1.1.17.1
make clean && make -j 8 bam
#!/bin/bash
#PBS -q workq
#PBS -l select=8:ncpus=16:mpiprocs=16
#PBS -l place=scatter:excl
#PBS -l walltime=1:00:00
#PBS -N R6_PN_64_128
#PBS -o R6_PN_64.out
#PBS -e R6_PN_64.err
#PROJECT=PR37
pardir=/home/spxmp/MachineConfig/ARCCA/
parfile=R6_PN_64.par
bamexe=/home/spxmp/MachineConfig/bam/exe/bam
cd /scratch/spxmp
cp $pardir/$parfile .
mpirun -np 128 $bamexe ./$parfile
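Note that the `-np` argument to mpirun has to match the total rank count implied by the `#PBS -l select` line, i.e. nodes × mpiprocs. A quick check of the arithmetic for the script above:

```shell
# Resources requested in the '#PBS -l select=8:ncpus=16:mpiprocs=16' line:
nodes=8       # select=8
mpiprocs=16   # mpiprocs=16 MPI ranks per node

# Total ranks must equal the value passed to 'mpirun -np':
total=$((nodes * mpiprocs))
echo "$total"
```

The script would then be submitted with qsub, and the job watched with qstat.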