Installation and setup notes for onasys

Introduction

This document contains details of how onasys was set up to do the online inspiral analysis in E12 at LLO. It is not meant as a complete and comprehensive HOWTO guide, although hopefully it will evolve into one.

I started fresh on ldas-grid.ligo-la. This is an FC3 box with Condor 6.7.3. The lscsoft rpms have been installed from the FC3 yum repository by Stuart, and my environment is configured to use them.

Install LAL and LALApps

Checked out and installed lal and lalapps in the usual way. I installed LAL in my home directory with

cd projects
cvs -d :pserver:duncan@gravity.phys.uwm.edu:2402/usr/local/cvs/lscsoft co lal
cd lal
./configure --prefix=${HOME} --disable-shared
make
make install
and then added
LAL_LOCATION=${HOME}
export LAL_LOCATION
. ${LAL_LOCATION}/etc/lal-user-env.sh
to my .bash_profile, then logged out and back in. LALApps was configured to use Condor with
cd ~/projects
cvs -d :pserver:duncan@gravity.phys.uwm.edu:2402/usr/local/cvs/lscsoft co lalapps
cd lalapps
./configure --prefix=${HOME} --enable-condor
make
make install
so the LALApps programs are installed in ${HOME}/bin.
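
As a quick check, in the same spirit as the glue check below, the pipeline script used later in these notes should now resolve to the new install:

[dbrown@ldas-grid.ligo-la ~]$ which lalapps_inspiral_online_pipe
~/bin/lalapps_inspiral_online_pipe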

Install Glue

Installed glue in my home directory with

cd ~/projects
cvs -d :pserver:duncan@gravity.phys.uwm.edu:2402/usr/local/cvs/lscsoft co glue
cd glue
python setup.py install --home=${HOME}
Added
export GLUE_LOCATION=${HOME}
if [ -f ${GLUE_LOCATION}/etc/glue-user-env.sh ] ; then
  source ${GLUE_LOCATION}/etc/glue-user-env.sh
fi
to my .bash_profile, then logged out and back in. I am now picking up my own LSCdataFind and LSCsegFind from glue:
[dbrown@ldas-grid.ligo-la ~]$ which LSCdataFind
~/bin/LSCdataFind
[dbrown@ldas-grid.ligo-la ~]$ which LSCsegFind
~/bin/LSCsegFind

Install Onasys

Checked onasys out of CVS into ~/projects/lsware and installed into ~/projects/onasys with the commands

cd ~/projects
cvs -d :pserver:duncan@gravity.phys.uwm.edu:2402/usr/local/cvs/lscsoft co lsware/onasys
cd lsware/onasys
./00boot.sh
./configure --prefix=${HOME}/projects/onasys
make
make install
Created a directory for the BOSS configuration file
mkdir ~/projects/onasys/share/boss
cp ~kipp/BossConfig.clad ~/projects/onasys/share/boss
The clad file contains the location of the BOSS databases (on Brian's machine at UWM). Edited this file to change the BOSS_TOP_DIR variable to
BOSS_TOP_DIR = "/data2/dbrown/projects/onasys";
although this path was also incorrect in Kipp's file and BOSS worked anyway, so I guess it doesn't matter what it is set to. Added the following lines to my .bash_profile to pick up onasys and boss
BOSSDIR=${HOME}/projects/onasys/share/boss
PYTHONPATH=${HOME}/projects/onasys/lib/python2.3/site-packages:${PYTHONPATH}
PATH=${PATH}:${HOME}/projects/onasys/bin
export BOSSDIR PYTHONPATH PATH
Note: the exact PYTHONPATH depends on the installed version of python, but this is 2.3 at both sites.
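
If a different python version were installed, a line like the following (a convenience sketch, not part of the original setup) would derive the path automatically:

# ask python itself for its major.minor version number
PYVER=`python -c 'import sys; print "%d.%d" % sys.version_info[:2]'`
PYTHONPATH=${HOME}/projects/onasys/lib/python${PYVER}/site-packages:${PYTHONPATH}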

I then checked that boss was working with the command

[dbrown@ldas-grid.ligo-la ~]$ boss showSchedulers
condor
dagman
dagmandev
fork
which shows it's successfully talking to the database at UWM. boss_q also returns reasonable output.

Set up the Online Inspiral DAG

The online inspiral DAG is created by lalapps_inspiral_online_pipe and configured by online.ini.

Because of the problem with glob() not working in the standard universe, I need the following helper script, which is run as a post script by the mkcalfac job in the DAG:

ln -s ~/projects/lalapps/src/inspiral/mkcatalog.sh
This script takes one argument, the path to the reference calibration frame; it creates a symlink to this file in the pwd and then builds a LAL calibration cache containing all frames in the pwd (which will be the reference frame just symlinked and the factor frame created by the mkcalfac job).
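
The copy in lalapps CVS is authoritative; the following is only a minimal sketch of the idea, assuming the usual SITE-TYPE-GPSSTART-DURATION.gwf frame naming and a made-up cache file name:

#!/bin/sh
# sketch of mkcatalog.sh: symlink the reference calibration frame into
# the pwd, then write a LAL-format frame cache of every frame found here
ln -sf $1 .
rm -f calibration.cache
for f in *.gwf ; do
  # frame files are named SITE-TYPE-GPSSTART-DURATION.gwf
  base=`basename $f .gwf`
  site=`echo $base | cut -d- -f1`
  type=`echo $base | cut -d- -f2`
  start=`echo $base | cut -d- -f3`
  dur=`echo $base | cut -d- -f4`
  echo "$site $type $start $dur file://localhost`pwd`/$f" >> calibration.cache
done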

I created a directory to run the search in and copied the ini file to this directory

mkdir -p ~/projects/iul/E12/L1
cd ~/projects/iul/E12/L1
cp ~/share/lalapps/online.ini .
This is online.ini 1.8 from CVS which is set up for an E12 L1 run using the calibration data from Gaby.

Linked the DAG generation script from the lalapps CVS into the L1 run directory

ln -s ~/projects/lalapps/src/inspiral/mkdag.sh
This script is the interface between onasys and lalapps_inspiral_online_pipe. It turns the start and stop times passed to it by onasys into a segwizard-format segments file (called segment.txt) and then runs the pipe script to generate the DAG, which onasys runs. Note that this script has the condor log path hard-wired to /usr1/dbrown/E12/L1, which will need to be changed per site/ifo.
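
Again the copy in CVS is authoritative; schematically it does something like the following sketch (the lalapps_inspiral_online_pipe option names here are assumptions):

#!/bin/sh
# sketch of mkdag.sh: onasys passes a GPS start and stop time
START=$1
STOP=$2
# write a one-segment segwizard file: number, start, stop, duration
echo "1 $START $STOP `expr $STOP - $START`" > segment.txt
# run the pipe script to build the DAG; the option names are assumed,
# and the condor log path is hard-wired per site/ifo as noted above
lalapps_inspiral_online_pipe --config-file online.ini --log-path /usr1/dbrown/E12/L1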

Made a directory under /cluster to dump the results so they are visible to the DMT:

mkdir -p /cluster/inspiral/E12/L1

Configure Onasys for Online Running

The onasysd.ini file is configured as follows:

[onasysd]
update_interval = 3000
hold_off = 300
segfind_server = ldas-gridmon.ligo-la.caltech.edu
segfind_type = E12L1
datafind_observatory = L
datafind_type = R
dryrun = False

[pipeline]
dag_generator = mkdag.sh
dag_base_name = inspiral_E12L1
min_segment_length = 2048
overlap = 128
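
Assuming the dryrun flag behaves as its name suggests (I have not verified this), setting dryrun = True in the [onasysd] section is a sensible first test, so onasysd will report what it would do without actually submitting anything.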

Launch the Jobs

Don't forget to initialize a grid-proxy at the site, so LSCdataFind can continue to run after you log out (1000 hours should be long enough for E12):

[dbrown@ldas-grid.ligo-la ~]$ unset X509_USER_PROXY 
[dbrown@ldas-grid.ligo-la ~]$ grid-proxy-init -hours 1000
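
The remaining lifetime of the proxy can be checked at any time with

[dbrown@ldas-grid.ligo-la ~]$ grid-proxy-info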

Start onasys with

[dbrown@ldas-grid.ligo-la ~]$ cd projects/iul/E12/L1
[dbrown@ldas-grid.ligo-la L1]$ nohup onasysd &> nohup.out < /dev/null &
and it seems to work!
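
Since all of the daemon's output is redirected to nohup.out, its progress can be followed with

[dbrown@ldas-grid.ligo-la L1]$ tail -f nohup.out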

InspiralMon

InspiralMon at LLO is run on delaronde and at LHO on sand.
