LSC-Virgo Burst Analysis Working Group

Technical Documentation for the S5 Astrophysical Burst Search

Abstract

This document describes a gravitational-wave burst search being carried out on S5 data with the BlockNormal pipeline followed by a network correlation analysis. The pipeline selects triple-coincident (H1-H2-L1) trigger candidates. The network correlation analysis is performed by the CorrPower package on the candidate triggers. The results will be expressed in astrophysically motivated terms of a rate-density limit on GW burst sources within the galaxy. The search aims to detect bursts with frequency content in the range 96-1984 Hz and duration less than one second. The final trigger pipeline and network correlation thresholds will be set to yield an expected background rate of roughly 0.1 detected waveforms in the zero-lag sample per 12 months of S5 operation (about 14 x 10^6 s of triple-coincident live-time).

This search is based at Penn State University and Andrews University. Team members include Shantanu Desai, Sam Finn, Malik Rakhmanov, Amber Stuver, Tiffany Summerscales and Keith Thorne.

This document is intended to describe the analysis procedure and event selection choices in moderate detail, with links to other reference materials containing more detail.

Previous Results

An analysis of the S2 data was completed with the inaugural BlockNormal pipeline. These S2 results were initially reported at the March 2004 LSC meeting (PDF, PowerPoint), with an update at the May 2004 Burst Face-to-Face (PDF, PowerPoint).
Subsequently, an integrated search for per-IFO vetoes was added, and the 'r-Statistic' waveform consistency test was appended. This upgraded pipeline was used both diagnostically during the S4 run and to complete a search for burst signals in the S4 data. These S4 results were reported at the August 2005 LSC meeting (PDF, PowerPoint).

Reports and Presentations

S5 Analysis Web Pages main page
March 2006 Burst Group Face-to-Face PDF
June 2006 LSC Meeting (LIGO G060251-00-Z) PowerPoint PDF
August 2006 Burst Group Face-to-Face (LIGO G060391-00-Z) PDF
August 2006 LSC Meeting (LIGO G060434-00-Z) PDF
November 2006 Burst Group Face-to-Face PowerPoint PDF
November 2006 LSC Meeting (LIGO G060564-00-Z) PowerPoint PDF
July 2007 Burst Group Face-to-Face (LIGO G070494-00-Z) PowerPoint PDF
July 2007 Burst Group Face-to-Face - Review Readiness (LIGO G070493-00-Z) PowerPoint PDF
October 2007 Burst Group Face-to-Face (LIGO G070684-00-Z) PowerPoint PDF

Software Tools Used

The BlockNormal pipeline uses the BlockNormal Event Trigger Generator ('BNETG') software to generate per-IFO lists of events. This same BNETG software is used to identify any per-IFO veto events from one or more auxiliary detector channels. The pipeline then uses single-IFO event clustering and multi-IFO coincidence routines to create candidate triggers from non-vetoed events, each representing a time of coincident excess power in all three LIGO detectors. Coincidences are also formed for about 100 non-zero time lags for background rate measurements.
The candidate triggers are processed with the 'CorrPower' software package to evaluate the multiple-IFO correlation significance.
The detection sensitivity will be established by analysis of software-simulated GW burst signals. The 'BurstMDC' software package is used to generate these simulated signals.
Calibration of the candidate triggers, as well as the simulated signal generation, will rely on the 'hperasq' software package.

BlockNormal

The BlockNormal pipeline has been written and maintained by Shantanu Desai, Sam Finn, John McNabb, Amber Stuver, Tiffany Summerscales and Keith Thorne. It was briefly described in Class. Quantum Grav. 21 (2004) S1705-S1710. A more complete description is in preparation for publication. The current draft is available in PDF and PostScript formats.

The published copy of the source code is in the DASWG 'lscsoft' CVS repository. The BlockNormal pipeline, written in MATLAB, is found in the /matapps/src/searches/burst/BlockNormal subdirectory. Another way to view the repository is through the nightly 'matapps' web page which is geared to the MATLAB documentation style and allows easy traversal of calling trees.

The BlockNormal software performs a time-domain search of each IFO's GW channel data. It identifies points in the time series where the statistical nature of the data (characterized by its mean and variance) changes. These 'change-points' define the boundaries of time intervals (blocks) with different normal-distribution statistics (hence the name). Each block's mean and variance are used to calculate its relative excess power compared to the long-term average mean and variance of the data. Blocks above a (typically low) threshold on relative excess power are kept as per-IFO candidate events.
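
As a rough illustration of these block statistics (a minimal sketch only, not the reviewed BNETG code; the single-split change-point criterion and the exact form of the relative excess power used here are simplifying assumptions), a MATLAB fragment might look like:

  % Simplified sketch of BlockNormal-style block statistics (NOT the reviewed
  % BNETG implementation).  x is one whitened, band-limited data segment;
  % mu0 and var0 are the long-term mean and variance estimated elsewhere.
  function events = block_events(x, mu0, var0, peThresh, minBlock)
    N  = numel(x);
    cp = [1, find_change_point(x, minBlock), N+1];    % block boundaries
    events = [];
    for k = 1:numel(cp)-1
      idx = cp(k):cp(k+1)-1;
      xb  = x(idx);
      % relative excess power of the block (assumed definition for this sketch)
      PE  = mean((xb - mu0).^2) / var0 - 1;
      if PE > peThresh
        events(end+1,:) = [idx(1), idx(end), PE];     %#ok<AGROW> [start, stop, PE]
      end
    end
  end

  function cps = find_change_point(x, minBlock)
    % One-pass illustration: split where the gain in Gaussian log-likelihood is
    % largest; the production algorithm applies a rho^2 threshold recursively.
    N = numel(x); best = -inf; cps = [];
    for i = minBlock:N-minBlock
      gain = -i*log(var(x(1:i))) - (N-i)*log(var(x(i+1:N))) + N*log(var(x));
      if gain > best, best = gain; cps = i+1; end
    end
  end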

To provide some frequency resolution and to simplify the filters required to whiten the data, the GW channel data is separated into frequency bands which are each processed separately by the BlockNormal software. The data-conditioning is broken down into Kalman discrete-line filters, regression filters for lines tracked by auxiliary channels, and final whitening filters. This whitening normalizes the statistics of the data that is then processed with the BlockNormal algorithm. A detailed description of the BlockNormal data-conditioning filters used in previous analyses (S2-S4) is available in LIGO Technical Note T060011-00-Z.

This initial processing is followed by a 'clustering' step that applies the final (higher) relative excess power threshold, combines time-adjacent events into clusters and calculates the calibrated strain. This is still done separately for each IFO and frequency band. This step is computationally inexpensive and can be repeated as needed to refine final thresholds and re-calibrate strains. A 'veto' step then rejects all such clustered per-IFO events that overlap veto events found in one or more auxiliary channels. The un-vetoed clustered events are then passed to a 'coincidence' step that first searches in each frequency band for coincident events in all three LIGO detectors. It then combines triggers between adjacent bands that overlap in time. This produces the list of candidate triggers.

The coincidence step also considers a large number of nonzero time shifts for one site relative to the other, and finds candidate triggers for every time shift. These are used for background studies and background event rate estimates.
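
The coincidence and time-lag logic can be sketched as follows (the event-list format, lag spacing and variable names here are hypothetical; the production code works per frequency band and records the coincident triggers rather than just counting them):

  % Illustrative triple-coincidence count with an optional L1 time lag.
  % Each IFO's event list is assumed to be rows of [peakTime, PE]; windows in s.
  function nCoinc = count_coincidences(h1, h2, l1, winHH, winHL, lagL1)
    nCoinc = 0;
    for i = 1:size(h1,1)
      t1 = h1(i,1);
      % H1-H2 coincidence at the same site
      if ~any(abs(h2(:,1) - t1) <= winHH), continue; end
      % H1-L1 coincidence, with the L1 times shifted by the (possibly zero) lag
      if any(abs(l1(:,1) + lagL1 - t1) <= winHL)
        nCoinc = nCoinc + 1;
      end
    end
  end

  % Background estimate from ~100 non-zero lags, as described above:
  % lags    = 5 * (1:100);                          % assumed 5 s lag spacing
  % nBkg    = arrayfun(@(dt) count_coincidences(h1,h2,l1,0.020,0.030,dt), lags);
  % bkgRate = sum(nBkg) / (numel(lags) * liveTime); % expected zero-lag rate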

The BlockNormal pipeline is composed of shell scripts that wrap each compiled MATLAB executable to set environment variables. These are tied together as a Condor Directed-Acyclic-Graph (DAG) script and run at present on the Penn State 'Pleiades' LSC Tier 2 Grid computer.

Reviews

The BlockNormal pipeline was reviewed after the S2 analysis. This review was completed in November, 2004, and covered the existing BlockNormal data-conditioning, algorithm, clustering and coincidence codes. It was conducted by Soumya Mohanty, Soma Mukherjee and Brian O'Reilly. The BlockNormal review page prepared by the Penn State group has information, numerous links to documentation, results of validation studies, etc.

The basic BlockNormal pipeline has not changed significantly since that time. That review identified deficiencies in the data-conditioning filters. Errors in the Kalman filtering were corrected, and a Rayleigh-based tuning metric was introduced to tune and validate each stage of data-conditioning. The data-conditioning software now uses external files to define all filters. The only changes to the BlockNormal algorithm were the combination of separate mean and variance thresholds into a single relative excess power threshold (prior to S4) and a minor re-write of the change-point refinement routine (after S4) to reduce CPU time without changing the results. The clustering and coincidence stages remain the same.

Software Versions

Initial Tuning

For trigger production for tuning on Period 2 (Feb 9 - May 2, 2006) at the target ρ² threshold of 4, the appropriate software in the 'matapps' CVS was tagged with 'BlockNormal_S5_v1'. This was used to build MATLAB executables under MATLAB R14 SP3.
Tar-Ball of BlockNormal_S5_v1 source code: BlockNormal_S5_v1-src.tar.gz
Tar-Ball of BlockNormal_S5_v1 executables: BlockNormal_S5_v1-exe.tar.gz

Period S5A

For Period S5A trigger production (Nov 2005 - Apr 2006), the tag was 'BlockNormal_S5_v2'. MATLAB executables were built under MATLAB R2006a.
Tar-Ball of BlockNormal_S5_v2 source code: BlockNormal_S5_v2-src.tar.gz
Web interface to source code: main page

S5 First-Year

For S5 First-Year trigger production (Nov 2005 - Nov 2006), the tag was 'BN_2007-07-09'. MATLAB executables were built under MATLAB R2006a.
Tar-Ball of BN_2007-07-09 source code: BN_2007-07-09-src.tar.gz
Web interface to source code: main page

BlockNormal Vetoes

The BlockNormal Veto software has been written and maintained by Shantanu Desai, Sam Finn and Keith Thorne. The Veto Figure of Merit (FOM) was initially described in LIGO technical note T030181. This was refined during studies of the S3 data and reached its final implementation during the analysis of S4 data. A detailed presentation (PDF) was made at the August 2005 LSC meeting.

The veto-specific portion of the BlockNormal pipeline, written in MATLAB, is found in the /matapps/src/searches/burst/BlockNormal/Vetoes subdirectory.

The BlockNormal Veto software simply processes the data from likely auxiliary channels in the same frequency bands as used for the GW channel (LSC-DARM_ERR in S5). The same change-point threshold is used as for the GW channel. The software then looks for clustered events from the auxiliary channel that overlap events from the GW channel for that IFO. The relative excess power threshold is tuned (separately for each auxiliary channel) to maximize an Effective Veto FOM. A separate check is made to establish the veto safety of those auxiliary channels used for the final vetoes.
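
A minimal sketch of the overlap test, assuming each event is stored as a [tStart, tStop] row (the Effective Veto FOM tuning itself is not shown):

  % Sketch of the overlap veto: drop GW-channel events whose time interval
  % overlaps any auxiliary-channel veto event for the same IFO and band.
  function gwKept = apply_overlap_veto(gwEvents, auxEvents)
    keep = true(size(gwEvents,1),1);
    for i = 1:size(gwEvents,1)
      keep(i) = ~any(gwEvents(i,1) <= auxEvents(:,2) & ...
                     gwEvents(i,2) >= auxEvents(:,1));
    end
    gwKept = gwEvents(keep,:);   % un-vetoed events, passed on to coincidence
  end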

Reviews

As stated above, the basic BlockNormal pipeline has been reviewed previously. However, the software implementing the Effective Veto FOM and the application of the vetoes in the selection of candidate events prior to coincidence has not been reviewed.

An S5 veto documentation web page has been established to organize the veto documentation and describe its application to the S5 analysis.

CorrPower

The r-statistic test, originally formulated by Laura Cadonati, is run within the CorrPower program, written by L. Cadonati and Sz. Márka. It is described in Class. Quantum Grav. 21 (2004) S1695-S1703.
CorrPower is MATLAB code, residing in the MATAPPS archive at matapps/src/searches/burst/CorrPower . Documentation on how to install and run CorrPower is available in matapps/src/searches/burst/CorrPower/Documentation .

The r-statistic test is applied to short data segments flagged by an ETG as potential event candidates. It is a null-hypothesis test of the probability that the data from multiple interferometers are uncorrelated, based on the linear correlation coefficient (Pearson's r). For each interferometer pair and integration window, the code computes a "confidence" C, equal to -log10(S), where S is the probability of obtaining a given value of the normalized cross-correlation r if the data from the two detectors are uncorrelated. A time delay between interferometers is allowed, up to the light travel time between interferometer sites plus 1 ms (allowing for calibration mismatches). The final statistic is the arithmetic average over interferometer pairs of the maximum confidence obtained for each pair, maximized over time delays and integration windows.

For this analysis, three integration window lengths (20, 50, 100 ms) are used. For each window length, many windows of that length at slightly different starting times are evaluated, covering the data segment with 99% overlap. The allowed time delay between H1 and H2 is 1 ms, and 11 ms between LHO and LLO.
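
The per-pair confidence calculation can be sketched as follows. This is not the CorrPower code itself: the window bookkeeping is simplified, the variable names are illustrative, and the tail probability uses the large-N Gaussian approximation for Pearson's r, which may differ in detail from the implementation.

  % Sketch of the r-statistic confidence for one detector pair.
  % x, y: conditioned time series sampled at fs; win: integration window (s);
  % maxDelaySec: maximum allowed time delay between the two detectors (s).
  function Cmax = rstat_confidence(x, y, fs, win, maxDelaySec)
    N      = round(win * fs);
    maxLag = round(maxDelaySec * fs);
    step   = max(1, round(0.01 * N));           % ~99% overlap between windows
    Cmax   = 0;
    for start = (maxLag+1) : step : (numel(x) - N - maxLag)
      xs  = x(start:start+N-1);
      xs0 = xs(:) - mean(xs);
      for lag = -maxLag:maxLag                  % allowed time delay, in samples
        ys  = y(start+lag : start+lag+N-1);
        ys0 = ys(:) - mean(ys);
        r   = (xs0.' * ys0) / (norm(xs0) * norm(ys0));   % Pearson's r
        % two-sided tail probability for uncorrelated Gaussian data (large-N)
        S   = erfc(abs(r) * sqrt(N/2));
        Cmax = max(Cmax, -log10(S));
      end
    end
  end
  % CorrPower's final statistic (Gamma) averages such maximized confidences
  % over the three detector pairs, with the maximum also taken over the three
  % integration-window lengths.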
A Q=10 notch of the violin mode at around 350 Hz was included in the data conditioning for the test, in order to suppress false correlations due to fluctuations of the violin modes.
As this analysis uses the uncalibrated DARM_ERR channel, the CorrPower analysis is also performed on that channel. To support the S5 calibration data, a few CorrPower routines were modified to use the 'hperasq' routines. These modifications are described on a web page and stored in the Matapps CVS under matapps/src/searches/burst/BlockNormal/CorrPower_BN. The code is built at the same time as the BlockNormal executables, and shares the same tags.

Reviews

The S2 version of the r-statistic code was reviewed by Katherine Rawlins, Szabi Márka and Jolien Creighton - the report is available at this link.
Most of that MATLAB code has been re-used for CorrPower; Katherine started addressing the code components that are new to CorrPower and the status of her review is described at this web page.

Maximum Entropy

The Maximum Entropy software has been written and maintained by Tiffany Summerscales, Sam Finn and Shantanu Desai. It was developed as part of Tiffany Summerscales's doctoral dissertation. A paper has been prepared for publication (PDF) and released for LSC review. A report on the recovery of LIGO hardware injection waveforms using Maximum Entropy (PDF) was made at the August 2005 LSC Meeting.

The software, written in MATLAB, is currently in the local Penn State SVN repository. It will be published to the 'matapps' CVS repository once initial validations on the S5 pipeline are complete.

The maximum entropy technique has previously been used in radio astronomy data analysis. It is a Bayesian method that attempts to find a signal that maximizes a function combining the detector response, data and noise covariance. It involves a regularizer (equivalent to Shannon information entropy) that ensures smoothness and prevents overfitting.
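
Schematically (the notation here is illustrative; the draft paper linked above defines the exact functional), the reconstructed strain h maximizes

\[ J(h) \;=\; \alpha\,S(h) \;-\; \tfrac{1}{2}\,(d - Rh)^{T} N^{-1} (d - Rh) \]

where d is the network data, R the detector response, N the noise covariance, S(h) the entropy-like regularizer, and α sets the trade-off between fidelity to the data and smoothness of the reconstructed waveform.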

Present challenges are adapting the software for the untriggered burst search (where the source location is not initially known) and developing an SNR metric to measure the significance of the extracted waveform.

Reviews

The paper on the maximum entropy implementation is currently under LSC review. The software is under development and has not been reviewed.
An S5 coherent network documentation web page has been established to organize the coherent network documentation.

Tikhonov Regularization

The Tikhonov Regularization software has been written and maintained by Malik Rakhmanov, Shantanu Desai, Sam Finn and Keith Thorne. Malik made a presentation (PDF) at GWDAW 10, and has submitted a proceedings paper to CQG. An update (PDF) was given at the March 2006 Burst Group Face-to-Face.

The software, written in MATLAB, is currently in the local Penn State SVN repository. It will be published to the 'matapps' CVS repository once initial validations on the S5 pipeline are complete.

Tikhonov Regularization addresses a problem inherent in coherent network analyses. The standard approach to the linear inverse problem (given the data streams, find the parent gravitational-wave components) is direct minimization. This suffers from various problems (discontinuities, strong magnitude asymmetries) that all stem from the rank deficiency of the network response matrix. The application of Tikhonov regularization suppresses the variance of the estimates and gives a biased estimate of the waveform to counteract the asymmetries.
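
A minimal sketch of the regularized inversion (variable names and the choice of regulator are illustrative assumptions, not the code described above):

  % Tikhonov-regularized network inversion.  F is the network response matrix
  % mapping the two wave polarizations [hplus; hcross] to the whitened data d.
  function h = tikhonov_solve(F, d, lambda)
    % Direct minimization would use pinv(F)*d and is ill-behaved where F is
    % rank deficient; the lambda^2 term suppresses the variance of the
    % estimate at the cost of a small bias.
    h = (F' * F + lambda^2 * eye(size(F,2))) \ (F' * d);
  end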

Present challenges include the creation of regulator matrices, de-whitening filters, and development of the SNR metric to measure the significance of the extracted waveform.

Reviews

This software is under development and has not been reviewed.

hperasq Calibration Utility

To simplify access to calibration data, the 'hperasq' package returns response functions and α and β parameters based upon GPS time, IFO and GW detector channel.

The published copy of the MATLAB source code is in the DASWG 'lscsoft' CVS repository, in the /matapps/src/utilities/hperasq subdirectory.

The software reads in data from the ASCII-format calibration files provided by the Calibration Group, and supports runs S2 through S5. It has recently been updated to support the preliminary S5 V2 calibration data. It can be customized through environment variables that specify directories and non-default calibration versions. A caching system is used to reduce the need to re-read data files.
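
Schematically, and following the standard S5 calibration model rather than any hperasq-specific documentation (the exact conventions are defined by the Calibration Group), the strain is reconstructed from the uncalibrated error signal e as

\[ h(f,t) = R(f,t)\,e(f,t), \qquad R(f,t) = \frac{1 + \gamma(t)\,G_0(f)}{\alpha(t)\,C_0(f)}, \qquad \gamma(t) = \alpha(t)\,\beta(t) \]

where C_0 and G_0 are the reference sensing and open-loop-gain functions and α(t), β(t) are the slowly varying parameters returned for the requested GPS time.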

Reviews

This software has not been reviewed. Indirect validation occurred for the S4 run, when WaveBurst reconstructed the strain values of signals that had been created with the BurstMDC package.

BurstMDC Simulation Utility

To simplify the creation of large Mock Data Challenge (MDC) sets of simulated GW signals, the BurstMDC package was developed for the Burst Group's S4 data analysis.

The published copy of the MATLAB source code is in the DASWG 'lscsoft' CVS repository, in the /matapps/src/simulation/BurstMDC subdirectory. There is technical documentation available.

The software uses simple waveform specification and segment list files to generate a Condor DAG pipeline and submits it for execution on an LSC grid computer. It creates frame files for an arbitrary network of detectors (typically H1, H2, L1, GEO, TAMA and VIRGO). At the end, it compiles a master log file describing all simulated signals. It supports both one- and two-dimensional waveforms.

It has been run for both the S4 and S5 Burst Group analyses on the Penn State grid computer, as well as the CIT grid computer.

Reviews

This software is now being reviewed for S5. A validation of the antenna factors and polarization conventions was carried out during S4, and is documented on this BurstMDC Validation web page. The polarization conventions now match those in the LIGO-VIRGO test data and in the Network Simulation package.

AstroBurst - Astrophysical Interpretation of Burst Results

The results of a search for gravitational-wave bursts can be expressed in terms of sensitivity to a galactic distribution of sources. This can make our results more accessible to the astrophysical community. The inputs are the final detection rate (or upper limit), the sensitivity curves and the distribution of observation periods in sidereal time.

Sam Finn made an initial presentation at the Marcel Grossman meeting in August, 2006. This was summarized in our presentation at the August, 2006 LSC meeting.

The published copy of the MATLAB source code is in the DASWG 'lscsoft' CVS repository, in the /matapps/src/searches/burst/BlockNormal/AstroBurst subdirectory.

Reviews

This material has not been reviewed. We are completing initial validation of the model parameters at Penn State.

S5 Offline Analysis Progress

The analysis team met to lay out the list of S5 Offline Analysis tasks and to prepare an initial task schedule. A document (PDF, PostScript) detailing how the S5 analysis thresholds will be tuned for an astrophysics-based result was prepared.

Segment Lists

Segment lists have been created to identify per-IFO and triple-coincident science mode segments during S5. The Data-Quality (DQ) flags used and the segment lists are on our S5 Segment List web page. We made some initial segment lists for both the 'Epoch 1' (roughly Nov05 - Jan06) and 'Epoch 2' (roughly Feb06 - Apr06) periods during S5 for testing and tuning. We then changed to the Burst Group segment lists for S5 Period A (e-notebook) and S5 First Year (e-notebook).

As detailed on the segment list page, we made corrections to the S5 Period A lists for BlockNormal processing that limited segments to < 20,000 seconds each and added a segment break for a change in DARM_ERR_EXC_DAQ resolution. The final segment list for S5A trigger production was S5H1H2L1_anasegs_noinj_BNETG_Pass3.txt. For S5 FirstYear, similar manipulations were made, resulting in S5H1H2L1_1yrsegs_noinj_BNETG.txt

These segment lists only had minimal Category 1 Data-Quality (DQ) flags. This was sufficient for per-IFO event and triple-coincident trigger production. More stringent DQ flag categories defined by the LSC Data Quality group were applied in post-processing.

Data Conditioning Filters

For S5, significant work was done to define the data-conditioning filters used by the BlockNormal software. This work is documented on the main S5 Data-Conditioning Web Page. The data-conditioning filter software, written in MATLAB, is found in the /matapps/src/searches/burst/BlockNormal/DataCon subdirectory of the 'lscsoft' CVS repository. This work included:
  1. High-pass shaping filters that suppress the large low-frequency background below 60 Hz in the LIGO data were tuned and checked.
  2. Frequency band filters that define the frequency bands were tuned and checked. The lower noise floor in the S5 data required stronger filters to suppress aliased background from the violin mode harmonics.
  3. Regression filters that remove lines from injected signals (calibration lines, photon calibrator) using data from auxiliary channels were selected and optimized. This work confirmed the suspicion that the 60 Hz harmonics cannot be reliably removed by such procedures.
  4. Kalman filters were designed to remove the remaining strong lines (mostly 60 Hz harmonics and sidebands). During the first three months (Nov05-Feb06) these were especially bad in H1. After much work, Kalman filters for this first epoch were selected and tuned.
  5. Simple whitening filters were designed and checked.
  6. Some initial validation tests were made. These use the Rayleigh statistic metric developed during the S2 review process to identify problems (a minimal sketch of this check appears at the end of this subsection). They show that each filter stage improves the 'whiteness' of the data.
  7. The filters were measured for each 600-second playground of science mode data in each detector and stored for later use by the BlockNormal pipeline.
For the later S5 data (Feb 2006 onward), it was found that many 60 Hz harmonic and sideband filters could be removed for H1. For the Feb 2006 to May 2006 period, filters were added to L1 to suppress lines from the optical lever harmonics below 200 Hz. After May 2006, these lines were found to be changeable and drifting, and were removed from the filters.
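
As referenced in item 6 above, the whiteness checks use a Rayleigh statistic. A minimal sketch of one common form of this statistic (the ratio of the standard deviation to the mean of the periodogram power in each frequency bin, which is unity for stationary Gaussian noise; the exact form used in the validation tests may differ):

  % Rayleigh-statistic whiteness check: values near 1 in every bin indicate
  % stationary Gaussian (white, after conditioning) data.  nfft must be even.
  function [f, rayleigh] = rayleigh_whiteness(x, fs, nfft)
    w    = 0.5 * (1 - cos(2*pi*(0:nfft-1)'/nfft));   % Hann window
    nSeg = floor(numel(x)/nfft);
    P    = zeros(nfft/2+1, nSeg);
    for k = 1:nSeg
      seg    = x((k-1)*nfft+1 : k*nfft);
      X      = fft(seg(:) .* w);
      P(:,k) = abs(X(1:nfft/2+1)).^2;                % one-sided periodogram power
    end
    f        = (0:nfft/2)' * fs / nfft;
    rayleigh = std(P, 0, 2) ./ mean(P, 2);
  end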

Background Rate Tuning - May 2006

The BlockNormal pipeline has two main thresholds: the ρ² change-point threshold and the PE relative excess power threshold on events. For each ρ² threshold, the PE threshold can typically be set to achieve a desired false-alarm or background rate. Our tuning would identify several such candidate threshold pairs of [ρ², PE] that each achieved the target triple-coincidence background rate of 6.7 x 10^-6 Hz (prior to any waveform tests).

This background tuning is detailed on the S5 Background Tuning web page. This was initially done on the Epoch 1 data, but then was repeated as we refined threshold selection for Epoch 2 data. As expected, they had almost identical background rates as a function of our thresholds.

MDC Set Creation for Tuning

To determine which [ρ², PE] candidate threshold pair has the best detection rate, we wanted a set of simulated signals designed to achieve results with minimum processing. As documented on the S5 BurstMDC Generation Page, we created two sets, WNB1_BN_S5 for Epoch 1, and (so far) WNB1B_BN_S5 for Epoch 2 analysis. These sets have only one waveform (Gaussian-envelope white-noise bursts with central frequency 250 Hz, bandwidth 100 Hz and time width 10 ms) whose strains are chosen randomly from a logarithmic distribution. The Epoch 1 set uses the S5 V1 calibration data and the Epoch 2 set uses the S5 V2 calibration data. In each case, the α and β parameters are held fixed and equal to 1, as measured time-varying values were not available at the time of MDC generation. These MDC sets have been distributed to the other sites via LDR.

Detection Rate Maximization - May 2006

We then measured the detection rate on the MDC set from Epoch 2. We initially chose a range of ρ² thresholds from 2 to 5 that bracketed the ρ²=3 threshold used in our S4 analysis. We expected the detection rate to peak at a similar value, but in the Pass 1 study it instead maximized at both ρ²=4 and 5. To check that we had found an optimum and not a plateau, we needed to extend the range of ρ² above that and analyze a longer time period to improve statistics. This first required some additional background rate runs for ρ² = 6 and 7. We also re-scaled the MDC signals to better probe our sensitivity at low strain. This Pass 2 study confirmed that we had found a well-behaved optimum and had achieved an even lower h50. We could now proceed with full data set processing.

Coincidence Criteria Improvement - June-Nov 2006

We looked for improvements to our coincidence criteria that would enhance background rejections. We had not changed the coincidence timing windows or threshold method for event selection since the S2 analysis. Investigations were carried out on coincidence timing and thresholding methods. We also upgraded CorrPower for S5 DARM_ERR analysis, adding in the latest calibrations and the violin mode notch filters.

Improved Coincidence Timing

Initially a 30 ms timing window was used for all detector pairs. As detailed on this H1-H2 Coincidence Window page, initial tests on short-duration MDCs showed that the H1-H2 timing window could be reduced without a loss in MDC detection rate. However, when we analyzed an MDC set (WNB3B_BN_S5) of 100 ms duration white-noise bursts, we found that even the 30 ms window was too narrow for LHO-LLO coincidences. This led us to implement a new event timing method that weights the centroid by the variance of each time sample. This "variance-weighted centroid method" greatly reduced the timing differences between detectors for MDC events. To set the new coincidence windows, we examined the yield of MDC triggers as a function of coincidence timing window width (see the Coincidence Window Selection web page), using 10 ms white-noise bursts and Q=100 sine-Gaussians. The result was to keep 30 ms windows for LHO-LLO and drop to 20 ms for H1-H2.
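
A minimal sketch of the variance-weighted centroid (the exact weighting used in the pipeline may differ from this assumed form):

  % Event time as the centroid of the sample times, weighted by each sample's
  % contribution to the block variance about the long-term mean mu0.
  function tCent = variance_weighted_centroid(t, x, mu0)
    w     = (x(:) - mu0).^2;            % per-sample variance contribution
    tCent = sum(t(:) .* w) / sum(w);
  end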

Coincidence Power Threshold

Initially, triple-coincident triggers required the events from each IFO to exceed the same threshold on relative excess power (PE). This limits sensitivity to that of the least sensitive detector (typically H2). As documented on this Coincident Power Selection web page, we found that a metric termed "Combined Power", defined as PCMB = [PE(H1) x PE(H2) x PE(L1)]^(1/3), gave the highest MDC detection rate for a fixed background rate.
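
In code form, with hypothetical per-IFO excess power values, the selection reads:

  % Combined Power for one triple-coincident trigger (illustrative variable names)
  pCMB = (peH1 * peH2 * peL1)^(1/3);     % geometric mean of the three PE values
  keepTrigger = (pCMB > pcmbThreshold);  % replaces three identical per-IFO cuts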

CorrPower Upgrades for S5 - Sept 2006

For S5, we added in the violin mode notches used in the WaveBurst+CorrPower analysis, and added in support for the S5 calibration data so we could use it with DARM_ERR data. We found (web page) that the violin mode notches resulted in CorrPower Γ distributions that had no outliers above 5, which had been present in the S4 BlockNormal analysis.

Tuning on S5 Period 2

With ρ²=4 and PE=5 for per-IFO event production, the data from Period 2 (Feb 9 - May 2, 2006) was processed using the BlockNormal pipeline. Clustering was done at three higher PE settings of 7.9, 8.9 and the target 9.9 to check background rates. This was all done on the lower-bandwidth frequency range (96-1,024 Hz).
Period 2 per-IFO rates
Period 2 trigger rates
These results were additionally processed with the S4 CorrPower package using the same Γ and |H1/H2| Ratio cuts (>3 and <0.3 respectively) used in the S4 analysis to check for outliers:
Period 2 initial outliers
After upgrading CorrPower for S5 DARM_ERR and notch filters, the outliers were greatly reduced:
Period 2 after CorrPower upgrade
This was followed by the change to the Combined Power threshold:
Period 2 Background with PCMB threshold

Veto Tuning

Once the initial [ρ², PE] thresholds were set, the veto tuning proceeded following the steps outlined on the S5 Veto Documentation page. This largely repeats work done on the S4 data, but with auxiliary channel choices informed by glitches found by the BlockNormal S5 Online processing and by the ongoing work of the Glitch Group.
The initial veto tuning has been completed for Period 2 per-IFO events using auxiliary channel LSC-POB_I. As documented on the S5 Veto Tuning Results page, the veto performance of this channel could be tuned for all bands and IFOs. Work is now proceeding to check for veto safety of this channel.

Period S5A Analysis

An initial analysis was done on the S5 Period A data (Nov 4, 2005 - May 2, 2006). This would enable comparison with the WaveBurst+CorrPower analysis.

Low-Frequency (0.1-1 kHz) Results

The first pass used a limited-bandwidth low-frequency range (96-1,024 Hz). Results included the following: during the sensitivity measurement, we determined that an additional parameter (β) had to be added to the sensitivity curve fits to handle the change in slope from the all-sky source distribution.

Full-Bandwidth (0.1-2 kHz) Results

Having checked the detection sensitivity and selection cuts, we made another pass in which we increased the bandwidth to 96-1980 Hz (prior to CorrPower cuts). These results included the following: the background rate was 7 triggers in 100 time-lags; the signal livetime was 4,734,151 s and the total background livetime was 448,845,850 s. When we opened the "zero-lag box", there were no events that passed our selection cuts.
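
For reference, these numbers imply an expected zero-lag background well below one event (a rough Poisson scaling of the time-lag counts, using the livetimes quoted above):

  % Rough zero-lag background expectation from the S5A full-bandwidth numbers
  bkgRate   = 7 / 448845850;        % Hz: 7 triggers over the total background livetime
  nExpected = bkgRate * 4734151;    % ~0.07 events expected in the zero-lag sample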

Event Separation Cut Study

The analysis of the white-noise burst MDC set over the S5A period showed poor detection efficiency for long-duration (100 ms) bursts. A study was conducted to determine if this could be remedied easily. The results of the Event Separation Cut study showed that merging events that were only separated by < 60 ms and had a small separation compared to the event durations increased the detection sensitivity for these bursts, with little effect on background trigger rates. This change was added to the event clustering step of the pipeline for the S5 First Year analysis.
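
A minimal sketch of the merge rule (the exact duration comparison and the way the merged event's power is combined are assumptions here, not the reviewed clustering code):

  % Merge time-adjacent events whose gap is below maxGap (e.g. 0.060 s) and
  % small compared to the combined event durations.  events is sorted by time,
  % with rows of [tStart, tStop, PE].
  function merged = merge_close_events(events, maxGap, gapFrac)
    merged = events(1,:);
    for i = 2:size(events,1)
      gap  = events(i,1) - merged(end,2);
      durs = (merged(end,2)-merged(end,1)) + (events(i,2)-events(i,1));
      if gap < maxGap && gap < gapFrac * durs
        merged(end,2) = events(i,2);                  % extend the previous event
        merged(end,3) = merged(end,3) + events(i,3);  % combine power (assumed)
      else
        merged(end+1,:) = events(i,:);                %#ok<AGROW>
      end
    end
  end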

Simulated Waveforms for Astrophysical Interpretation

To simplify the comparison of sensitivities in strain to those in terms of energy released at the source, we need an appropriate waveform. A band-limited white-noise waveform with a cutoff at 2 kHz and duration 1/16 s was chosen. The BLW_BN_S5 MDC set was created to measure the detection efficiency for the astrophysical determination. Any final tuning will be done on this waveform's sensitivity.

S5 First Year Analysis

We have extended the analysis to the S5 First Year time period. So far, there are results on:

Coherent Network Analysis

These techniques are new for the S5 run and are taking some time to understand, validate and integrate into the pipeline.

Maximum Entropy

We have completed the integration of the Maximum Entropy software with the existing pipeline. Work remains to reduce the processing time of the all-sky scan used for optimization.

Astrophysical Result

To complete inputs for the astrophysical interpretation, we need to break down the observation time as a function of sidereal time. As seen on the Sidereal Time Distribution web page, this correction can be important for shorter periods, such as S5A. However, when applied to periods as long as S5 First Year, the distribution in sidereal time is essentially flat.

We have taken the background rate, sensitivity curves from the band-limited white noise (BLW_BN_S5) MDC and the sidereal time distribution to produce an exclusion curve on a source distribution modelled on the binary white-dwarf population (See web page).
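
In outline, and neglecting the sidereal-time weighting and the folding against the source-distribution model that the AstroBurst code performs, the exclusion follows the usual Poisson construction: with zero events observed, the 90% confidence upper limit on the expected number of signal events is 2.3, so the rate limit at a given strain amplitude scales as

\[ R_{90}(h) \;\lesssim\; \frac{2.3}{\epsilon(h)\,T_{\mathrm{obs}}} \]

where ε(h) is the detection efficiency measured with the BLW_BN_S5 injections and T_obs is the analyzed livetime.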

$Id: S5_astro_burst.html,v 1.4 2007/10/31 13:35:26 kathorne Exp $