LSC-Virgo Burst Analysis Working Group


Minutes of 2008-Apr-17 S5 QPipeline Review Teleconference

Attendance

Shourov Chatterji, Jonah Kanner, Dave Reitze

Minutes by Dave Reitze.

Announcements

Discussion


1)  How does the review committee deal with the 2nd year analysis?

Several issues may change:  Clustering, Omega pipeline (including scan over
sky grid), Virgo, Significance remapping....

Shourov: plans for the second year.  Moving forward rapidly.  Things will
change - remapping/clustering, HL consistency follow-up, and, from WaveBurst
(WB), use of time-based thresholds and an extended Q range.  The question is
what is possible by the September meeting - the expected deadline.

Dave: how much does not having these affect the yr 1 analysis?
Shourov: time-based thresholds and applying the HL consistency tests (to
clean up the L1 data) would be the priorities, along with getting the Omega
pipeline (= X + Q pipeline) online.  There has been good progress on that.

Once the re-run of yr 1 is behind us, the focus shifts to getting the yr 2
upgrades in place.

Dave: How much does Virgo complicate things?

S: The T-F coincidence test done with H and L could also be done with Virgo.
Adding a third site makes a big difference - it solves the inverse problem
and creates a new null stream.  Virgo is competitive at high frequencies but
not at low frequencies.  There is a separate 'high frequency burst search'
going on.

Dave:  Code development to take place?

S: a lot is taking place in the Omega pipeline.

Dave: To be clear, is there going to be just an Omega pipeline run on the
yr 2 data?

S: the old Q-pipeline and the Omega pipeline are running in parallel; the
Q-pipeline is a backup.

Jonah: would it make sense to run the Q-pipeline and then use Omega as a
follow-up?  It would save time.

S: definitely a possibility.

Jonah: good to be talking about this.  Should we have a new review
committee?

S: Dave will most likely be leaving, so would need to have a new reviewer.


Dave: Different enough that I think it's a new review.  Q-pipeline will
change and Omega pipeline is different enough, needs code review, etc...

Dave: September is the date for opening the box on the 2nd year search?

Jonah and Shourov: that's the plan.

Does this review continue, is a "new" review begun (possibly with some of
the same committee members), other scenarios....

2) Do we understand the issues that had come up with the whitening filter?

S: Partial documentation is available - will point you to it.  One piece of
analysis that needs to be corrected is how the coherent energy connects to
whitening: the coherent energy can go negative.  Calculating it correctly
will be done for Omega.

Do we understand how those issues were dealt with?  Do we understand how
H1H2 coherent/null streams are actually computed?

S: A significant null stream event vetoes an area in the t-f plane; the
bigger the null stream event, the larger the vetoed area.
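To make the veto concrete, here is a minimal sketch, assuming co-located H1/H2 detectors so the null stream is simply the difference of the two strain series, and assuming a hypothetical mapping from null-stream energy to the size of the vetoed time window (function names and constants are illustrative, not the pipeline's):

```python
import numpy as np

def null_stream(h1, h2):
    """For co-located detectors the null stream is the difference of the
    (calibrated) strain series; a real GW signal cancels in it."""
    return np.asarray(h1) - np.asarray(h2)

def veto_halfwidth(ns_energy, base=0.1, scale=0.05):
    """Hypothetical mapping: a more significant null-stream event
    vetoes a wider region of the time-frequency plane."""
    return base + scale * np.log1p(ns_energy)

def is_vetoed(trigger_time, ns_time, ns_energy):
    """Veto a candidate trigger that falls inside the vetoed window."""
    return abs(trigger_time - ns_time) <= veto_halfwidth(ns_energy)
```

A weak null-stream event here vetoes only a narrow window around its time, while a loud one excludes a wider region, matching the qualitative behavior described above.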

J: Concerned mostly about documentation - wants to see things explicitly
written out.

3)  Do we understand the "inflation factor" for null stream veto?



4)  Were there any bugs when opening the box?  Right segments, DQ flags,
network, non-clustered triggers, lag ?

S: Thinks the segments are all correct.  Laura produced the same segment
list as S did using different scripts, except for a few discrepancies in the
DQ1 flags - some DQ1 flags weren't included since they weren't fully
assigned, in particular the 'H1H2 loss lock' flag.  cWB had the same
problem.  99.999% sure that this will have no impact.
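An independent re-generation of the segment list, as Laura did, can be checked with a simple diff of the two lists.  A minimal sketch, assuming segments are [start, stop) GPS intervals in integer seconds (the real comparison scripts are not shown in these minutes):

```python
def segments_to_set(seglist):
    """Expand [start, stop) GPS segments into a set of integer seconds
    so two independently produced lists can be diffed directly."""
    covered = set()
    for start, stop in seglist:
        covered.update(range(start, stop))
    return covered

def compare_segment_lists(a, b):
    """Return the seconds present in one list but not the other."""
    sa, sb = segments_to_set(a), segments_to_set(b)
    return sorted(sa - sb), sorted(sb - sa)
```

Any discrepancy - such as a second removed by a DQ flag in one list but not the other - shows up directly in the two returned difference lists.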

Network - no issues

Non-clustered triggers - no effort was made to remove redundant triggers;
this could over-count triggers.

J: when you do 0-lag production, are they the most significant triggers on
a 1 second time scale?

S: all triggers come out.

J: So what is the meaning of a 1 second time scale?  To count events?

S: Yes.
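A minimal sketch of the counting logic just described, assuming triggers are (time, significance) pairs and that only the most significant trigger per 1-second window is counted as an event (the actual pipeline code is not shown in these minutes):

```python
def count_events(triggers, window=1.0):
    """triggers: list of (time, significance) pairs.  Keep only the most
    significant trigger in each `window`-second bin, as a way of counting
    events rather than raw (possibly redundant) triggers."""
    best = {}
    for t, sig in triggers:
        b = int(t // window)  # index of the time bin containing t
        if b not in best or sig > best[b][1]:
            best[b] = (t, sig)
    return [best[b] for b in sorted(best)]
```

All triggers still come out of the pipeline; this step only affects how events are counted, as stated above.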

J: OK, the logic makes sense.  The motivation for asking: there had to be
scripts to open the box; have they been reviewed?

S: Historically, not done in the burst group.

J: Surely, some of the post-processing code has been reviewed, right?

S: No....  It can be done using black-box validation to check the output.

D: This should be done.  Why was it not done in the past?  How much
time/manpower is needed?

S: Roughly a week's time.  It was not done in the past because of limited
resources and a focus that was always on the search algorithms.

J: Should at least do sanity checks.

S: need to think about what tests, but yes should be done.
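A black-box validation of this kind can be as simple as running a post-processing step on a small fixed input and comparing its output against an independently computed answer.  A hypothetical sketch (the real post-processing scripts are not shown here; the trigger-counting step is only an example):

```python
def validate_black_box(process, test_input, expected, tol=0.0):
    """Run an unreviewed post-processing step on a known input and
    check its output against an independently computed expectation."""
    result = process(test_input)
    ok = abs(result - expected) <= tol
    return ok, result

def count_above(snrs, thresh=8.0):
    """Example step to validate: count triggers above an SNR threshold."""
    return sum(1 for s in snrs if s > thresh)

# Hand count on the fixed input gives 2 triggers above threshold.
ok, got = validate_black_box(count_above, [5.0, 9.5, 12.0, 7.9], 2)
```

The same pattern applies to the box-opening scripts: fix a small input, compute the expected output by hand or with independent code, and check agreement.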

5)  Is there an idea of how the H1H2 coherent/null stream analysis compares
with an H1H2 coincidence analysis?

S: We have H1 and H2 individual triggers that could be used in a
coincidence analysis, but we don't have simulated triggers.

J: Really, asking what motivated choice of H1H2 coherent/null stream
consistency vs H1H2 coincidence analysis?

S: Null stream gives better consistency - motivation is pretty clear.

6)  Is there a "physical" explanation for the funny shape of the three-IFO
cuts?

Dave: Need to look at the scatter plots!

S: The vertical cut is a threshold on the H1H2 energies.  The angle cut is
in the HL plane - with ideal white noise in all detectors, the ideal cut
would be a straight line in the H-L plane: lines of E_H + E_L = constant,
in effect contours of constant likelihood.  E_H and E_L are energies; E_H
is the coherent energy.  In reality the noise is not ideal, so the dotted
line is an empirical cut - H1H2 is cleaned up by the null stream, but L1
can't be cleaned up.  A diagonal cut can be motivated from first
principles, but at the end of the day an empirical choice is made based on
data quality.
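A toy version of the two cuts just described - a vertical threshold on the H1H2 coherent energy E_H, and a diagonal cut on E_H + E_L - might look like the following; the threshold values are purely illustrative, not those used in the search:

```python
def passes_cuts(e_h, e_l, e_h_min=10.0, e_sum_min=50.0):
    """Toy version of the two cuts described above:
    - a vertical cut: threshold on the H1H2 coherent energy E_H;
    - a diagonal cut: E_H + E_L >= const, the ideal-white-noise
      contour of constant likelihood in the H-L plane.
    Thresholds are illustrative only."""
    return e_h >= e_h_min and (e_h + e_l) >= e_sum_min
```

In practice, as noted above, the diagonal cut is placed empirically because the L1 noise cannot be cleaned up by a null stream.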

Next week - Dave is at a meeting, but should be able to call in.

Action Items

$Id: minutes_20080417.html,v 1.1 2008/04/24 17:20:06 leonor Exp $