Numerical Relativity and Data Analysis:
Nov 6-7, 2006 @ MIT, Boston



Mon 06 Nov 2006 10:41:30 AM CST

Mark Miller

Harald Pfeiffer gave a presentation about the effects of eccentricity in
numerical binary simulations.

John Friedman gave a brief statement about how helical Killing vector
(KV) solutions might be useful.

Patrick Sutton asked how this might be used in current numerical
simulations.  John thinks this could be used by gradually turning on
the retarded solution in these simulations.

John Friedman commented that one should look for second-order
convergence.  But to ask whether the solution is actually a solution,
one should simply look at the accuracy with which it satisfies the
Einstein equations.
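
The convergence test John describes can be sketched numerically: given the same quantity computed at resolutions h, h/2, and h/4, the observed order follows from the ratio of successive differences.  A minimal sketch (the toy quantity q below is a hypothetical stand-in with a built-in second-order error term, not output from any actual code):

```python
import math

def observed_order(q_h, q_h2, q_h4):
    """Estimate convergence order p from a quantity computed at
    resolutions h, h/2, h/4: (q_h - q_h2)/(q_h2 - q_h4) ~ 2**p."""
    ratio = (q_h - q_h2) / (q_h2 - q_h4)
    return math.log2(ratio)

# Toy stand-in: a quantity with a pure second-order error, q(h) = 1 + C*h**2.
q = lambda h: 1.0 + 0.3 * h**2
p = observed_order(q(0.1), q(0.05), q(0.025))
print(f"observed order: {p:.2f}")  # close to 2 for second-order convergence
```

For a real simulation one would apply this to a scalar diagnostic (e.g. a constraint norm or an extracted waveform sample) at three resolutions; an observed p near the scheme's design order supports, but does not by itself prove, that the right equations are being solved.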

There was a lot of discussion of Mark's suggested norm.  People were
unsure how to interpret what a particular value of the norm means.

Sergey mentioned that data analysts are also interested in the accuracy
of other properties of the spacetimes and simulations, and asked
whether anyone is looking at something like that.

Luis responded that the aggregate details are being looked at, at
least until there are more simulations.

Sergey wanted to see more broad-based information extraction.

Numerical error estimates should be made for every quantity.

Michele Zanolin mentioned that people compute numerical derivatives
of the field with respect to the initial data.
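
Such a derivative is typically approximated by finite differencing over perturbed initial data.  A minimal sketch, with a hypothetical waveform(param, t) standing in for a full evolution code (the parameter name and functional form are illustrative only):

```python
import numpy as np

def waveform(eccentricity, t):
    """Hypothetical stand-in for an evolution code: returns h(t)
    for a given initial-data parameter (here, eccentricity)."""
    return (1.0 + eccentricity) * np.sin(2 * np.pi * t)

def dh_dparam(param, t, eps=1e-5):
    """Central-difference derivative of the waveform with respect
    to one initial-data parameter."""
    return (waveform(param + eps, t) - waveform(param - eps, t)) / (2 * eps)

t = np.linspace(0.0, 1.0, 101)
dh = dh_dparam(0.1, t)  # sensitivity of h(t) to the parameter
```

In practice each perturbed evaluation is a separate (expensive) simulation, so the step size eps must balance truncation error against the simulation's own numerical error.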

Luciano reminded people of the distinction between consistency and
convergence: if you have convergence, you also have consistency.
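
Luciano's point can be illustrated with a toy discretization: for a convergent second-order scheme the residual against the continuum operator shrinks like h**2, which is what consistency requires.  A minimal sketch using the centered second-difference operator on sin(x) (purely illustrative, unrelated to any NR code):

```python
import math

def truncation_error(h):
    """Residual of the centered second-difference operator applied to
    sin(x) at x = 1, compared with the exact second derivative -sin(1)."""
    x = 1.0
    d2 = (math.sin(x + h) - 2.0 * math.sin(x) + math.sin(x - h)) / h**2
    return abs(d2 - (-math.sin(x)))

for h in (0.1, 0.05, 0.025):
    print(h, truncation_error(h))  # error drops roughly 4x per halving: O(h**2)
```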

Frans said that he thinks asking what delta h is is important, but the
question is what could be done to provide a delta h.  Mark was
thinking of a delta h due to each error source.
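
A per-source delta h budget suggests a simple aggregation: if the individual contributions are independent, they combine in quadrature.  A hypothetical sketch (the error sources and magnitudes below are illustrative, not numbers from the meeting):

```python
import math

# Hypothetical delta-h contributions from independent error sources,
# e.g. finite resolution, finite extraction radius, finite outer boundary.
delta_h = {
    "resolution": 0.010,
    "extraction_radius": 0.004,
    "outer_boundary": 0.002,
}

# Independent contributions combine in quadrature.
total = math.sqrt(sum(v**2 for v in delta_h.values()))
print(f"total delta h: {total:.4f}")
```

If the error sources are correlated, a quadrature sum underestimates the total and a plain sum gives a conservative bound instead.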

Pablo presented a short note about accuracy, comparing waveforms as a
way of evaluating the differences between them.  He had one plot with
waveforms from PSU, Uli, NASA, and UTB.  The waveforms look very
similar.
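
One simple way to quantify this kind of waveform comparison is a normalized overlap between two waveforms sampled on a common time grid.  The sketch below uses flat (white-noise) weighting and toy waveforms; a real comparison would weight by a detector noise curve and maximize over time and phase shifts:

```python
import numpy as np

def overlap(h1, h2):
    """Normalized inner product of two real waveforms sampled on a
    common grid, with flat (white-noise) weighting.  1 means identical
    up to amplitude scaling."""
    return np.dot(h1, h2) / np.sqrt(np.dot(h1, h1) * np.dot(h2, h2))

# Toy waveforms: same damped sinusoid, with a small phase offset in B.
t = np.linspace(0.0, 10.0, 1001)
h_a = np.sin(2 * np.pi * t) * np.exp(-0.1 * t)
h_b = np.sin(2 * np.pi * t + 0.05) * np.exp(-0.1 * t)

o = overlap(h_a, h_b)
print(f"overlap = {o:.4f}, mismatch = {1.0 - o:.4f}")
```

The mismatch 1 - overlap is the usual single-number summary of how distinguishable two waveforms are.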

Pablo encouraged people to keep putting their waveforms into NRwaves.