IPP Progress Report for the week 2009.11.02 - 2009.11.06


Eugene Magnier

We had two big accomplishments this week: the first was the acceptance by the Air Force of the IPP + Magic automatic processing and data release (excluding y-band). This is extremely important as the manual inspection previously required of the Magic results was the primary block for fully automated processing with the IPP. Enabled by this acceptance, we performed the second big accomplishment: a full-scale throughput demonstration of a night's worth of data (see Throughput Demo 2009.11.04). Although there were some hiccups and issues identified during the test, in the end we were able to achieve a throughput equivalent to the processing of 600 exposures in a 15 hour period. We also smoothed out a number of rough edges in the automatic processing, and identified a few bottlenecks whose removal should improve the throughput further.

I spent much of the week working on this throughput test, though I had a bit of time to work with Chris on the difference image photometry. We have concluded that we must modify the variance used for source fitting to match the expected mean chi-square, but we are still somewhat unclear on why this is necessary. This is a poorly understood (at least by us!) effect of attempting to do chi-square fitting in the presence of correlated errors. Chris showed that the variance agrees with our expectation for per-pixel noise, and that the aperture photometry agrees with our expectations once we include the information in the covariance matrix, but we still find unexpected chi-square distributions for model fits in the faint-source regime (in the bright-source regime, we expect large chi-square values due to the difference between the PSF model and the real object shapes).
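The covariance point above can be illustrated numerically: smoothing unit-variance white noise gives a per-pixel variance that matches the naive prediction, while the variance of a multi-pixel aperture sum is much larger than a diagonal-only (covariance-free) estimate would suggest. A minimal one-dimensional sketch, with an illustrative kernel rather than anything from the IPP:

```python
import numpy as np

rng = np.random.default_rng(42)

# White noise with unit variance, smoothed with a normalized kernel to
# introduce the kind of pixel-to-pixel correlation a warped or convolved
# image has.  Kernel and sizes are illustrative only.
n = 200_000
white = rng.standard_normal(n)
kernel = np.array([0.25, 0.5, 0.25])
smooth = np.convolve(white, kernel, mode="valid")

# Per-pixel (marginal) variance agrees with the analytic expectation:
# Var = sum(kernel**2) * 1.0 = 0.375
per_pixel_var = smooth.var()

# But the variance of a 5-pixel "aperture" sum is much larger than
# 5 * per-pixel variance, because the positive off-diagonal covariance
# terms are ignored by the naive estimate.
ap = np.convolve(smooth, np.ones(5), mode="valid")
naive = 5 * per_pixel_var        # what a diagonal-only chi-square assumes
actual = ap.var()                # includes the covariance terms

print(f"per-pixel var  {per_pixel_var:.3f} (expect 0.375)")
print(f"aperture var   {actual:.3f} vs naive {naive:.3f}")
```

Here the diagonal-only estimate understates the aperture variance by more than a factor of two, which is the same direction of error that skews chi-square values computed from per-pixel variances alone.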

I've also made some progress on the astrometry catastrophes which occur in certain observations. I examined the particular case of M42, where the cross-correlation analysis is fooled by the clustering of sources in the bright nebulosity. We have code which removes extremely dense clumps from the input detection lists for this reason, but it was necessary to iterate over this process and also to reject the dense clumps from the linear fitting stage of the analysis. I believe that this particular issue, along with some related outlier-rejection problems, is responsible for many of the catastrophic astrometry failures we have seen, and I will continue to pursue this next week.
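The clump-rejection step can be sketched as an iterative cut on grid-cell source density; the function name, bin count, and threshold below are illustrative assumptions, not the actual IPP implementation:

```python
import numpy as np

def reject_dense_clumps(x, y, nbins=32, thresh=5.0, max_iter=3):
    """Iteratively drop detections in grid cells whose source count is far
    above the typical cell, the way bright nebulosity (e.g. M42) floods a
    region with clustered detections.  Parameters are illustrative."""
    keep = np.ones(len(x), dtype=bool)
    for _ in range(max_iter):
        counts, xe, ye = np.histogram2d(x[keep], y[keep], bins=nbins)
        typical = np.median(counts[counts > 0])
        bad = counts > thresh * typical          # cells far denser than typical
        if not bad.any():
            break
        ix = np.clip(np.digitize(x, xe) - 1, 0, nbins - 1)
        iy = np.clip(np.digitize(y, ye) - 1, 0, nbins - 1)
        keep &= ~bad[ix, iy]                     # drop detections in bad cells
    return keep
```

Applied to a uniform background plus an artificial dense clump, the cut removes most of the clump while leaving the background detections essentially intact; iterating matters because the densest cells can shield moderately dense neighbors on the first pass.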

Heather Flewelling

  • investigating more failure modes of chip processing and related stages
  • rerunning chips, warps, etc. that failed previously
  • added ipp019 into nebulous
  • edited ~ipp/stdscience/input file to add back in nodes for processing
  • watched the throughput test. I wrote a useful script to 'watch' a label for faults (instead of using ippMonitor); it prints the faults at every stage along with the URI (to get to the log), and counts the number of items in 'new' (a good way to see how much is left to do). I need to make it more useful (support for more labels).
  • out sick Friday
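A watcher of the kind described could be structured roughly as below; the stage and state names and the stand-in rows are illustrative, and the real script would fill `rows` from a database query rather than a literal list:

```python
from collections import Counter

def summarize(rows):
    """rows: (stage, state, uri) tuples for one label.  Returns the
    faulted items (with their log URIs) and the per-stage count of items
    still in 'new'."""
    faults = [(stage, uri) for stage, state, uri in rows if state == "fault"]
    new = Counter(stage for stage, state, _ in rows if state == "new")
    return faults, new

# Stand-in rows; in the real script these come from the processing DB.
rows = [
    ("chip", "fault", "neb://ipp019/chip/o1234.log"),
    ("warp", "new", None),
    ("warp", "new", None),
    ("diff", "full", None),
]
faults, new = summarize(rows)
for stage, uri in faults:
    print(f"FAULT {stage}: {uri}")
for stage, count in new.items():
    print(f"{stage}: {count} new (left to do)")
```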

Bill Giebink

  • Jury duty Monday
  • Spent a small amount of time on realtest.
  • Did further study and preparation for starting DB replication.
  • Got Josh's arclog.pl script running to collect Areca RAID card logs from nodes.
  • Started parsing of logs to isolate read errors.
  • RMA'd two boxes of WD drives.
  • Started looking into buildtest and psLib TAP test failures.
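The read-error isolation step might look something like the following; the log line format here is a guess at the general shape ("timestamp, device, event"), not the Areca controller's exact output:

```python
import re

# Match lines reporting a read error: "<date> <time> <device> Reading Error".
# The exact field layout of the real Areca logs may differ.
READ_ERROR = re.compile(r"^(?P<time>\S+ \S+)\s+(?P<dev>.+?)\s+Reading Error")

def read_errors(lines):
    """Return (timestamp, device) for every read-error line."""
    out = []
    for line in lines:
        m = READ_ERROR.match(line)
        if m:
            out.append((m.group("time"), m.group("dev")))
    return out

log = [
    "2009-11-03 14:02:11 IDE Channel #3 Reading Error",
    "2009-11-03 14:05:40 IDE Channel #3 Reading Error",
    "2009-11-03 15:00:02 Raid Set #00 Rebuild Raid Set",
]
for when, dev in read_errors(log):
    print(when, dev)
```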

Paul Price

  • Dual convolution (branches/pap)
    • Merged into trunk (r25999, r26000)
    • Ran on 4 diffs from ThreePi_SouthernRegion with DUAL=T, SPATIAL.ORDER=0. Some strange results for a small subset of skycells; otherwise good.
    • Some bad subtractions, clearly identifiable by chi2 > 100 (one reached 25453.6!); cannot be fixed merely by increasing ITER.
    • Softening errors produces nice behaviour --- fixed!
    • Some sources show up in the subtraction that should be masked --- fixed.
    • Happy with dual convolution again, pending testing on a few exposures.
  • Stacking (branches/pap): investigating rejection behaviour
    • Completed rework of combinePixels
    • On each rejection pass, throw out only the most variant pixel; changed the iterations parameter to mean iterations per input.
    • Happy with behaviour of stacking (r26011):
      • 4 inputs from stack_id = 14006 (the Draper stack)
      • stack_id = 25532 (full Draper stack from more recent warps)
    • Would benefit from threading the creation of the fake images
  • Moved TEMP.DIR from ppStack recipe to site configuration; updated production version
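The rejection rule described above (discard only the single most variant input per pass, with iterations counted per input) can be sketched as follows; the median/MAD centering and the 3-sigma threshold are illustrative choices, not necessarily what combinePixels uses:

```python
import numpy as np

def combine(values, nsigma=3.0, iterations=1):
    """Combine one pixel's stack of input values, rejecting at most one
    input per pass.  A sketch, not ppStack's actual combinePixels."""
    vals = np.asarray(values, dtype=float)
    mask = np.ones(len(vals), dtype=bool)
    for _ in range(iterations * len(vals)):     # iterations *per input*
        kept = vals[mask]
        if len(kept) < 3:
            break
        center = np.median(kept)
        sigma = 1.4826 * np.median(np.abs(kept - center))  # MAD -> sigma
        dev = np.abs(vals - center)
        dev[~mask] = -1.0                        # already-rejected inputs
        worst = int(np.argmax(dev))
        if sigma == 0 or dev[worst] <= nsigma * sigma:
            break                                # nothing discrepant; stop
        mask[worst] = False                      # reject only the most variant
    return vals[mask].mean(), mask
```

Using the median and MAD for the center and scale keeps a single gross outlier from inflating the scatter estimate and thereby shielding itself from rejection.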

Bill Sweeney

  • Modified the distribution tasks to spread the bundles over multiple hosts to improve performance.
  • Split up distribution bundle processing script into two pieces so that it can be used by the postage stamp server for 'get_image' requests.
  • Worked with Gene analyzing faults and fixing problems uncovered in the throughput tests (some of these problems were introduced by the previous two tasks; sorry!).
  • Fixed a couple of postage stamp bugs reported by Ken Smith.
  • Spent some time researching technologies for building the real web interface to the postage stamp server.
  • Ran simtest and fixed some bugs found there.
  • Looked at the results of the background restore program. Found that compression is not appropriate for the background model file and turned it off in the file rules.
  • Marked as fixed the Trac tickets assigned to bill that had already been resolved.

Chris Waters

  • Artifacts: Looked at the list of "other" artifacts sent around by Armin, with the breakdown being:
         Burntool artifact from very bright star:   10
         Burntool oversubtraction:                   5
         Burntool unremoved:                         6
         Oversubtraction near very bright star:      6
         Unmasked donut:                             2
  • chi2/quoted errors in photometry: Wrote code to generate simulated images and gather statistics on the chi2 and magnitude errors from the PSF fitting. Wrote code to measure the scatter in the background noise as a function of smoothing size. Considered how smoothing decreases this scatter, but have not yet been able to get the math to correct for it accurately.
  • Short week due to vacation time.
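The smoothing-versus-scatter relation has a simple limiting form for pure white noise: smoothing unit-variance noise with a normalized kernel K reduces the per-pixel scatter to sqrt(sum(K**2)). A sketch of that check, with illustrative kernel widths:

```python
import numpy as np

rng = np.random.default_rng(1)
noise = rng.standard_normal((512, 512))

def smoothed_scatter(noise, half):
    """Smooth unit-variance white noise with a separable Gaussian of
    width `half` pixels; return (measured, predicted) per-pixel scatter."""
    x = np.arange(-3 * half, 3 * half + 1)
    k1 = np.exp(-0.5 * (x / half) ** 2)
    k1 /= k1.sum()                      # normalized 1-D Gaussian kernel
    # Separable convolution: rows, then columns.
    sm = np.apply_along_axis(lambda r: np.convolve(r, k1, mode="same"), 1, noise)
    sm = np.apply_along_axis(lambda c: np.convolve(c, k1, mode="same"), 0, sm)
    pad = 3 * half                      # trim edges where the kernel is truncated
    measured = sm[pad:-pad, pad:-pad].std()
    # For a separable kernel, sum(K**2) over 2-D = (sum(k1**2))**2,
    # so the predicted scatter is simply sum(k1**2).
    predicted = (k1 ** 2).sum()
    return measured, predicted

for half in (1, 2, 4):
    m, p = smoothed_scatter(noise, half)
    print(f"sigma={half}px: measured {m:.4f}, predicted {p:.4f}")
```

For pure white noise the measured and predicted scatters agree closely; the difficulty in the real images is presumably that the input noise is not white to begin with.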