IPP Progress Report for the week 2013-01-14 / 2013-01-20

Eugene Magnier

Serge Chastel

  • IPP
    • Added columns to IPP-MOPS-DEV datastore
    • Fixed a division-by-zero bug in neb-df that was triggered when a volume is not mounted (a minimal sketch of the guard appears after this list)
    • Modified nebdiskd so that it queries /export instead of /data
    • Added "Theoretical" available space on the cluster, that is, if all hosts were up in nebulous.
  • MOPS
    • Set up the MegaCam db so that BB can run the IPP on MegaCam data
    • Rsynced /data/ipp023.0/ipp/ippRefs/catdir.refcat.20120524.v0 to /export/neohq1.0/ipp_references
    • Got inputs from PV; started on the synthetic trails implementation.
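
The two nebulous items above amount to a small change in how per-volume disk space is accounted. Below is a minimal sketch of that accounting in Python, under assumed volume records (the names and record layout are hypothetical; the real neb-df and nebdiskd differ in detail): guard the percent-used division when a volume reports zero capacity because it is not mounted, and also report the "theoretical" total that counts every host, up or down.

    # Hypothetical sketch of the neb-df / nebdiskd accounting described
    # above; the record layout and names are assumptions for illustration.
    from dataclasses import dataclass

    @dataclass
    class Volume:
        host: str
        total_bytes: int   # 0 when the volume is not mounted
        avail_bytes: int
        host_up: bool

    def percent_used(vol: Volume) -> float:
        """Percent of the volume in use, guarding against the division
        by zero that an unmounted volume would otherwise cause."""
        if vol.total_bytes == 0:
            return 0.0
        return 100.0 * (vol.total_bytes - vol.avail_bytes) / vol.total_bytes

    def available_space(volumes: list[Volume]) -> tuple[int, int]:
        """Return (actual, theoretical) available bytes: actual counts
        only volumes on hosts that are up; theoretical counts all hosts."""
        actual = sum(v.avail_bytes for v in volumes if v.host_up)
        theoretical = sum(v.avail_bytes for v in volumes)
        return actual, theoretical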

Heather Flewelling

Mark Huber

  • MD.GR0 -- re-running the MD staticsky+skycal stack photometry to replace the first version from last summer with the latest improvements. MD09 and MD04 ref+deep+night are mostly finished; the remaining refstacks should finish in the following week.
  • TSS nameserver logs filled up ipp003; re-verified the setup after Gavin moved the web services to /data/ipp003.0
  • slowly moving towards a desktop OS reinstall to replace the default RH mess that came pre-installed.

Bill Sweeney

  • discovered a serious problem with the sky calibration run processing: psastro was not correctly handling stack inputs. Stacks are set to have a zero point of 25, but the code that psastro used to convert from instrumental magnitude to calibrated magnitude for the reference catalog lookup was using the nominal zero points for single-frame exposures. As a result, the selection of stars retrieved from the catalog did not match the stars selected for the astrometric fit, which caused psastro to decide that the data was bad and assign it bad quality. This did not affect any skycells in our SAS test data sets, so we were not aware of the problem. The end result is that about 1/3 of our stack data was lost before being imported into DVO and thus PSPS. Since the difference between the single-exposure zero point and 25 is largest for y band, it was the most affected: 2/3 of its skycells were rejected. DVO ingest, and thus PSPS ingest, will need to be restarted. (The magnitude conversion involved is sketched after this list.)
  • Fixed the psastro bug and started over with the skycal processing. Reran the Kepler wedge for dec > 20 degrees. For staticsky, deferred the Galactic center for now and started working on stripes of RA, starting in the east at 10 hours. Comparison of the first two plots at http://svn.pan-starrs.ifa.hawaii.edu/trac/ipp/wiki/staticsky.20120706 shows the progress over the weekend. At this rate staticsky should complete for the region with 9.5 hours < RA < 20 hours by the end of the week. This will provide many options for how we proceed with loading PSPS.
  • Prior to the above, spent some time shepherding the data near the Galactic plane through staticsky. We have a performance problem for very dense regions.
  • continued making progress on the release management tables. releasetool is now nearly ready for tracking releases and calibrations for single exposures. Should be able to finish stacks quickly.
  • two mostly quiet days as processing czar
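
To make the zero-point mismatch in the first item above concrete, here is a small worked example in Python. The numbers and names are illustrative assumptions, not the actual psastro code: the catalog lookup converts instrumental to calibrated magnitudes as m_cal = m_inst + ZP, so applying a nominal single-exposure zero point to a stack whose photometry is already scaled to ZP = 25 shifts the lookup by the difference and selects the wrong slice of reference stars.

    # Illustrative sketch of the zero-point mismatch; all values assumed.
    STACK_ZP = 25.0    # stacks are normalized to a zero point of 25

    def calibrated_mag(m_inst: float, zero_point: float) -> float:
        """Convert an instrumental magnitude to a calibrated magnitude."""
        return m_inst + zero_point

    m_inst = -7.0                                 # a star detected on a stack
    correct = calibrated_mag(m_inst, STACK_ZP)    # 18.0
    nominal_y_zp = 23.5                           # assumed single-exposure y-band ZP
    wrong = calibrated_mag(m_inst, nominal_y_zp)  # 16.5

    # The 1.5 mag offset makes the catalog return a different (brighter)
    # set of stars than those measured on the stack, the astrometric fit
    # finds no consistent match, and psastro flags the skycell as bad.
    print(f"correct={correct}  wrong={wrong}  offset={correct - wrong}")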

Chris Waters

  • Identified an issue with the scaling of warp backgrounds for stacking. This led to the realization that, due to the PATTERN.CONTINUITY correction, the background models for adjacent chips can have offsets relative to each other. This suggests that the background models need a similar continuity correction applied as well. This will ensure that the models produced by the warps are "smooth," which should in turn eliminate the scaling issues in the generation of a stack background model. (A sketch of this kind of correction follows this list.)
  • Worked with Denver on a new test set of WW diffs for trail fitting. This suggested that ~15% of the expected trails were not being fit, likely due to psfQf issues. As the expected cost of fitting all diff detections with an extended object fit should be small, I'll work on implementing this option.
  • Fixed interpolation issue that was slowing down psphot.
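
A minimal sketch of the kind of continuity correction described in the first item, assuming per-chip constant background offsets (a hypothetical simplification; the real PATTERN.CONTINUITY correction in the IPP operates on the pixel data and is more involved): measure the model disagreement along each shared chip edge, then solve a small least-squares system for one additive offset per chip, pinning the mean offset to zero so the overall background level is unchanged.

    # Hypothetical sketch: make per-chip background models continuous by
    # solving for one additive offset per chip (least-squares, zero mean).
    import numpy as np

    def continuity_offsets(edge_diffs, n_chips):
        """edge_diffs holds (a, b, d) with d = model[a] - model[b]
        measured along the shared edge of chips a and b.  Returns
        offsets o with (model[a]+o[a]) - (model[b]+o[b]) ~ 0."""
        rows, rhs = [], []
        for a, b, d in edge_diffs:
            row = np.zeros(n_chips)
            row[a], row[b] = 1.0, -1.0
            rows.append(row)
            rhs.append(-d)
        rows.append(np.ones(n_chips))   # pin mean(o) = 0
        rhs.append(0.0)
        o, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
        return o

    # Three chips in a row disagreeing by +0.6 and -0.3 ADU at the edges:
    print(continuity_offsets([(0, 1, 0.6), (1, 2, -0.3)], n_chips=3))
    # -> approximately [-0.3, 0.3, 0.0]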
