IPP Progress Report for the week 2011.04.18 - 2011.04.22

(Up to IPP Progress Reports)

Eugene Magnier

I spent the week looking at minutiae of psphot to understand some of the issues Nigel and Michael have raised recently:

  • I've set the minimum Kron radius to the value appropriate for the bright stars on an image; this should address the undershoot of the Kron mags for some faint galaxies.
  • I've set a maximum Kron radius based on the window (based in turn on the isophotal footprint) -- some faint sources were getting absurd values.
  • I've addressed some errors with the PSF_QF and PSF_QF_PERFECT masking for stacks (but an error remains: the stacks are missing certain bits in the 'bad' mask bit set -- TBD).
  • I've updated the PSF I/O functions to save and load both the aperture correction model and the growth curve. I've also adjusted the photometry code to apply both corrections as appropriate to the PSF-related mags and fluxes. The growth curve needs to be applied consistently to the output of both the straight and the forced photometry, so I needed to add that to the PSF model.
  • I've updated the curve-of-growth analysis to use the bright stars (those used for the PSF model); I had been cheating by using the PSF model itself to judge the growth correction. Some of my tests show that this was producing an error of about 1% for the reference aperture normally used in the analysis (25 pixels).
  • I have NOT changed the config files, but I have concluded that we need to move from 25 pixels for the reference aperture to something larger -- probably 35 pixels for the single-image analysis. This conclusion is driven by trends I see in the zero points in the MD fields that correlate with the FWHM, and corroborated by the fact that the curve of growth still has ~1-1.5% more to go at 25 pixels, depending on the seeing. I need to run a set of tests to confirm that bumping up the reference aperture fixes the apparent zero-point trend.
  • The aperture magnitudes are measured in an interpolated image shifted so the source is centered in the circular mask; I found an error in the interpolation used for this shift. I changed the interpolation method to bilinear, which did not have the error (note that only this aperture measurement uses that particular interpolation mode, so no other measurements are affected). The error is small for relatively large apertures, but for the smallest apertures used in the curve-of-growth analysis it introduced some noise. I don't think anyone will notice the impact of this particular fix...
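The min/max clamping of the Kron radius described in the first two items can be sketched as follows. This is a minimal illustration only; the function and parameter names are hypothetical, not psphot's actual identifiers:

```python
def clamp_kron_radius(raw_kron_radius, bright_star_kron_radius, window_radius):
    """Clamp a source's Kron radius between a per-image minimum and maximum.

    The minimum is the Kron radius appropriate for the bright stars on the
    image (prevents faint galaxies from undershooting); the maximum comes
    from the measurement window, derived from the isophotal footprint
    (prevents absurdly large values for faint sources).
    """
    return max(bright_star_kron_radius, min(raw_kron_radius, window_radius))
```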

Just to be clear: there are two different corrections applied:

  • The PSF magnitude is measured from the PSF model integrated to infinity (actually, 50 sigma). This is the raw PSF magnitude.
  • The aperture magnitude is measured in a modest-sized aperture. This aperture is set based on the FWHM and is typically in the range 8 - 15 pixels. The aperture is meant to be small so that neighbor confusion is minimal, but this implies it will lose a fraction of the light of the object. This is the AP_MAG_RAW value reported in the CMF file.
  • The aperture magnitudes are corrected via the curve of growth from the measurement aperture to the reference aperture. The reference aperture is meant to be large enough to capture essentially all the light regardless of seeing (or at least to lose a fixed amount and be insensitive to seeing). 25 pixels is apparently too small to meet this goal. This is the AP_MAG value reported in the CMF file.
  • The PSF model magnitudes are corrected for the mean difference between the aperture flux and the raw PSF model. This correction is measured as a 2D map across the image. Note that this correction is measured relative to the *growth-corrected* aperture mags. Regarding the '2D map': values are interpolated to positions within the image from a coarsely gridded image; the scale of the map is set dynamically by the number of stars, so as to have ~3 stars per cell (probably too small -- but I have not yet bumped this up).
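The correction chain above can be sketched as follows. This is a simplified illustration, not the psphot implementation: the names are hypothetical, and the real code interpolates the 2D aperture-correction map rather than doing the nearest-cell lookup shown here:

```python
import numpy as np

def apply_photometry_corrections(ap_mag_raw, psf_mag_raw,
                                 growth_corr, apcor_grid, x, y, cell_size):
    """Apply the two corrections described above (simplified sketch).

    ap_mag_raw  : aperture magnitude in the small measurement aperture
    growth_corr : curve-of-growth term, measurement -> reference aperture
    apcor_grid  : coarse 2D map of (growth-corrected aperture - raw PSF)
                  magnitude offsets across the image
    """
    # Curve-of-growth correction: AP_MAG_RAW -> AP_MAG
    ap_mag = ap_mag_raw + growth_corr

    # Aperture correction for the PSF magnitude: nearest-cell lookup into
    # the coarse grid (the real code interpolates between cells)
    i = min(int(y / cell_size), apcor_grid.shape[0] - 1)
    j = min(int(x / cell_size), apcor_grid.shape[1] - 1)
    psf_mag = psf_mag_raw + apcor_grid[i, j]
    return ap_mag, psf_mag
```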

Finally, I've added a function to psphot to measure the Kron magnitudes with the other source models subtracted. With the min and max radii set as above, I think this cleans up a lot of the bad Kron mags, but I'd like feedback on that point; I've asked Nigel & Michael to re-run their tests.
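The idea of the neighbor-subtracted Kron measurement can be sketched like this. The `render_model` and `measure_kron` callables are hypothetical stand-ins for psphot's model-rendering and Kron-photometry routines, not real psphot entry points:

```python
import numpy as np

def kron_mag_deblended(image, source, neighbors, measure_kron, render_model):
    """Measure a source's Kron flux with the neighboring source models
    subtracted from a working copy of the image first."""
    cleaned = np.array(image, dtype=float, copy=True)
    for nbr in neighbors:
        # Remove each neighbor's best-fit model from the working image
        cleaned -= render_model(nbr, cleaned.shape)
    return measure_kron(cleaned, source)
```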

Serge Chastel

  • MOPS czar
  • IPP czar
  • Update of IPP-MOPS ICD: the ICD Lite wiki page now contains the ICD PDF and nothing else...
  • Documenting and fixing hemocrat
  • MOPS detection efficiency
  • Switched to bzip2 instead of gzip for MySQL dump archives of gpc1, ippadmin, and ippRequestServer. E-mails are now sent to ps-ipp-ops@…

Heather Flewelling

  • started rsync of the ThreePi db to ipp003 - it stalled out at some point because ipp003 got rebooted, and I restarted it. Not sure if it is complete yet.
  • addstar: resorted, relphot'd, and merged the 3 MD04 databases for Roy (/data/ipp005.0/gpc1/catdirs/MD04.merges/MD04.merge)
  • fixed a couple of things on ippMonitor (add*)
  • czar Monday
  • sick Wednesday and Thursday
  • state holiday Friday

Roy Henderson

Lots of progress, and a major setback, getting stacks into PSPS this week. Still lots to do before Boston.

* Finished remaining development work for stack batches:

  • questions about how to populate PSPS flux columns: Jim and Gene settled on a solution and I implemented it in code
  • numerous schema changes
  • work on GPC1 queries to list available stacks, get contributing OTAs for each
  • new temp Db table to store metadata from DVO, e.g. flags, photcode, etc.
  • redesigned ippToPsps database to keep track of batch types, processing, whether batches are loaded or merged, etc.
  • fixed issue of NaNs in FITS headers not writing to the database
  • changes to lots of SQL due to Heather's changes in the gpc1 addRun table
  • now performing clean-up in the Db before export to FITS, i.e. removing NULL objIDs, NULL fluxes, etc.
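The pre-export clean-up can be sketched as a simple filter. The field names `objID` and `flux` are illustrative; in the pipeline the equivalent clean-up is done with SQL against the staging database:

```python
def clean_before_export(rows):
    """Drop rows that would break the FITS export: NULL (None) objIDs
    or NULL fluxes are removed before writing the batch."""
    return [r for r in rows
            if r.get("objID") is not None and r.get("flux") is not None]
```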

* Successfully tested loading and merging of new stack batches. Problems encountered:

  • missing tables SkinnyObject and ObjectCalColor in stack batches. Added these to the code.
  • NULLs sneaking into the new 'updated' column in the Object table: we had forgotten to add a default. Changed the schema.
  • crazy -999 "NULL"s don't fit in a byte field and so broke the load. Fixed.
  • merge stage was expecting all contributing images for the stack to be loaded already. Sue fixed this.
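The -999 sentinel fix amounts to mapping the placeholder back to a real NULL before the value reaches a narrow integer column. A minimal sketch (the function name is hypothetical):

```python
def sanitize_sentinel(value, sentinel=-999):
    """Map the -999 'NULL' sentinel to a true NULL (None) so that narrow
    integer columns such as a byte field never see an out-of-range value."""
    return None if value == sentinel else value
```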

* DVO speed issue

  • encountered a serious DVO speed issue with the new MD04 database. Way too slow for us to load enough before Boston
  • formed plan to pull everything we need into a MySQL database for quicker, easier access
  • started work coding this

* The solution to the above DVO problem means my new code needs to produce detections, so:

  • lots of work filling-in the last missing fields
  • porting GPC1 access methods in Jython code
  • laboriously comparing the output with that of the old code to verify nothing has been lost along the way

Bill Sweeney

  • Bill spent most of this week finding a workaround to the bright star astrometry problem for the STS exposures. This involved:
    • studying the addstar implementation, getting familiar with how it works.
    • changing the code to (optionally) use different sets of stars for the initial grid search and final fitting
    • tested using the usual magnitude range for the grid search and, for the fit, stars within a magnitude range where the astrometry errors are small.
    • Ran tests of the results in dvo. Created a wiki page showing the results for our test exposures.
    • tested that the new code is not used with the usual recipes and integrated it into the production tag
    • Started reprocessing the STS.2010 data using the new code.
  • investigated a problem with queueing staticsky runs. Proposed a solution, but have not received feedback yet.
  • One day and part of the weekend as processing czar.
  • Investigated and fixed several problems with the cluster that unfortunately occurred over the weekend.
  • searched for and queued data for cleanup.
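The star-selection change for the astrometric fit can be sketched as below. This is a sketch of the idea only; the `mag` field name and the list-of-dicts representation are assumptions, not the actual addstar data structures:

```python
def select_fit_stars(stars, bright_limit, faint_limit):
    """Keep only stars within a magnitude range where the astrometric
    errors are small, for the final fit; the initial grid search can
    still use the usual, wider range."""
    return [s for s in stars if bright_limit <= s["mag"] <= faint_limit]
```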

Chris Waters

  • Reprocessing/large area processing
    • Confirmed that even when we have a smaller number of inputs, pairwise diffs still return better magic results than diffing against a quickstack.
    • Began design work on large area processing tool and scripts. This should manage all the processing for exposures with minimal manual intervention.
  • Astrometry: Processed second microtest data. Details available here: http://svn.pan-starrs.ifa.hawaii.edu/trac/ipp/wiki/GPC1_BrightStarAstrometry_SW_LH_scan
  • Started work getting Nebulous to see new large storage hosts at the ATRC.