This MD04 stack is currently available at

http://datastore.ipp.ifa.hawaii.edu/ps1-md/

data_group: MD04.refstack.20100610

Creation of the MD04 Reference Stacks -- 2010.06.10


This page documents the process of creating the first set of deep MD04 reference stacks.

I generated a set of reference stacks for MD04 from the data taken in late 2009 / early 2010. For this analysis, I first generated a reference photometry and astrometry database, and then used that database to reprocess the images that went into the stack. The expectation is that the improved astrometry should result in better registration of the images and thus a better stack, while the improved photometry should allow for a deep stack which is correctly calibrated relative to the rest of the PS1 system.

Data Available for MD04

PS1 has obtained 1204 exposures of MD04. Of these exposures, 70 were from 2008 and were ignored for this analysis. This left a total of 1134 exposures which could potentially be used, all taken between 2009.11.26 and 2010.05.05. For the basic reference stacks, as discussed below, I only included images taken in photometric conditions with seeing better than a nominal limit (the specific per-filter cuts are given under "Processing the Images" below).

Generation of the reference database

Selection of the input images

To build the reference database, I selected a subset of the exposures which had already been processed and for which magic had been applied. It turns out that a number of the images (those from before Demo Month) have not been individually magicked -- only the nightly stacks have been distributed to date. In addition, we never created a y-band reference stack, so no y-band images have been processed through diff. These two restrictions resulted in a total of 295 images available for the reference photometry database in griz. The SQL used to select the images is given below. I made a separate y-band-only database, accepting images which had not been magicked (a sketch of the corresponding selection follows the SQL below).

-- most recent camera-stage processing of each magicked MD04 exposure
select exp_name, dateobs, exp_id, max(cam_id), filter, camProcessedExp.path_base
from camRun
join camProcessedExp using (cam_id)
join chipRun using (chip_id)
join rawExp using (exp_id)
where comment like 'MD04%'
and camRun.magicked > 0
group by exp_id
order by dateobs
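
The y-band-only database used the same selection with the magic requirement dropped and the exposures restricted to y band. Schematically, it would look something like the following (the exact filter string stored in the database is an assumption):

-- y-band variant: no magic requirement, y-band exposures only
select exp_name, dateobs, exp_id, max(cam_id), filter, camProcessedExp.path_base
from camRun
join camProcessedExp using (cam_id)
join chipRun using (chip_id)
join rawExp using (exp_id)
where comment like 'MD04%'
and filter like 'y%'
group by exp_id
order by dateobs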

Ingesting the images

The selected images were ingested into two dvo databases (one for griz and a second for y-band). Below, I give the addstar commands used for the ingest. In this case, the 'NOMINAL' zero point was applied; this means that the on-the-fly calibrations were ignored and the expected zero points were used.

foreach file (`cat smffiles.list`)
  # resolve the Nebulous name to a real filesystem path
  set realname = `neb-locate -p $file`
  addstar -D PHOTCODE_FILE dvo.photcode.grizyJHK -D ZERO_POINT_OPTION NOMINAL -D SKY_DEPTH 4 -D CAMERA gpc1 -D CATDIR $catdir -update $realname -use-name $file
end

After running addstar on all of the input images, as well as the addstar 'resort' step needed to update the table indexes, I ingested the 2MASS data for these regions so that any astrometry analysis performed with this database could constrain images near the edge of the field. I then ran basic averaging of the astrometric and photometric properties (this last step is somewhat redundant with the relphot and relastro analysis below and could be skipped). The commands for this analysis are given below (note the restriction to the region of interest around MD04).

# RA and Dec limits (degrees) of the region around MD04
set region = "-region 147.0 152.0 0.0 4.2"
# re-sort the database and rebuild the table indexes
addstar -D CATDIR $catdir -resort $region
# ingest 2MASS for the regions already loaded in the database
load2mass -v -D CATDIR $catdir -existing-regions $region
# basic weighted-mean photometric averages
relphot -v -D GRID_TOOFEW 10 -D MOSAICNAME GPC1 -D CATDIR $catdir -averages -update -reset -statmode WT_MEAN $region
# basic astrometric averages
relastro -v -D GRID_TOOFEW 10 -D MOSAICNAME GPC1 -D CATDIR $catdir -update-objects -update $region

Relative Photometry and Astrometry

After the databases were created, I ran the full-scale relative astrometry and photometry analysis on them. The relative photometry analysis currently runs separately for each target average photcode (grizy). The analysis determines the average magnitudes of the stars in the images, based on the available measurements, then uses these averages to determine the relative offsets of the image zero points. It stabilizes the system of equations by resetting the mean offset of the clusters of photometric images to 0.0. The program performs a requested number of iterations on the photometry. As the analysis proceeds, the code attempts to identify outlier detections for individual stars, then outlier stars (variable stars) in the sample, and finally outlier images -- those with excessively large scatter (i.e., poor photometric conditions). As these outlier / poor entries are discovered, they are excluded from the constraints on the average magnitudes. (A schematic form of the iteration is sketched after the commands below.)

I ran 20 iterations of relative photometry, using the commands listed below. Note that only the high signal-to-noise measurements were used to determine the relative photometry (SIGMA_LIM), and that the zero points were calculated for the full GPC1 exposures, with the individual chips held fixed relative to one another (-imfreeze -mosaic). The -statmode option specifies how the averaging is performed; in this case, the analysis performs a weighted mean of the inner 50% of the available measurements.

foreach filter (g r i z y)
  relphot $region -v -D GRID_TOOFEW 10 -D MOSAICNAME GPC1 -D SIGMA_LIM 0.025 -D CATDIR $catdir -update -statmode INNER_WTMEAN -nloop 20 -imfreeze -mosaic $filter
end
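
Schematically, each relphot iteration alternates between two averaging steps (a sketch only; the actual weighting, the INNER_WTMEAN clipping, and the outlier rejection in relphot differ in detail):

\[
\bar{m}_j = \frac{\sum_i w_{ij}\,(m_{ij} - Z_i)}{\sum_i w_{ij}},
\qquad
Z_i = \frac{\sum_j w_{ij}\,(m_{ij} - \bar{m}_j)}{\sum_j w_{ij}}
\]

where m_{ij} is the calibrated measurement of star j on image i, w_{ij} the corresponding weight, \bar{m}_j the current mean magnitude of the star, and Z_i the zero-point offset of the image. With -imfreeze -mosaic, a single Z_i is solved per GPC1 exposure, with the chip-to-chip offsets held fixed.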

The relative astrometry analysis currently performs only one stage at a time: either the average positions of objects in the database are recalculated based on the current image calibration parameters, or the image calibrations are re-measured based on the average positions of the objects in the database. Like the relative photometry analysis, this process requires a series of iterations to converge on a good solution; in this example, I ran 4 iterations. Currently, relastro does not perform very robust outlier rejection. The commands used are listed below (this pair was repeated 4 times in the actual analysis; a loop form is sketched after the commands):

relastro $region -v -D GRID_TOOFEW 10 -D MOSAICNAME GPC1 -D CATDIR $catdir -update-objects -update
relastro $region -v -D GRID_TOOFEW 10 -D MOSAICNAME GPC1 -D CATDIR $catdir -update-chips -update
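
Wrapped as a csh loop, with the 4 iterations made explicit (equivalent to repeating the pair of commands above):

foreach iter (1 2 3 4)
  # recompute the mean object positions from the current image calibrations
  relastro $region -v -D GRID_TOOFEW 10 -D MOSAICNAME GPC1 -D CATDIR $catdir -update-objects -update
  # refit the chip calibrations against the updated mean positions
  relastro $region -v -D GRID_TOOFEW 10 -D MOSAICNAME GPC1 -D CATDIR $catdir -update-chips -update
end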

Photometry and Astrometry Accuracy

(show some figures here of the residuals)

High-Quality Database

Once the relative photometry and astrometry were calculated, I created subset high-quality photometry databases for both griz and y-band. These databases are restricted to only the stars with many repeated, high-accuracy measurements. By creating subset databases like this, the eventual on-the-fly calibrations using these databases will not be slowed by needing to load large numbers of sources which do not contribute much to the quality of the calibration. The following 'photdbc' commands extract the high-quality data. Note that the griz database, with 4x as many visits, has a higher requirement on the minimum number of measurements for a high-quality source than the y-band database.

photdbc $region -D CATDIR catdir.v1 -D PHOTDBC_JOIN_RADIUS 1.0 -D AVE_SIGMA_LIM 0.025 -D NMEAS_MIN 40 catdir.ref.griz
photdbc $region -D CATDIR catdir.yband -D PHOTDBC_JOIN_RADIUS 1.0 -D AVE_SIGMA_LIM 0.025 -D NMEAS_MIN 10 catdir.ref.yband

Finally, I merged the griz and y-band databases at this point:

# start from a copy of the griz database
rsync -auv catdir.ref.griz/ catdir.ref/
# fold the y-band database into the merged reference database
dvomerge catdir.ref.yband into catdir.ref

Photometric Calibration

The relative photometry analysis does not tie the photometry to any specific system. The overall zero point is a free parameter which we can independently set. I measured the zero point for this field relative to SDSS for all bands, including y-band as discussed below. To do this, I created a copy of the reference database and then ingested SDSS tsObj files for the regions into the database. I only included a fraction of the SDSS fields, but enough to cover most of the area. The commands to perform the ingest are:

foreach f (md04.sdss/tsObj-*.fit)
  # ingest each SDSS tsObj catalog using the sdssmosaic photcode definitions
  addstar -D CAMERA sdssmosaic -D CATDIR $catdir -update $f
end

Next, I extracted the SDSS measurements and the average magnitudes in the equivalent average photcode to determine the zero point offsets between the initial calibration and the SDSS magnitude system. Here is the mextract command for g band (cycle the photcode value for the other filters: g,r,i,z,y = 1056,1057,1058,1059,1060).

mextract ra, dec, mag, mag:ave where (photcode == 1056)
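
The extraction can be cycled over the five average photcodes with a loop of the following form (mirroring the command above; the exact quoting needed depends on how mextract parses its arguments):

foreach code (1056 1057 1058 1059 1060)
  mextract ra, dec, mag, mag:ave where (photcode == $code)
end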

I then applied this offset to the database by editing the dvo photcode table, exported with photcode-table -export (file) -D CATDIR (catdir) and imported with photcode-table -import (file) -D CATDIR (catdir). Changing the photcode table immediately changes the interpretation of the measurement magnitudes, since the calibrated values are calculated from the instrumental magnitudes on the fly. For the average magnitudes, however, it is necessary to recalculate the averages; there is not currently a tool to update the average magnitudes and the photcode table in a single consistent way (though this would be a reasonable tool to create). It should have been possible to apply this offset to the high-quality photometry database and update the averages based on only that subset of the data. However, to be sure everything was consistent, I simply updated the full-scale databases, and then recreated the high-quality ones from the originals.
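
Concretely, the export / edit / import cycle looks like this (photcodes.dat is just a placeholder file name):

# dump the photcode table to a text file
photcode-table -export photcodes.dat -D CATDIR $catdir
# (edit photcodes.dat: add the measured SDSS offset to the zero points of the affected photcodes)
# load the edited table back into the database
photcode-table -import photcodes.dat -D CATDIR $catdir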

Processing the Images

Once the MD04 reference dvo database was available, it was possible to use it for the data processing. I copied the database to the Maui cluster (actually a copy on several of the machines, to reduce the loading), and created an alternate reduction class in which I defined a psastro recipe with the new reference database as the calibration source. The way this is currently defined in the recipe system is a bit cumbersome; we need to consider ways of making this more flexible. One particular inconvenience comes from the definition of the average photcodes: the psastro recipe needs to associate the filter being processed with a specific average photcode from the reference database. The synthetic grizy database uses average names of the form g_SYNTH, and these are listed in the default recipe for gpc1 as a MULTI block. With our current metadata parsing, it is not possible for a recipe to override the values in a MULTI block; new values simply supplement the set of existing values, leading to confusion. To make the system work for now, I changed the new reference db average photcodes to have the same form as the synthetic db.

I queued all 1134 relevant exposures of MD04 for re-processing using the new database. Some of these images, those from 3 nights in November and December 2009, have not had the current version of burntool run on them; for now, I have simply deferred these images. The rest were processed to the warp stage. To generate the stack, I selected the images with good photometry and acceptable seeing. I used limits of grizy = (6.0, 5.5, 5.0, 5.0, 5.0) pixels for the maximum accepted FWHM (major axis, PSF model version) and grizy = (24.55, 24.70, 24.55, 24.20, 23.25) as the minimum allowed zero point for the images. (Need a plot of the on-the-fly zero points here.) The on-the-fly astrometry was quite good, typically at the 15 mas level. The above choice of limits resulted in roughly 100 exposures per filter available for the stack.