
Sloan Digital Sky Survey
Review of Observing Systems and Survey Operations

Science Requirements and Commissioning
Jim Gunn and Michael Strauss
April 13, 2000

I. SDSS SCIENCE REQUIREMENTS

 In order to confirm that the SDSS hardware and software can realize the science goals of the project, as laid out in the Principles of Operation, it is necessary to rephrase these goals as quantifiable scientific requirements on the systems, against which specific tests can be carried out. In outline, these requirements are as follows:

A. Image Quality and Throughput: the PSF of the Imaging Camera as a Function of the Free-air Seeing and Position on the Focal Plane.

In brief, the requirements are designed to yield a PSF FWHM of 1.1" at the edges of the focal plane in the median free-air seeing of the site (0.8"), and somewhat better on-axis. The requirement for acceptable imaging data is a worst-case PSF FWHM of 1.2 arcseconds. In addition, the throughput and camera read noise should be such as to yield the following magnitude limits for 5-sigma detections of stellar objects:

u'=22.3, g'=23.3, r'=23.1, i'=22.3, z'=20.8
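
As a rough illustration of what such 5-sigma point-source limits mean, the sketch below estimates a limiting magnitude from an assumed zeropoint, sky brightness, seeing, pixel scale, read noise, and exposure time. All of the numerical inputs are placeholders chosen only to be of the right order of magnitude; they are not the actual SDSS system parameters.

# Illustrative sketch (not the SDSS photometric model): estimate a 5-sigma
# point-source limiting magnitude from throughput zeropoint, sky brightness,
# seeing, and read noise.  All numbers below are placeholder assumptions.
import math

def limiting_mag(zeropoint, sky_mag_per_arcsec2, fwhm_arcsec, pixel_scale,
                 read_noise_e, exptime_s, snr=5.0):
    """Solve S/N = S / sqrt(S + B) = snr for the source counts S."""
    # Crude effective aperture: a circle of radius ~ one FWHM.
    npix = math.pi * (fwhm_arcsec / pixel_scale) ** 2
    # Sky counts per pixel over the exposure (zeropoint = mag giving 1 e-/s).
    sky_e_per_pix = (10 ** (-0.4 * (sky_mag_per_arcsec2 - zeropoint))
                     * pixel_scale ** 2 * exptime_s)
    background = npix * (sky_e_per_pix + read_noise_e ** 2)
    # S/N = S/sqrt(S+B) = snr  =>  S^2 - snr^2*S - snr^2*B = 0
    source = 0.5 * (snr ** 2 + math.sqrt(snr ** 4 + 4 * snr ** 2 * background))
    return zeropoint - 2.5 * math.log10(source / exptime_s)

# Placeholder inputs: r'-like zeropoint, dark sky, 1.1" seeing, 0.4"/pixel,
# 5 e- read noise, ~54 s exposure (roughly the drift-scan dwell time per band).
print(round(limiting_mag(26.5, 21.0, 1.1, 0.4, 5.0, 54.0), 1))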

B. Photometric Accuracy:

 Our requirements state a photometric calibration accuracy for bright stars of 0.02 mag in r', 0.02 mag in the g'-r' and r'-i' colors, and 0.03 mag in u'-g' and i'-z'. We have developed an 11-item error budget for this, which includes the uncertainties from the primary standard-star system, the photometric solution for each night, the transformation from the Photometric Telescope (PT) to the 2.5m system, any nonlinearities in either system, and non-uniformities due to the varying PSF across the 2.5m imaging camera.
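
As a minimal sketch of how such an error budget might be checked against the 0.02 mag requirement, the snippet below adds a few independent error terms in quadrature. The item names and values are illustrative placeholders, not the actual 11-item budget.

# Illustrative sketch: combine independent calibration error terms in
# quadrature and compare with the 0.02 mag requirement in r'.  The item
# names and values are placeholders, not the actual SDSS error budget.
import math

error_budget_mag = {
    "primary standard-star system": 0.010,
    "nightly photometric solution": 0.008,
    "PT-to-2.5m transformation":    0.007,
    "system nonlinearity":          0.005,
    "PSF variation across camera":  0.010,
}

total = math.sqrt(sum(e ** 2 for e in error_budget_mag.values()))
print(f"quadrature total = {total:.3f} mag (requirement: 0.020 mag in r')")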

C. Astrometric Accuracy:

 These are driven primarily by the need to drill holes for spectroscopic targets; the requirement is 180 milli-arcsecond (mas) rms errors per coordinate. Astrometric science goals have motivated us to set an enhanced requirement of 100 mas.
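
For scale, the sketch below converts these angular requirements into approximate physical tolerances on the drilled hole positions, assuming a round-number focal-plane scale of about 60 microns per arcsecond for the 2.5m telescope. That scale is an assumption made here purely for illustration, not a figure taken from the requirements.

# Rough conversion of the astrometric requirements into physical tolerances
# on the drilled fiber-hole positions.  The plate scale below is an assumed
# round number for the 2.5m focal plane, used only to illustrate the scale.
PLATE_SCALE_UM_PER_ARCSEC = 60.0   # assumed, not an official value

for name, mas in [("survey requirement", 180.0), ("enhanced requirement", 100.0)]:
    microns = mas / 1000.0 * PLATE_SCALE_UM_PER_ARCSEC
    print(f"{name}: {mas:.0f} mas ~ {microns:.0f} microns on the plate")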

D. Spectrographs:

 These include requirements on throughput, resolution, signal-to-noise ratio, and redshift accuracy and reproducibility.

E. Target Selection:

 These put requirements on the uniformity, reproducibility, and completeness of our three main categories of spectroscopic targets: the main galaxy sample, the Bright Red Galaxy sample, and the quasars. These requirements are driven by the large-scale structure science goals of the SDSS, and the desire to use the fibers as efficiently as possible.

F. Operations:

These requirements are motivated by the need for efficient operations on the mountain, and by the need for data-reduction turn-around fast enough to keep up with the data flow and to flag any data that do not meet survey specifications.

 

II. SCIENCE COMMISSIONING

 Science commissioning refers to the process whereby we exercise the system in all its aspects, to confirm that it is able to meet the science requirements of the project. The data that we have gathered in these commissioning activities have proven to be tremendously valuable scientifically; although few of these data meet our strict survey quality requirements, we have used them to discover the largest sample yet of high-redshift quasars and the coolest substellar objects known, to map the distribution of RR Lyrae stars in the halo of our Galaxy, and to map the distribution of dark matter around other galaxies via gravitational lensing.

Science commissioning consists of obtaining data with the SDSS telescopes and instruments as they are tuned, analyzing the data with our software, and using the results to diagnose problems and to check against the scientific requirements. The data gathered thus far include more than 1000 square degrees of imaging data, together with calibration patches from the Photometric Telescope, and 30 spectroscopic plates of 640 spectra each. In brief, the status of the requirements is as follows:

A. Image Quality and Throughput:

 We have seen images as good as 0.75" FWHM on the focal plane, but unfortunately the uniformity of the point spread function across the focal plane has been substantially worse than required. The seeing at the site is also not as good on average as we had hoped or expected. We know at this point that we have some trouble with the telescope optics, and may have trouble with locally induced seeing degradation, perhaps accompanied by worse-than-expected thermal distortion of the mirrors. The bottom line is that we are having difficulty meeting the 1.2 arcsecond specification on worst-case image quality across the array; we have seen it, but the conditions that produce it are not frequent enough for us to proceed. We are considering backing off to 1.5 arcseconds worst case, at least temporarily. We think that this has essentially no effect on target selection for the spectroscopic sample, but this must be verified.
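
As a toy illustration of how these effects can combine, the sketch below adds free-air seeing and assumed telescope/local contributions in quadrature. The quadrature model and the individual contribution values are illustrative assumptions, not the actual SDSS image-quality budget.

# Toy image-quality budget: combine free-air seeing with assumed telescope
# and dome/local contributions in quadrature (a common approximation, not
# the actual SDSS optics model).  Contribution values are illustrative.
import math

def delivered_fwhm(free_air, *other_terms):
    return math.sqrt(free_air ** 2 + sum(t ** 2 for t in other_terms))

# Median free-air seeing of 0.8" with modest telescope + local terms:
print(round(delivered_fwhm(0.8, 0.6, 0.4), 2))   # ~1.08", within the 1.2" spec
# The same telescope/local terms in 1.1" free-air seeing:
print(round(delivered_fwhm(1.1, 0.6, 0.4), 2))   # ~1.32", outside 1.2", inside 1.5"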

Preliminary measurements indicate that the throughput of the telescope and camera is better than 80% of the design values in all chips, and the read noise in all chips is acceptably low.

B. Photometric Accuracy:

The stringent 2 percent accuracy in photometric calibration will probably only be reached at the end of the survey, when we have five further years of observations of our photometric standards. At this writing, the photometric standard star system has been defined to an accuracy of 0.015 mag in all bands except u', where the internal accuracy is somewhat worse (0.03 mag). Extending this accuracy to the patches used to calibrate the camera data has proven difficult, due primarily to flat-fielding problems on the Photometric Telescope, which have been our largest source of systematic error. We believe now that we have all the (dismayingly many) problems with the flats solved, but we must check carefully. Another source of systematic error is the variation in the width of the PSF across single chips in the 2.5m imaging camera; uncorrected, this can cause systematic errors in stellar photometry of up to 10%, but the Photometric Pipeline now corrects for this effect to the 1-2% level.
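
The sketch below illustrates, for an assumed circular Gaussian PSF and a fixed measuring aperture, how a change in PSF width across a chip translates into a systematic magnitude offset if left uncorrected. The Gaussian model, aperture size, and widths are illustrative assumptions, not the Photometric Pipeline's actual correction.

# Toy illustration of why PSF-width variation across a chip biases stellar
# photometry: the fraction of a circular-Gaussian star's flux falling in a
# fixed aperture depends on the local PSF width.  All numbers are illustrative.
import math

def flux_fraction(aperture_radius_arcsec, fwhm_arcsec):
    sigma = fwhm_arcsec / 2.3548          # FWHM -> Gaussian sigma
    return 1.0 - math.exp(-aperture_radius_arcsec ** 2 / (2.0 * sigma ** 2))

aperture = 1.5                            # fixed aperture radius, arcsec
f_center = flux_fraction(aperture, 1.1)   # PSF width at one end of the chip
f_edge   = flux_fraction(aperture, 1.4)   # broader PSF at the other end
delta_mag = -2.5 * math.log10(f_edge / f_center)
print(f"systematic offset ~ {delta_mag:.3f} mag if the variation is ignored")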

C. Astrometric Accuracy:

 Both scans with a parked telescope on the Celestial Equator and powered scans off the Equator show rms residuals of the astrometric solutions of better than 150 mas in each coordinate, comfortably within the requirements. Most of the remaining systematic errors are due to rapid fluctuations in the atmosphere (so-called "anomalous refraction"). We are investigating ways to model these fluctuations better, and therefore to reduce these residuals. Recent work, comparing our own astrometric data from two different runs and comparing our astrometry with that of the 2MASS infrared survey (which claims 100 mas per-coordinate astrometric errors), shows that we do somewhat better on the sky than the astrometric solutions indicate, and that we are already at the 100 mas error level. We are still working to improve the situation further, and have several promising avenues.

D. Spectrographs:
 
The spectrographs perform extremely well. The measured throughput of the atmosphere, telescope, fibers, and spectrographs peaks at 25 percent in the r' band and 18 percent in the g' band, which meets specifications. The resolution of the spectrographs also meets design specifications, and the wavelength coverage of 3850-9200 A exceeds specifications. Wavelength calibration is at the level of 0.08 A rms deviation for sky lines, and the overall zero-point velocity error is on the order of 10 km/s (exceeding the requirements). Work is on-going on the spectroscopic pipeline that reduces these data, but it is approaching the goals of correctly classifying and measuring redshifts for 95 percent of targeted galaxies and quasars (currently 93% of galaxies and 80% of QSOs are assigned high-confidence redshifts).
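
The quoted wavelength-calibration rms can be translated into a velocity scale via dv ~ c * dlambda / lambda, as in the sketch below. The per-line contribution works out to a few km/s across the spectral range, smaller than the quoted ~10 km/s overall zero-point error, which presumably includes other terms; the reference wavelengths are simply the ends and middle of the quoted coverage.

# Convert the 0.08 Angstrom rms wavelength-calibration error into a velocity
# error via dv ~ c * dlambda / lambda.  Reference wavelengths span the
# 3850-9200 A coverage; the 6000 A value is an arbitrary mid-range choice.
C_KM_S = 299792.458

for wavelength_A in (3850.0, 6000.0, 9200.0):
    dv = C_KM_S * 0.08 / wavelength_A
    print(f"{wavelength_A:.0f} A: {dv:.1f} km/s")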

E. Target Selection:

The excellent performance of the spectrographs means that the spectra of essentially all of the galaxy and quasar candidates have adequate signal-to-noise ratio in a 45-minute exposure to allow classification and a redshift measurement. The quasar target selection algorithm has been demonstrated to meet its goals of selecting 90 percent of known quasars in the region imaged, with a false-positive rate of less than 35 percent. The selection of the Bright Red Galaxy sample is yielding objects in exactly the redshift and luminosity range targeted, and targeted objects in the main galaxy sample have the expected redshift range and luminosity function. An important remaining task is to carry out target selection on multiple imaging scans of a given region of sky, to quantify the reproducibility of the Photometric Pipeline outputs and of the resulting target lists. This is especially important if we are forced to relax the seeing/image quality requirements.
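
As a small worked example of what these selection figures imply for fiber usage, the snippet below combines the 90 percent completeness and 35 percent false-positive numbers; the count of true quasars per field used there is invented purely for illustration.

# Worked example of the quasar target-selection numbers: with 90% completeness
# and up to 35% false positives, estimate how many quasar fibers per 100
# targets yield quasars, and how many fibers are needed to recover a field.
completeness = 0.90          # fraction of true quasars that are targeted
false_positive_rate = 0.35   # fraction of targets that are not quasars

targets = 100
confirmed = targets * (1.0 - false_positive_rate)
print(f"{confirmed:.0f} of {targets} quasar fibers expected to yield quasars")

true_quasars_in_field = 500  # illustrative only, not an SDSS figure
recovered = completeness * true_quasars_in_field
fibers_needed = recovered / (1.0 - false_positive_rate)
print(f"recovering {recovered:.0f} of {true_quasars_in_field} quasars "
      f"requires ~{fibers_needed:.0f} fibers")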

F. Operations:

 Work is on-going to improve the efficiency of the observing procedures, especially the switching between imaging and spectroscopic modes and the time between spectroscopic fields. Software tools are being written to determine which mode to operate in at any given time, and to decide which area of sky to observe. The data reduction pipelines operate within the CPU and memory requirements set out for the survey, and work is on-going to get the Fermilab production system up to speed. Finally, Quality Assurance tools are being written to determine the acceptability of any given chunk of data, both preliminarily on the mountain and definitively once the data have been reduced. As of this writing the system is working, producing data, and reducing it with acceptable speed, but the efficiency must be improved by a factor of several in order to finish the survey within the allotted time and budget. Most of the factors which reduce efficiency have been identified, and in most cases it is clear how to deal with them. The solution in many cases is simply practice and training, but in others new tools will clearly have to be developed; there appear to be no problems of principle.

  



Review of Observing Systems and Survey Operations
Apache Point Observatory
April 25-27, 2000