ADASS XII Conference
Data Management
O4.1 Data Management for the LSST (Invited)
- Andrew Connolly (Univ. of Pittsburgh)
The Large-aperture Synoptic Survey Telescope (LSST) represents the next generation of wide-field survey telescopes. With repeated scans of the sky on timescales ranging from a few minutes to several years, it will be one of the first wide-area survey facilities to open up the study of the time domain in astrophysics. The scientific returns from such an approach are numerous, ranging from the dynamics of near-Earth asteroids and Kuiper-belt objects through to the detection of intermediate- and high-redshift supernovae. We will discuss here the impact that such a diverse range of scientific questions will have on the analysis and management of the data flow from such a telescope. We will focus on a number of areas that must be addressed for the successful operation of the LSST. These will include: a) the computational challenge presented by a data rate in excess of 300 GB per hour; b) the impact of the optical design on the photometric accuracy and image quality of the system, together with how this translates into implementing efficient techniques for measuring the differences between repeated observations or for co-adding multiple images to construct a deep image of the sky; c) the impact on the design of the software of the requirement that we be able to detect variability over a very broad range of temporal scales (from almost real time through to several years). Throughout we will discuss the current state of the art in analysis software and algorithms and how they might be expected to scale with the increase in computational resources over the coming decade. We will then identify which computational data management challenges we must address in the near future.
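To give a feel for the quoted data rate, a back-of-envelope scaling can be sketched as below. The night length and nights-per-year figures are illustrative assumptions, not taken from the abstract; only the 300 GB/hour lower bound is.

```python
# Back-of-envelope scaling of the quoted LSST data rate.
# Assumptions (illustrative, not from the abstract): an 8-hour
# observing night and ~300 usable nights per year.
RATE_GB_PER_HOUR = 300   # lower bound quoted in the abstract
HOURS_PER_NIGHT = 8      # assumed
NIGHTS_PER_YEAR = 300    # assumed

per_night_tb = RATE_GB_PER_HOUR * HOURS_PER_NIGHT / 1000
per_year_pb = per_night_tb * NIGHTS_PER_YEAR / 1000

print(f"per night: {per_night_tb:.1f} TB")  # 2.4 TB
print(f"per year:  {per_year_pb:.2f} PB")   # 0.72 PB
```

Even under these modest assumptions the archive grows by well over half a petabyte per year, which frames the storage and reprocessing challenge the abstract raises.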
O4.2 Microlensing Surveys: Exploring the Time Domain (Invited)
- Kem Cook (LLNL) and the MACHO Collaboration
In the last decade a number of different projects have been mounted to detect and follow the progress of gravitational microlensing by compact objects, an extremely rare event. These projects, driven by the need to monitor millions of potential source stars, have opened the time domain in wide-field astronomy. One of the original projects was the MACHO Project, a survey to determine whether there is a significant baryonic component to the dark matter in the halo of the Milky Way.
The MACHO Project collected 7.3 TB of data over 8 years on 99 square degrees toward the Magellanic Clouds and the bulge of the Milky Way. Half-square-degree fields were sampled, simultaneously in two bands, roughly every three days, and light curves for about 55 million stars to a depth of about magnitude 21 have been collected in a photometry database. This database has been analyzed for microlensing, and about 500 events toward the Bulge and about two dozen toward the Magellanic Clouds have been detected. We have also identified about 500,000 variable stars. These have been analyzed, yielding new results in the astrophysics of pulsating stars, new categories of stellar variability, and such disparate detections as new high-proper-motion stars and new quasars.
I will present, from an astronomer's perspective, some of the data management issues encountered in the MACHO Project, a survey which pushed the boundaries of available technology. I will also recount some of the lessons learned from the MACHO and other microlensing surveys' experience in data mining in, and providing public access to, large image and photometry databases.
O4.3 The Raptor Real-Time Processing Architecture
- Mark Galassi, D. Starr, K. Borozdin, D. Casperson, K. McGowan, W. T. Vestrand, R. White, P. Wozniak, J. Wren (Los Alamos National Laboratory)
The primary goal of Raptor is ambitious: to identify interesting optical transients from very wide field of view telescopes in real time, and then to quickly point the higher resolution Raptor "fovea" cameras and spectrometer to the location of the optical transient. Any application of real-time search and time-domain mapping of the sky is possible with Raptor, including the very interesting real-time search for orphan optical counterparts of Gamma Ray Bursts.
The sequence of steps (data acquisition, basic calibration, source extraction, astrometry, relative photometry, the smarts of transient identification and elimination of false positives, telescope pointing feedback...) is implemented with a "component" approach. All basic elements of the pipeline functionality have been written from scratch or adapted (as in the case of SExtractor for source extraction) to form a consistent modern API operating on memory resident images and source lists. The result is a pipeline which meets our real-time requirements and which can easily operate as a monolithic or distributed processing system.
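The component approach described above can be sketched as a chain of callables, each operating on a memory-resident image/source-list bundle, so the same chain can run monolithically in one process or be distributed later. All names and data structures below are illustrative, not the actual Raptor API.

```python
# Minimal sketch of a "component" pipeline over memory-resident data.
# Names here are illustrative, not the actual Raptor implementation.
from dataclasses import dataclass, field

@dataclass
class Frame:
    """In-memory working unit passed between pipeline components."""
    pixels: list                                   # stand-in for image data
    sources: list = field(default_factory=list)    # extracted source list
    meta: dict = field(default_factory=dict)       # pipeline bookkeeping

def calibrate(frame: Frame) -> Frame:
    frame.meta["calibrated"] = True                # e.g. bias/flat correction
    return frame

def extract_sources(frame: Frame) -> Frame:
    # In Raptor this step adapts SExtractor; here we fake one detection.
    frame.sources = [{"x": 10.0, "y": 20.0, "flux": 1.0}]
    return frame

def flag_transients(frame: Frame) -> Frame:
    # Placeholder for the transient-identification "smarts".
    frame.meta["transients"] = [s for s in frame.sources if s["flux"] > 0.5]
    return frame

def run_pipeline(frame: Frame, stages) -> Frame:
    for stage in stages:       # monolithic in-process chain; each stage
        frame = stage(frame)   # could equally run in a separate process
    return frame

result = run_pipeline(Frame(pixels=[0] * 16),
                      [calibrate, extract_sources, flag_transients])
print(len(result.meta["transients"]))  # 1
```

Because every stage shares the same call signature, reordering stages, inserting new ones, or moving a stage to another machine does not disturb the rest of the chain, which is the property the abstract attributes to the component design.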
Finally, the Raptor architecture is entirely based on free software (sometimes referred to as "open source" software). In this paper we also discuss the interplay between various free software technologies in this type of astronomical problem.
O4.4 The Subaru Telescope Software Trinity System (Invited)
- Ryusuke Ogasawara (Subaru Telescope, NAOJ), Yoshihiro Chikada (Radio Astronomy Division, NAOJ), Yasuhide Ishihara (Fujitsu Ltd.), Atsushi Kawai (Fujitsu America Incorporation), Kenji Kawarai (Fujitsu Ltd.), George Kosugi (Subaru Telescope, NAOJ), Yoshihiko Mizumoto (Optical Infrared Astronomy Division, NAOJ), Junichi Noumaru, Toshiyuki Sasaki, Tadafumi Takata (Subaru Telescope, NAOJ), Masafumi Yagi (Optical Infrared Astronomy Division, NAOJ), Michitoshi Yoshida (Okayama Astrophysical Observatory, NAOJ)
The Subaru Telescope is an optical-infrared telescope with an 8.2 m monolithic mirror, located at the summit of Mauna Kea, Hawaii, USA, and funded entirely by the Japanese government through the Ministry of Education, Culture, Sports, Science, and Technology. The Subaru Telescope began operation in January 1999 and opened its Open Use program to astronomers all over the world in December 2000. The Subaru Telescope Software Trinity System, which consists of the Subaru Observation Software (SOSS), the Subaru Telescope Archive System (STARS), and the Data Analysis System Hierarchy (DASH), supports the Subaru Data Flow: observation, archiving, and analysis. Observations on the Subaru Telescope are operated through SOSS. In SOSS, the telescope and instruments are defined as external modules, and interface methods for sending commands to and receiving status from those modules are defined. Quick analysis tools and utilities for preparing the observation procedure are also implemented in SOSS. The Observation Dataset created during the observation procedure by SOSS defines the relations among various categories of FITS frames, such as calibration frames, standard stars, and object frames. FITS frames are transferred to the Hilo Data Center and archived automatically to the tape library system. STARS supports online registration of observation data in close coordination with SOSS, as well as retrieval through DASH and a WWW interface for astronomers. As a first trial in the history of Japanese astronomy, the Subaru Telescope began a challenging project to develop a new platform to support pipeline analysis of data taken by the Subaru Telescope. The DASH project, based on object-oriented methods and CORBA, began in 1996 and was completed in March 2002. Thus all data obtained by the Subaru Telescope will be reusable by observers preparing new observation proposals for the Subaru Telescope.
Even during an observation, the Subaru Telescope Software Trinity System can be used to find the optimum parameters for observations to achieve the best quality. This is how quality control of the observation data is realized on the Subaru Telescope with the Subaru Telescope Software Trinity System. The basic concept of supporting the Subaru Data Flow with the Software Trinity, the software methodology we chose for development, the status of the current operation, and the upgrade plan for the future will be presented.
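The external-module pattern the Subaru abstract describes, where the telescope and each instrument expose a common command/status interface to the observation software, can be sketched as follows. This is a minimal illustration of the pattern only; the class and command names are assumptions, not the actual SOSS API.

```python
# Illustrative sketch (not the actual SOSS API) of an external-module
# interface: subsystems accept commands and report status uniformly.
from abc import ABC, abstractmethod

class ExternalModule(ABC):
    """Common contract the observation software uses to drive a subsystem."""
    @abstractmethod
    def send_command(self, command: str, **params) -> None: ...
    @abstractmethod
    def get_status(self) -> dict: ...

class Telescope(ExternalModule):
    """One concrete module; instruments would implement the same contract."""
    def __init__(self):
        self._status = {"ra": 0.0, "dec": 0.0, "tracking": False}

    def send_command(self, command: str, **params) -> None:
        if command == "SLEW":                     # hypothetical command name
            self._status.update(ra=params["ra"], dec=params["dec"],
                                tracking=True)

    def get_status(self) -> dict:
        return dict(self._status)                 # status readback

tel = Telescope()
tel.send_command("SLEW", ra=187.5, dec=12.3)
print(tel.get_status()["tracking"])  # True
```

Defining every subsystem against one abstract contract is what lets observation software coordinate a heterogeneous telescope and instrument suite without special-casing each device.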