[P1.1] Metadata for the VO: the Case of UCDs Sebastien Derriere, CDS - Observatoire de Strasbourg Francois Ochsenbein, CDS - Observatoire de Strasbourg Thomas Boch, CDS - Observatoire de Strasbourg The UCDs (Unified Content Descriptors) were first developed in the ESO/CDS data mining project to describe precisely the contents of the individual fields (columns) of tables available from a data center. They have been used to describe the content of the $10^5$ columns available in the different VizieR tables. Owing to the wide diversity and high heterogeneity of table contents, UCDs constitute an excellent starting point for a hierarchical description of astronomy, for general data mining purposes. We present different applications of UCDs: selection of catalogues based on their content; identification of catalogues having similar fields; and automated data conversion allowing direct comparison of data in cross-identifications. The compatibility of UCDs with semantic descriptions developed in other contexts (data models for space-time coordinates or image datasets) will also be addressed.

[P1.10] Digital Access to Aero- and Astrophotographic Archives J.-P. De Cuyper et al. The aim of this test-bed project, initiated and financed by the Belgian Federal Government, is to acquire, within the coming 4 years, the necessary know-how, hardware and software to preserve the historic scientific information contained in aero- and astrophotographic archives; to provide user-friendly internet access to the catalogue and the (meta)data; and to make the photographic information scientifically exploitable again through a high-resolution digitisation technique. To this end, a high-accuracy XY air-bearing scanner table with laser-interferometer steering will be constructed, giving sub-micrometer absolute positioning accuracy, in order to attain the limiting positional accuracy determined by the internal plate errors. The archives concerned are the astrophotographic plate archive of the Royal Observatory of Belgium and the aerophotographic images of the National Geographic Institute and of the Royal Museum of Central Africa (Congo, Rwanda, Burundi). All contain photographs on glass plates as well as on film sheets. The technique of first making an analogue copy on roll film, allowing unattended round-the-clock scanning, will be studied in detail, as well as the photochemical treatment of fungi and deteriorations, in order to determine the geometric and radiometric deformations introduced. A digital catalogue is generated in an ODBC database that is to be distributed on intranet and internet (HTML files, ActiveX objects, C++ and JavaScript programming).

[P1.11] Construction of the Japanese Virtual Observatory (JVO) Yoshihiko MIZUMOTO, NAOJ Masatoshi OHISHI, NAOJ Naoki YASUDA, NAOJ Yuji SHIRASAKI, NAOJ Masahiro TANAKA, NAOJ Yoshifumi MASUNAGA, Ochanomizu Univ. and NAOJ Ken MIURA, Fujitsu Ltd. Hirokuni MONZEN, Fujitsu Ltd. Kenji KAWARAI, Fujitsu Ltd. Yasuhide ISHIHARA, Fujitsu Ltd. Yasushi YAMAGUCHI, Fujitsu Ltd. Youji YANAKA, Fujitsu Ltd. The National Astronomical Observatory of Japan (NAOJ) operates several large astronomical facilities, such as the SUBARU telescope in Hawaii and the 45 m radio telescope and the Nobeyama Millimeter Array in Nobeyama, Japan, and plans to construct the Atacama Large Millimeter Array in Chile in close collaboration with astronomers in the US and the EU.
Since January 2002, the NAOJ has been connected to SuperSINET at 10 Gbps, and it has become possible to provide huge amounts of observed multi-color data and analysis facilities to other astronomical institutions, not only in Japan but in other countries, through SuperSINET. We therefore started the Japanese Virtual Observatory (JVO) project in April 2002 (see http://jvo.nao.ac.jp/index-e.html). JVO utilizes Grid technology to combine several remote computational facilities (observational databases in Hawaii and in several locations in Japan, servers for data analyses, machines for data mining, etc.). We have completed the definition of the query language for the JVO, and have been designing the deployment of JVO components (user interfaces, a manager for virtual observations, execution modules to perform virtual observations, registries to resolve observation information, etc.). We plan to construct a JVO prototype by the end of 2002.

[P1.12] The NOAO Science Archive, Version 2.0 R. Seaman N. Zarate T. Lauer P. Warner The NOAO Science Archive (NSA) is a step toward building a comprehensive scientific archive of the optical and infrared data holdings of the National Optical Astronomy Observatory. The goals for the NSA are to: - Rapidly create a scientifically useful archive of NOAO Survey data, - Develop in-house expertise in the relevant technologies, - Identify requirements for NOAO's future comprehensive archive, and - Create a high level of visibility as well as utility for both the NOAO Archive and NOAO Surveys, for example, through new Web services. The holdings of the NSA (http://archive.noao.edu) will be drawn from the NOAO Survey projects (http://www.noao.edu/gateway/surveys/programs.html) as well as from other coherent imaging or spectral, optical/IR reduced datasets that may be identified as candidates from NOAO or community facilities. Catalogs and other derived data products will be included in addition to images, spectra and the tools necessary to evaluate them. Synoptic, time-domain data are a special focus in anticipation of the needs of the LSST. The NSA team is working in coordination with other groups at NOAO who are focusing on data handling and data pipeline systems. Planning for the NSA was started in November of 2001 by the Science Data Systems Group of the NOAO Data Products Program. Version 1.0 of the NSA was released in early April, version 1.1 in July and version 1.2 in October of 2002. We discuss plans for Version 2.0 of the NSA, to be released in January of 2003.

[P1.2] Russian and fSU Resources to be Integrated in the IVO Kilpio A., Dluzhnevskaya O., Kilpio E., Kovaleva D., Malkov O. The first collection of many Russian and fSU resources of astronomical data accumulated in Russian observatories and institutions has been compiled. As a first step we plan to provide transparent access to these resources for scientific and educational purposes in the framework of the Russian Virtual Observatory (RVO) project. The collection of astronomical resources is updated constantly. We evaluate the quality of the resources, in particular, by conducting expert analysis. Another important goal of the RVO project is to elaborate methods of information representation based on well-known and accepted standards, as well as to develop new ones.
The Russian Centre for Astronomical Data (CAD) staff will carry out the activities on construction of the information hub of the Russian Virtual Observatory and on the integration of Russian and fSU resources into the International Virtual Observatory. CAD is one of the general-purpose data centers for astronomy world-wide, and has been systematically collecting and distributing astronomical data for more than 20 years.

[P1.3] The MAST Pointings Tables Project K. Levay, P. Padovani, R. Thompson, M. Donahue, M. Corbin We have undertaken a project to create a database of all HST imaging observations organized by position on the sky. A web-based interface to this database has also been created, as a supplement to the existing interface to the HST archive. These "pointings tables" enable quick identification of overlapping fields that can be used for multiwavelength studies of objects, and for variability studies by comparing images at a given pointing over different epochs. They will also allow for "mini-surveys" by providing lists of images over a specified coordinate range, such as above and below the Galactic plane. We plan to develop pointings tables for other MAST missions besides HST, and expect these tables to form an important part of upcoming virtual observatories.

[P1.4] Towards an AVO Interoperability Prototype Mark G. Allen, F. Genova, F. Ochsenbein, S. Derriere, C. Arviset, P. Didelon, S. Garrington, R. Mann, A. Micol, A. Richards, G. Rixon, A. Wicenec, M. Dolensky As part of the Astrophysical Virtual Observatory (AVO) we are developing an interoperability prototype which aims to federate a set of astronomical archives, representative of a variety of techniques (space/ground, images/spectra, X-ray to radio wavelengths), into the CDS VizieR and Aladin tools. The target archives for federation are: VLT, NTT, EIS (ESO), HST/ECF, ISO, XMM (ESA), wide-field UK archives, MERLIN, and Terapix. We demonstrate the interoperability of these federated archives with science examples using multiwavelength image data and catalog overlays, and highlight new functionalities of the federation and integration tools.

[P1.5] The AXAF (Chandra) Guide and Acquisition Star Catalog V1.5 (AGASC 1.5) Dennis Schmidt, SAO Paul Green, SAO Chandra's Aspect Camera Assembly (ACA) measures positions of selected stars to acquire and hold target pointings, and for post facto aspect determination. The selection and matching of the guide stars is governed by data in the AXAF (Chandra) Guide and Acquisition Star Catalog (AGASC). Based originally on version 1.1 of the Guide Star Catalog for the Hubble Space Telescope, the AGASC has been extended and refined in several stages, with data from additional catalogs and with recalibrations based on experience with Chandra. In 2002 the Chandra X-ray Center (CXC) completed a major upgrade of the AGASC. We merged data from three catalogs -- Tycho-2, GSC-ACT, and 2MASS. The Tycho-2 data substantially improve the photometric and astrometric measurements of stars as faint as V=12, while the GSC-ACT merge decreases by about half the systematic astrometric errors down to the catalog limit of about V=14.5. The 2MASS data identify galaxies down to J=12.5. These new catalog data enhance the value of the AGASC for scientific as well as operational purposes.
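As an aside, the positional cross-matching that underlies catalog merges of this kind can be sketched in a few lines of Python. The catalog record layout and the 1 arcsec match radius below are illustrative assumptions, not the actual AGASC merge procedure, and the brute-force search is only suitable for small source lists.

    import math

    def ang_sep_arcsec(ra1, dec1, ra2, dec2):
        """Angular separation in arcseconds; inputs in degrees."""
        d2r = math.pi / 180.0
        cos_sep = (math.sin(dec1 * d2r) * math.sin(dec2 * d2r) +
                   math.cos(dec1 * d2r) * math.cos(dec2 * d2r) *
                   math.cos((ra1 - ra2) * d2r))
        return math.degrees(math.acos(max(-1.0, min(1.0, cos_sep)))) * 3600.0

    def cross_match(primary, secondary, radius=1.0):
        """Pair each primary source with the nearest secondary source
        within `radius` arcsec (None if there is no match)."""
        matches = []
        for p in primary:
            best, best_sep = None, radius
            for s in secondary:
                sep = ang_sep_arcsec(p["ra"], p["dec"], s["ra"], s["dec"])
                if sep <= best_sep:
                    best, best_sep = s, sep
            matches.append((p, best))
        return matches

A production merge would of course use a spatial index rather than the O(NM) double loop, and would reconcile photometry as well as positions.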
Specifically for Chandra's use of the AGASC, we recalibrated the estimated ACA magnitudes based on Chandra on-orbit measurements, and implemented a more sophisticated calculation of the effect of nearby stars on the best-fit centroid of a guide star. In the process of the upgrade, we encountered and corrected a variety of errors introduced in earlier merges from published source catalogs. This paper presents the rationale for and the process of making each of the changes. It then discusses the improvements in performance that we expect to result. This project is supported by the Chandra X-ray Center under NASA contract NAS8-39073.

[P1.6] SkyDOT: A Publicly Accessible Variability Database Containing Multiple Sky Surveys and Real-Time Data Dan Starr, LANL Przemek Wozniak, LANL W. Thomas Vestrand, LANL Robert White, LANL The Sky Database for Objects in Time-Domain (SkyDOT) is a virtual observatory which allows general access to several massive variability surveys. Although initially intended to publicly release the data from Los Alamos's RAPTOR project, this database now includes data sets from both OGLE II and ROTSE I. SkyDOT's emphasis on time-domain data is enhanced by its updates with real-time RAPTOR data. This enables user access to the most recent measurements of a given object, as well as its variability history. Our main task has been to construct an efficient relational database containing all existing data, while also handling a real-time influx of data. We provide useful web tools, which allow easy access for both astronomers and the public. In our implementation we employ a PostgreSQL database with a PHP-based web interface. This server will initially allow common searches, specific queries, and access to light curves. In the future we will include machine learning classification tools and access to spectral information.

[P1.7] OASIS: A Data Fusion System Optimized for Access to Distributed Archives J.C. Good Mih-seh Kong G.B. Berriman The On-Line Archive Science Information Services (OASIS) client is accessible as a JAVA applet / JAR file through the NASA/IPAC Infrared Science Archive home page. It uses Geographical Information System (GIS) technology to provide data fusion and interaction services for astronomers. These services include the ability to process and display arbitrarily large image files, and user-controlled contouring, overlay regeneration and multi-table/image interactions. OASIS has been optimized for access to distributed archives and data sets. It provides a mechanism that enables access to OASIS from "third-party" services and data providers. That is, any data provider who creates a query form to an archive containing a collection of data (images, catalogs, spectra) can direct the result files from the query into OASIS. Similarly, data providers who serve links to datasets or remote services on a web page can access all of these data with one instance of OASIS. In this way, any data or service provider is given access to the full suite of capabilities of OASIS. We illustrate the "third-party" access feature with two examples: queries to the high-energy image datasets accessible from GSFC SkyView, and links to data that are returned from a target-based query to the NASA Extragalactic Database (NED). OASIS also includes a file-transfer manager that reports the status of multiple asynchronous data downloads from remote sources to the client machine.
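A file-transfer manager of this kind can be outlined with a few worker threads that record per-download status. The sketch below is in modern Python with invented status labels; the real OASIS client is a Java applet, so this is only a schematic of the idea.

    import threading, urllib.request

    class DownloadManager:
        """Toy manager for multiple asynchronous downloads."""
        def __init__(self):
            self.status = {}          # url -> queued | running | done | failed
            self.lock = threading.Lock()

        def _fetch(self, url, dest):
            with self.lock:
                self.status[url] = "running"
            try:
                urllib.request.urlretrieve(url, dest)
                state = "done"
            except OSError:
                state = "failed"
            with self.lock:
                self.status[url] = state

        def submit(self, url, dest):
            """Start a download in the background and return immediately."""
            with self.lock:
                self.status[url] = "queued"
            threading.Thread(target=self._fetch, args=(url, dest)).start()

        def report(self):
            """Snapshot of all download states, e.g. for a status display."""
            with self.lock:
                return dict(self.status)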
This file-transfer manager is a prototype for a request management system that will ultimately control and manage compute-intensive jobs submitted through OASIS to computing grids, such as requests for large-scale image mosaics and bulk statistical analysis.

[P1.8] A Web-based Tool for SDSS and 2MASS Database Searches Marci Hendrickson, Johns Hopkins University Alan Uomoto, Johns Hopkins University David Golimowski, Johns Hopkins University We have developed a website using HTML, PHP, Python, and MySQL that analyzes data from the Sloan Digital Sky Survey (SDSS) and the Two-Micron All-Sky Survey (2MASS). The goal of this project is to locate brown dwarf candidates in the SDSS database by looking at color cuts; however, this site could also be useful for other targeted searches of the two databases, as well as serving as a prototype for easily implemented, customized searches of other large databases. The site uses MySQL databases created from broad searches of SDSS and 2MASS data, retrieving specified information. The broad queries on the SDSS and 2MASS database servers are run weekly, so observers have the most up-to-date information from which to select their observational candidates. Observers can look at detailed information about specific objects, including finding charts, images, and, when available, spectra. In addition, updates from previous observations can be added by any collaborators; this format makes observational collaboration simple. Observers can also restrict the database search, just before or during an observing run, to select objects of special interest.

[P1.9] Storage Options for Large VO Archives - The SDSS DR1 Experience Jan vandenBerg (JHU), Alex Szalay (JHU), Jim Gray (Microsoft BARC), Ani Thakar (JHU) We report on our experiences with evaluating storage hardware and firmware options for the SDSS Data Release 1 (DR1) archive. The unprecedented size of this archive presents daunting challenges with respect to storing multiple datasets and providing high availability and performance to a large user community. Not only are the datasets large and complex, but the need to maintain several versions simultaneously and to ensure the I/O speeds necessary for efficient data mining means that we are always "pushing the envelope" in terms of the available storage hardware. This is not meant to be an exhaustive thesis on the available options and technology for large archive storage. However, since we have spent considerable time evaluating and choosing the storage options for DR1, we feel that the knowledge and insights gained in the process will be useful to other groups that plan to publish or mirror large astronomical archives for the VO community in the near future.

[P2.1] ADS Web Services for the Discovery and Linking of Bibliographic Records Alberto Accomazzi, Harvard-Smithsonian Center for Astrophysics Guenther Eichhorn, Harvard-Smithsonian Center for Astrophysics Carolyn S. Grant, Harvard-Smithsonian Center for Astrophysics Michael J. Kurtz, Harvard-Smithsonian Center for Astrophysics Stephen S. Murray, Harvard-Smithsonian Center for Astrophysics The NASA Astrophysics Data System (ADS) currently provides free access to over 2.5 million records in four bibliographic databases through a sophisticated search interface. In addition to the basic metadata about a published paper, the ADS provides links to any relevant on-line resources, including full-text articles and published data tables.
Similarly, an increasing number of publishers and institutions are using the ADS to verify the existence and availability of references published in the scientific literature. To facilitate the exchange of metadata necessary to establish these links, the ADS is developing prototype Web Services based on emerging industry standards such as SOAP and WSDL, as part of a collaboration with the major NASA Astrophysics Data Centers. Some examples illustrating the use of this technology in resource discovery, sharing and validation are presented and discussed. The ADS is funded by NASA Grant NCC5-189.

[P2.2] Web Services in AIPS++ B. Waters, J. Benson, T. Cornwell The richness and transparency of the Glish distributed-computing model has traditionally obviated the need for standard networking components, such as the network classes found in Java, Python, or Perl. However, Glish is able to "wrap" arbitrary commands, enabling us to link powerful Java-based toolkits to Glish's event-based, client-server processing model. We have used this technique to implement a SOAP-based Cone Search web service for the Virtual Observatory.

[P2.3] Turning Besançon Observatory On-line Facilities into the VO - Galactic Model Simulation, Binary Star, Molecular Collisional and TNO Databases Bernard Debray, Besançon Observatory Marie-Lise Dubernet-Tuckey, LERMA, Paris Observatory and University of Franche-Comté Alain Grosjean, Besançon Observatory Edouard Oblak, Besançon Observatory Jean-Marc Petit, Besançon Observatory Céline Reylé, Besançon Observatory Annie Robin, Besançon Observatory For several years, the Besançon Observatory has been developing scientific facilities that are, or will be in the near future, accessible on-line through the World-Wide Web, namely: - the Model of stellar population synthesis of the Galaxy, which produces simulations of the stellar content in various galactic directions, suitable for observation preparation and interpretation; - the Double and Multiple Star database for retrieval of data on all observational categories of binaries; - the Molecular Collisional database for ro-vibrational bibliographic data and H$_2$O+H$_2$ collisional excitation rates, which is to be used for the Herschel and ALMA projects; - a database of discovery and recovery observations of TNOs and an unbiased sample of well-determined orbits of these objects, in the framework of the Observatory's involvement in the scheduling and real-time processing of observations for the Ultra-Wide Ecliptic component of the CFHT Legacy Survey. We describe how we envisage turning these facilities into elements of the Virtual Observatory mesh by using, as a first step, emerging standards such as VOTables and Unified Content Descriptors (UCDs).

[P2.4] Manuscript Preparation, Submission and Features of the Electronic IBVS Andras Holl, Konkoly Observatory, Budapest, Hungary IBVS is a small journal in the field of variable star research, which is now fully electronic. The HTML version of the journal features object database links and reference links. The necessary markup is provided by the authors, using the macros implemented in the LaTeX style file. We are testing a web-based manuscript submission tool, which would enable authors to submit data files, draw or upload simple figures, and enter plain ASCII or LaTeX text. The text is typeset on the server. The submitted manuscript can be previewed, and links tested by the authors themselves.
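To indicate how author-supplied markup of this kind can drive automatic link generation, the following Python sketch scans a LaTeX source for an object macro and emits name-resolver URLs. The macro name \astrobj and the resolver URL pattern are assumptions made for the example, not necessarily the exact IBVS conventions.

    import re

    OBJ_MACRO = re.compile(r"\\astrobj\{([^}]*)\}")   # assumed macro name

    def object_links(tex_source):
        """Return (object name, resolver URL) pairs for each marked-up
        object; the URL pattern is purely illustrative."""
        links = []
        for name in OBJ_MACRO.findall(tex_source):
            url = ("http://resolver.example.org/object?name=" +
                   name.replace(" ", "+"))
            links.append((name, url))
        return links

    print(object_links(r"New minima of \astrobj{V1234 Cyg} are presented."))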
The markup has been designed to facilitate automatic information exchange between the journal and databases. A short description is given of the other features of the electronic IBVS.

[P2.5] Web Services and their Use in Starlink Software Mark Taylor, Starlink Roy Platon, Starlink Alan Chipperfield, Starlink Peter Draper, Starlink David Giaretta, Starlink Web Services are gaining great popularity in the Grid community, and with good reason. The Starlink project is adopting Web Services as the method of inter-application communication. This is being done natively in new Java-based applications, while older applications are being wrapped to provide Web Service interfaces. We are in this way providing interoperability between the generations of software in a heterogeneous, distributed manner, thereby allowing the Starlink software to be usable in a distributed environment such as the Grid.

[P2.6] A Collaborative Extension to the Solar Web Tool Romain Linsolas, IAS Isabelle F. Scholl, IAS Eric Legay, IAS The number of archives of solar observations is continuously growing, and the locations where they are stored are more and more scattered. Consequently, the number of tools and websites for consulting these observation catalogs (i.e. metadata) is growing equally fast. The value of a single program capable of accessing distributed and heterogeneous archives is therefore obvious. The Solar Web project, developed by the MEDOC IAS team, is designed to meet these expectations. It is a first step toward a Virtual Solar Observatory. The architecture of the current version of Solar Web (which will be available soon at http://www.medoc-ias.u-psud.fr/archive/solar_web) is based on a 3-tier model. All clients are connected to a single server which provides them with results by querying all accessible databases. This centralized view can have a significant impact on security, performance and flexibility of policy management. The main evolution of Solar Web consists in moving to a distributed architecture. For this purpose, two solutions are currently being considered. The first is to create collaborative servers, where all instances of the Solar Web server are networked together and collaborate in the sense that a query sent by a client can be distributed to all servers. The second is built on top of peer-to-peer networking technology: it consists in redesigning the network level using the JXTA (Sun's peer-to-peer technology) infrastructure. This solution can easily provide new features such as the dynamic creation of groups of users based on their fields of interest. In this paper, we present concepts for both solutions with their advantages and disadvantages.

[P2.8] The ADASS XII Meeting Web Site Carolyn Liou, STScI/UofMd Steve Hulbert, STScI We present the architecture, design, and implementation details of the ADASS XII web site. The web site was implemented in Zope, a high-performance application server, web server, and content management system rolled into one. Zope includes a robust, scalable object database, a web services architecture, and powerful programming capabilities. The web site was built to conform to HTML, CSS, and accessibility standards as adopted by the W3C. This dynamic web site also taps into a back-end Sybase database while requiring a minimal amount of coding. We offer this site as a prototype web site suitable for reuse in supporting future ADASS meetings.
This site was created by the Information Services Team of the Computing and Information Services Division of the Space Telescope Science Institute.

[P3.1] Synchronous Observations with Ground-Based Optical and Space-Borne X-ray Telescopes Alexei Pozanenko, IKI Sergei Bondar', State Technical Research Center-Kosmoten Grigorii Beskin, SAO Marat Gilfanov, IKI/MPA Vasilij Rumyantsev, CrAO Simultaneous multiwavelength observations are critically important for understanding the physical and astronomical properties of many celestial phenomena. We consider simultaneous X-ray/optical observations of two types of objects that are of particular importance for high-energy astrophysics: cosmic gamma-ray bursts (GRBs) and low-mass X-ray binaries (LMXBs). The important task of observing optical transients requires continuous wide-field telescope surveys. Based on available observations, we discuss criteria for the development of an optical wide-field camera and present the current status of the automatic telescope that is being developed at IKI for the purpose of simultaneous optical observations of GRB counterparts. The instrument will have a field of view of 15 deg and a limiting magnitude of 10.5 at 0.1 s exposure. The flow of successive frames will be stored and compared with a catalog for on-line identification of optical transients. Post-analysis of accumulated frames increases the limiting magnitude of the system to 14. Subsequent cross-analysis with X-ray telescopes improves the probability of transient identification. Simultaneous observation with the X-ray camera on the HETE-2 spacecraft would yield a few cases per year of simultaneous observation of a GRB error box. We also discuss details of synchronous optical/X-ray observations of LMXBs with high time resolution using the 6 m telescope at SAO and the Rossi X-ray Timing Explorer. We report preliminary results on optical observations with sub-millisecond time resolution of Sco X-1 and Cyg X-2 during 2002. The observations were performed on the 6-meter telescope of the Special Astrophysical Observatory (SAO) with the Multichannel Panoramic Photometer-Polarimeter in the primary focus and the MANIA registration system with a time resolution of 2 microseconds.

[P3.10] Chandra Ray Tracer (ChaRT): A Web Interface to Chandra PSF Simulations Clayton Carter, Harvard-Smithsonian Center for Astrophysics Margarita Karovska, Harvard-Smithsonian Center for Astrophysics Diab Jerius, Harvard-Smithsonian Center for Astrophysics Ken Glotfelty, Harvard-Smithsonian Center for Astrophysics Steve Beikman, Harvard-Smithsonian Center for Astrophysics Calculation of point spread functions (PSFs) for the Chandra High Resolution Mirror Assembly (HRMA) and detectors is an important part of the analysis of Chandra data. Since the HRMA PSFs are highly variable, no analytical model exists and each PSF must be simulated individually. Existing methods for doing so \cite{PSFLIB} have limited accuracy, while the most direct and accurate method \cite{SAOSAC} is subject to a steep learning curve, prodigious computing requirements and implementations that are proprietary in nature. With the introduction of the Chandra Ray Tracer (ChaRT), we hope to address all of these complications. ChaRT is a web application by which users can easily generate SAOsac simulations of the HRMA PSF. Users can specify multiple source locations, each characterized by either a monochromatic energy or an input spectrum.
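The per-source inputs just described might be represented as simple records, as in the hypothetical sketch below; the field names and the off-axis bound are invented for illustration and are not ChaRT's actual interface.

    def validate_source(src):
        """Each source carries a position plus either a monochromatic
        energy (keV) or an input spectrum file -- exactly one of the two."""
        if ("energy_kev" in src) == ("spectrum_file" in src):
            raise ValueError("give exactly one of energy_kev or spectrum_file")
        if not 0.0 <= src["offaxis_arcmin"] < 30.0:   # illustrative bound
            raise ValueError("off-axis angle out of range")
        return src

    sources = [
        {"offaxis_arcmin": 2.5, "azimuth_deg": 45.0, "energy_kev": 1.5},
        {"offaxis_arcmin": 0.0, "azimuth_deg": 0.0, "spectrum_file": "src.spec"},
    ]
    jobs = [validate_source(s) for s in sources]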
ChaRT then verifies and submits these sources for SAOsac to simulate, notifying the user when their data are available for download. This is accomplished using scripts, developed at the Chandra X-ray Center (CXC), that work in conjunction with some commonly available software components \cite{PBS} on a distributed computer network at the CXC. Additionally, the system was designed to be extensible, and ChaRT's infrastructure may be used as a model for future projects facing the same complications as Chandra's PSF calculations. In this paper we review the software components and hardware architecture used to implement this system, and we address possible future work. This project is supported by the Chandra X-ray Center under NASA contract NAS8-39073.

[P3.11] Calibration of BIMA Data in AIPS++ Daniel Goscha, NCSA David Mehringer, NCSA Raymond Plante, NCSA Anuj Sarma, University of Illinois We summarize the general approach to calibration of millimeter interferometer data from the BIMA telescope using AIPS++ and illustrate the use of the relevant software tools. In particular, we discuss flagging, phase calibration, flux calibration, and polarization calibration, and we show how we take advantage of the unique capabilities of AIPS++ to meet the special needs of BIMA data. We show how BIMA calibration tools can be used to hide some of the complexity of the processes while still allowing access to specialized variations if desired. We illustrate how these tools are pipelined together for end-to-end processing, both within the BIMA Image Pipeline and on the user's desktop. Finally, we present a comparison of data calibrated in MIRIAD and AIPS++.

[P3.12] Status of the BIMA Imaging Pipeline David M. Mehringer Raymond L. Plante We report on the current status of the metadata-driven BIMA Imaging Pipeline. At the time of abstract submission, we are nearing production mode, in which we will produce first-order images of target sources as well as plots of images and calibration solutions. All these products will be ingested into the BIMA Data Archive, where they will be available to users.

[P3.13] AIPS++ Reduction and Analysis of GBT Single-Dish Spectral Data James Braatz, NRAO Joseph McMullin, NRAO Robert Garwood, NRAO Athol Kemball, NRAO The Green Bank Telescope (GBT) is a new 100 m diameter antenna with an unblocked aperture and an active surface. It is designed to observe at frequencies from 300 MHz to 100 GHz, and includes state-of-the-art continuum and spectral backends. AIPS++ is the integral software package for analysis of GBT data, both for scientific analysis and for control and engineering analysis of the component systems. We give an overview of how the AIPS++ system is used in processing spectral line data. AIPS++ allows a layered approach to software development whose usefulness is highlighted by the spectral analysis capabilities. At the heart of AIPS++ is a suite of tools which are capable of astronomy-specific calculations as well as general-purpose mathematical analysis, data visualization, GUI development, and scripting. A tool for analyzing single-dish data, DISH, is developed on this platform. DISH includes a number of modern features such as bulk processing of datasets and versatile GUI interaction. A simplified CLI interface designed to work with scan-based data is a recent addition to DISH, and was originally built as a thin layer on the DISH core structure.
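The layering idea can be illustrated schematically (in Python here, although DISH itself is built within AIPS++): scan-oriented convenience commands simply delegate to a small set of core operations. All names below are invented stand-ins, not the real DISH API.

    class Core:
        """Stand-in for the toolkit core: generic operations on spectra."""
        def get(self, scan):              return {"scan": scan, "data": [0.0] * 1024}
        def average(self, spectra):       return spectra[0]   # placeholder
        def baseline(self, spec, order):  return spec         # placeholder
        def plot(self, spec):             print("plotting scan", spec["scan"])

    core = Core()

    # The scan-based CLI is a thin veneer: each command is one or two core calls.
    def getscan(n):              return core.get(n)
    def avescans(*ns):           return core.average([core.get(n) for n in ns])
    def basefit(spec, order=1):  return core.baseline(spec, order)
    def show(spec):              core.plot(spec)

    show(basefit(avescans(101, 102, 103)))

Because the veneer holds no state of its own, it can be rewritten or extended without touching the core, which is the property the abstract highlights.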
The Interim Automated Reduction and Display System (IARDS) is built as another layer on DISH and provides the run-time display of GBT spectra.

[P3.14] Spectral Extraction using aXe Norbert Pirzkal, ST-ECF Anna Pasquali, ST-ECF Richard Hook, ST-ECF Jeremy Walsh, ST-ECF Rudi Albrecht, ST-ECF New large-format spectroscopic instruments have become available to the astronomical community. These produce images containing large numbers of spectra and can be very time consuming to analyze. One of these is the Advanced Camera for Surveys (ACS), which has recently been installed on HST. The ACS provides for both grism and prism slitless spectroscopy, and these modes can result in data sets containing hundreds of spectra on the large-format detectors. A method of easily extracting the information from these data and quickly producing spectra of individual objects is highly desirable. To this end we have developed a new extraction software package called aXe, which was specifically designed to handle ACS spectrophotometric data. Using a pair of direct/grism or direct/prism images, aXe can extract tilted spectra, estimate and subtract the local background, wavelength calibrate, flat-field, and flux calibrate them. Owing to the increasing number of instruments on modern telescopes capable of performing spectrophotometric observations similar to those of ACS, aXe was designed to be usable with non-ACS data as well. In this poster, we describe aXe and its use with ACS grism data, as well as with grism and short-slit spectroscopic data from the ESO VLT FORS instrument.

[P3.15] CIR: A New Package for Interactive Data Reduction of ISOCAM Data Rene D. Gastaud, DAPNIA/SIDE, CEA/SACLAY Pierre Chanial, DAPNIA/SAP, CEA/SACLAY There is already a good tool for interactive data reduction of ISOCAM data: CIA (Ref. 1), based upon IDL. We describe here a new package of IDL routines which keeps all the main functionalities of the previous one, but has been completely redesigned, drawing on our 8 years of experience with ISOCAM data reduction and on some new features of IDL. The architecture is simplified and the routines are rewritten. The package is slimmed down from 1000 to fewer than 200 routines and is up to twice as fast. The code is easier to understand, to maintain and to evolve. It has been used for 2 years at SAP Saclay. Sources with HTML documentation and examples are freely available. This tool has also been used to simulate some Herschel/PACS data, and can help to quickly test new algorithms and data structures for new IR cameras (SIRTF). References: 1) ADASS X 2000, CIA v5.0, The Legacy Package for ISOCAM Interactive Analysis.

[P3.16] New Features of SAOImage DS9 William Joye, Smithsonian Astrophysical Observatory SAOImage DS9 is an astronomical imaging and data visualization application. DS9 supports FITS images and binary tables, multiple frame buffers, region manipulation, and many scale algorithms and colormaps. It provides for easy communication with external analysis tasks and is highly configurable and extensible. A number of important new features have been developed for DS9. They include: -- Support for the Virtual Observatory, which allows users to view and analyze remotely-located data from their local site. -- Improved support for external analysis, which allows users to integrate their own analysis tasks into DS9. -- FITS binary table 3D binning, which allows users to create a 3D FITS data cube and view the data as an interactive movie. -- New projection, panda, and compass regions.
Of most interest is the interactive projection region, which displays an arbitrary cut of the image data, projected along a line. -- A new built-in help facility: the reference manual and FAQ documentation are available on all platforms, and no longer require the use of a web browser and network access. -- Full support for FITS multiple WCS: images may be rotated and aligned, and coordinate grids displayed, using any available WCS (including equatorial and linear). Acknowledgments: this work was performed in large part under a grant from NASA's Applied Information System Research Program (NAG5-3996), with support from the (Chandra) High Resolution Camera (NAS8-38248) and the Chandra X-ray Center (NAS8-39073).

[P3.17] Migrating Astronomical Software Systems from Tcl/Tk to Java Alberto Maurizio Chavan, ESO Tim Canavan, ESO Dario Dorigo, ESO Nick Kornweibel, ESO Fabio Sogni, ESO ESO began developing its Phase I proposal management system in 1994, while development of the Phase II tools began two years later. This first generation of the tools was developed using Tcl/Tk. In 1998 it was decided to migrate all tools to Java, and that activity is now nearing completion. This paper describes the rationale behind the decision to migrate, the migration process itself, and the lessons we learned during these years. We will not attempt to compare two very different programming languages: we try instead to describe the challenges and risks we faced in migrating medium-sized, mission-critical systems from one language to the other.

[P3.18] New IRAF Messaging Applications Francisco Valdes, NOAO Michael Fitzpatrick, NOAO Robert Seaman, NOAO New examples of IRAF applications interacting using a low-volume, socket-based text messaging scheme are described and demonstrated. The tasks may be distributed across multiple CPUs or locations, using a many-to-one or one-to-many client-server architecture. Server applications respond to messages without blocking, so other activities such as data processing, user interaction through a GUI, or responding to another client application may take place. Messaging is based on a simple text-based scheme consisting of either commands beginning with a colon or data in keyword/value pairs. This also allows any non-IRAF application which understands the protocol to participate as either a new client or server application. The commands have the same form as IRAF GUI commands, so that applications may easily interact with GUI tasks. One such server application demonstrated is a GUI IRAF processing monitor. IRAF (or non-IRAF) data reduction tasks send status and processing information to GUI server tasks, which provide graphical displays of, and interaction with, the received information. One motivation for this is the powerful and easily customizable nature of IRAF GUIs by means of the GUI description files (which are currently Tcl interpreted modules), and the need for such a component in the NOAO Mosaic Data Product Pipeline. This pipeline uses multiple IRAF data reduction tasks and data-parallel processing distributed across a network, requiring a central monitoring facility to ensure proper operation of the pipeline. Other applications of this messaging scheme are also discussed.

[P3.2] An Interactive Java Plotting Package for Astronomy Anzhen Zhang, IPAC/Caltech John Good, IPAC/Caltech Bruce Berriman, IPAC/Caltech (Infrared Processing and Analysis Center) This paper describes the architecture and functionality of QtPlot, a general-purpose two-dimensional plotting package for astronomy.
It is a modification of an open-source Java plotting package, PtPlot, version 5.1p1, made available by the Ptolemy project at the University of California. QtPlot is in operation at the Infrared Science Archive (IRSA), where it supports interactive plotting of spectra from the Submillimeter Wave Astronomical Satellite (SWAS) and light curves from the American Association of Variable Star Observers (AAVSO). It has also been integrated into OASIS, IRSA's data fusion toolkit. QtPlot displays local files and remote files through HTTP protocols. It supports ASCII table files and XML files which have been structured for astronomical plot directives. QtPlot has a rich suite of user-controlled functions for modifying plot appearance (symbol, color, etc.), plot boundaries, and annotation. Finally, QtPlot has a panning and zooming feature.

[P3.3] SAS, the Scientific Analysis System of the XMM-Newton Observatory Carlos Gabriel, ESA / VILSPA Matteo Guainazzi, ESA / VILSPA Fred Jansen, ESA / ESTEC Uwe Lammers, ESA / ESTEC Giuseppe Vacanti, ESA / ESTEC XMM-Newton, the most sensitive X-ray satellite ever built, has been operating successfully since January 2000. It is providing the scientific astronomical community with the deepest X-ray images ever taken, over more than two decades in energy (0.1-15 keV), as well as with high-resolution spectra (resolving power in the range 200-800) in the soft X-rays (0.5-2 keV). Simultaneous optical and UV coverage is ensured by the Optical Monitor on board. The Scientific Analysis System (SAS) is a state-of-the-art interactive analysis package for the calibration and analysis of all XMM-Newton data. It represents a combined effort of more than 30 scientific institutes around the world, coordinated by the Science Operations Centre (Villafranca del Castillo, Spain) and the Survey Science Centre (Leicester, UK). Reduced and calibrated scientific products, directly usable for scientific analysis, are produced by running SAS in a semi-automatic fashion at the SSC and distributed to the community. A large and sophisticated part of the SAS is dedicated to making multidimensional data analysis easy, efficient and user-friendly. This maximizes the exploitation of the basic four-dimensional information (RA, Dec, energy and time) the XMM-Newton instruments obtain for each collected X-ray photon. Special emphasis will be put on the high-level quality control performed both on the software components and on the distributed data products. The large capabilities of the SAS will be demonstrated by going over examples of scientific results achieved.

[P3.4] XAssist: A System for the Automation of X-ray Astrophysics Analysis Andrew Ptak, JHU Richard Griffiths, CMU XAssist is a NASA AISR funded project for the automation of X-ray astrophysics analysis, with emphasis on galaxies. It is nearing completion of its initially funded effort, and is working well for Chandra and ROSAT data. By the fall of 2002, ASCA processing should be well supported, along with initial support for XMM-Newton data. It is capable of data reprocessing, source detection, and preliminary spatial, temporal and spectral analysis for each source with sufficient counts. We intend XAssist to eventually become part of the NVO, and non-interactive access to tables at HEASARC is already implemented.

[P3.5] Solving for Polarization Leakage in Radio Interferometers using Unpolarized Sources Sanjay Bhatnagar, NRAO-Socorro, USA / NCRA-Pune, India R.V. Urvashi, BITS-Pilani, India / NCRA-Pune, India R.
Nityananda, NCRA-Pune, India Polarization leakage in the antennas of a radio interferometer can occur due to mechanical imperfections, dipole misalignment, or imperfect electronics. These leakages manifest themselves as closure errors in co-polar visibility measurements of unpolarized sources. For many working radio telescopes, the leakage amplitudes range from a few percent to as much as ten percent. Further, many telescopes offer shorter integration times and/or a larger number of frequency channels across the RF band for co-polar observations. Consequently, a significant fraction of observations are done in co-polar mode. Computation of antenna-based leakage gains from co-polar visibilities is therefore scientifically useful as well as valuable for debugging and calibrating the instrument. This paper presents an algorithm for solving for antenna-based leakage gains in a radio interferometer using co-polar observations of unpolarized sources. Complex antenna gains and leakage gains, modeled as additive terms, are solved for simultaneously. An additional transformation of the solutions, which maximizes the power in the antenna gains, then separates the leakage gains from the usual antenna gains. The algorithm is robust in the presence of RFI or otherwise corrupted data, and was extensively tested with simulations and with controlled experiments on the Giant Metrewave Radio Telescope (GMRT). Degeneracy in the solutions due to the use of unpolarized sources is also discussed, as are the interpretation of the leakage gains on the Poincaré sphere and the connection between the leakage-induced closure phase and the Pancharatnam phase of optics.

[P3.6] Generalized Self-Calibration for Space VLBI Image Reconstruction Sergey F. Likhachev A generalized self-calibration (GSC) algorithm, formulated as the solution of a non-linear optimization problem, is considered. The algorithm makes it easy to work with the first and second derivatives of visibility-function phases and amplitudes. This approach is important for the processing of data from high-orbit Space VLBI. The implementation of the GSC algorithm for radio astronomy image restoration is shown, and a comparison with other self-calibration algorithms is presented. The GSC algorithm is implemented in the radio astronomy imaging software project Astro Space Locator (ASL) for Windows, developed at the Astro Space Center.

[P3.7] FUSE Flat-Field Calibration using Wavelets Paul Barrett, Space Telescope Sci. Inst. Alex Fullerton, Johns Hopkins University This paper describes an investigation into using wavelets to characterize and then to create FUSE flat-field calibration files. The basic approach is to transform the FUSE ground flat data, which have a relatively high number of counts per pixel (~20) and uniform illumination, into the Flight Aligned Reference Frame (FARF). We apply wavelet techniques to enhance detector features in the ground and in-flight data by filtering and de-noising in the frequency and spatial domains. Small subarrays of the enhanced ground and in-flight flats are then cross-correlated over the entire image to determine the transformation matrix, or the local offset in pixels, between the two sets of data.

[P3.8] Projecting 3-D Simulations into Pseudo-Observations Alex Antunes, GMU John Wallin, GMU We present methods for converting particle-method three-dimensional simulations into observationally verifiable projected column densities, channel maps, fluxes, and velocity contours.
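For the simplest of these products, a projected column density, the operation amounts to a weighted 2-D histogram of particle positions along the chosen line of sight. A minimal NumPy sketch, assuming a z-axis projection and toy data, follows.

    import numpy as np

    def column_density(pos, mass, nbins=128, extent=1.0):
        """Project particle positions (N, 3) onto the x-y plane,
        returning surface density (mass per unit area) viewed along z."""
        grid, _, _ = np.histogram2d(pos[:, 0], pos[:, 1],
                                    bins=nbins,
                                    range=[[-extent, extent], [-extent, extent]],
                                    weights=mass)
        cell_area = (2.0 * extent / nbins) ** 2
        return grid / cell_area

    pos = np.random.standard_normal((100000, 3)) * 0.2     # toy particle set
    sigma = column_density(pos, np.full(100000, 1.0e-5))   # equal-mass particles

Channel maps and velocity contours extend the same idea by binning in a velocity coordinate as well; including extinction requires integrating attenuation along the line of sight rather than simply summing.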
Such projections are suitable for direct comparison with radio data (such as produced by AIPS), X-ray observations (e.g. ximage), optical, and IR. Whereas modeling usually involves N-body, mesh, SPH, or LPR calculations in an idealized 3-D space, our observational data are always limited to a single line-of-sight projection, observing only one 'plane' of the object, emission from which may or may not include extinction. For models to have any validity, we must be able to generate pseudo-observational data from the model to compare with actual observations. This connects our modeling with the real universe we see; herein we discuss the methods for creating such projections.

[P3.9] PacketLib: a C++ Library for Satellite Telemetry Oriented Applications Andrea Bulgarelli, CNR/IASF Bologna Fulvio Gianotti, CNR/IASF Bologna Massimo Trifoglio, CNR/IASF Bologna PacketLib is a C++ open-source software library for writing applications which deal with satellite telemetry source packets, provided that the packets are compliant with the ESA Telemetry and Telecommand Standards. The library is being used in the context of the AGILE space mission of the Italian Space Agency (ASI) for simulation, graphical display, processing and decoding of the telemetry generated by the Test Equipment of two AGILE detectors and by the AGILE Payload EGSE. From an input stream of bytes, the library is able to recognize the source packets automatically (described by a simple configuration file), and provides simple access to each packet field by means of an object-oriented interface. In the same way, the library writes source packets to an output stream. Various types of input and output streams are abstracted by a software layer. This paper presents the architecture of the library and some examples of applications developed with it.

[P4.1] Pointing Refinement of SIRTF Images Frank Masci David Makovoz David Shupe Mehrdad Moshir John Fowler The soon-to-be-launched Space Infrared Telescope Facility (SIRTF) shall produce image data with an a-posteriori pointing knowledge of 1.4" (1 sigma radial), with a goal of 1.2", in the ICRS coordinate frame. In order to perform robust image coaddition, mosaic generation, and extraction and position determination of sources to faint levels, the pointing will need to be refined to better than a few tenths of an arcsecond. Input to the position refinement software are point sources extracted from a mosaic of overlapping images. The software uses this information to find a "global minimization" of all relative offsets amongst all overlapping images. This is a novel method utilizing a generic linear sparse matrix solver. The pointings and orientations of SIRTF images can be refined either in a "relative" sense, where pointings become fixed relative to a single image of a mosaic, or in an "absolute" sense (in the celestial frame) if absolute point source information is known. Our goal is to produce science products with sub-arcsecond pointing accuracy.

[P4.10] Automated Object Classification with ClassX Anatoly Suchkov, STScI Tom McGlynn, NASA/GSFC Eric Winter, NASA/GSFC Lorella Angelini, NASA/GSFC Michael Corcoran, NASA/GSFC Sebastien Derriere, NASA/GSFC Megan Donahue, STScI Stephen Drake, NASA/GSFC Pierre Fernique, CDS Francoise Genova, CDS R.J. Hanisch, STScI Francois Ochsenbein, CDS W.D. Pence, NASA/GSFC Marc Postman, STScI Nicolas White, NASA/GSFC Richard White, STScI We report preliminary results from the ClassX project.
ClassX is aimed at creating an automated system to classify unclassified X-ray sources, and is envisaged as a prototype of the Virtual Observatory. The ClassX team has used machine learning methods to generate, or `train', classifiers from a variety of `training' data sets, each representing a particular sample of known objects that have measured X-ray fluxes complemented, wherever possible, with data from other wavelength bands. Specifically, in this paper a classifier is represented by a set of oblique decision trees (DTs) induced by the DT generation system {\it OC1}. We integrate different classifiers into a network, in which each classifier can make its own class assignment for an unclassified X-ray source from a classifier-specific list of class names (object types). An X-ray source is input into a classifier as a set of X-ray fluxes and possibly other parameters, including data in the optical, infrared, radio, etc. In the network, each classifier is optimized for handling different tasks and/or different object types. Therefore, given a set of unclassified X-ray sources, a user would generally select a certain classifier to make, for instance, the most {\it complete} list of candidate QSOs, but a different classifier would be used to make the most {\it reliable} list of candidate QSOs. Still other classifiers would be selected to make similar lists for other object types. Along with the straightforward class name assignment, a network classifier outputs the probability for a source to belong to the assigned class, as well as the probabilities that the source belongs, in fact, to other classes in the given class name list. We illustrate the current capabilities of ClassX, and the emerging concept of a network (or networks) of classifiers, with results obtained with classifiers trained on data from ROSAT (the WGA catalog), complemented with data from the Guide Star Catalog (GSC2) and the 2-Micron All-Sky Survey.

[P4.11] Genetic Programming and Other Fitting Techniques in Galactic Dynamics Peter Teuben, University of Maryland Fitting is the bread and butter of astronomy. Non-linear fitting problems, especially large-scale ones, present many difficulties; genetic programming offers one solution. In this poster I present some galaxy-dynamics fitting techniques, in particular for velocity fields, and apply new techniques such as genetic programming.

[P4.12] FLY: A Tree Code towards the Adaptive Mesh Refinement U. Becciani, INAF - Astrophysical Observatory of Catania V. Antonuccio-Delogu, INAF - Astrophysical Observatory of Catania We have developed a powerful N-body code to evolve three-dimensional self-gravitating collisionless systems with a large number of particles ($N \geq 10^7$). FLY ({\bf F}ast {\bf L}evel-based N-bod{\bf Y} code) is a fully parallel code based on a tree algorithm. It adopts periodic boundary conditions implemented by means of the Ewald summation technique. FLY is based on the one-sided communication paradigm to share data among the processors, which access remote private data while avoiding any kind of synchronism. The code was originally developed on a CRAY T3E system using the logically SHared MEMory access routines ({\it SHMEM}), and was ported to SGI ORIGIN systems and to the IBM SP, on the latter making use of the Low-Level Application Programming Interface routines ({\it LAPI}).
FLY is based on four main features: it adopts a simple domain decomposition, a grouping strategy, a dynamic load-balancing mechanism without significant overhead, and data buffering that allows us to minimize data communication. It is an open-source free code, and more details are available at http://www.ct.astro.it/fly/. This paper shows an example of the integration of a tree code with an adaptive grid scheme. PARAMESH is a package of Fortran 90 subroutines, using the {\it SHMEM} and {\it MPI} libraries, designed to provide an application developer with an easy route to extend an existing serial code which uses a logically cartesian structured mesh into a parallel code with adaptive mesh refinement (AMR). The computational domain is hierarchically subdivided into sub-blocks following a 3D tree data structure. The use of {\it SHMEM} and the tree data structure allows an easy integration with FLY, which adopts the same data structure and the same parallel communication library. This implementation of FLY with PARAMESH makes the output of FLY available integrated with an adaptive grid having the same data structure as PARAMESH. The adaptive grid structure can be read by FLY or generated from it, and contains the potential field of each data block of the grid, following the PARAMESH scheme. Moreover, the same procedure of FLY will also be available as an external procedure that allows the creation of a PARAMESH grid from any data point distribution in a cubic region, e.g. a cosmological dark matter distribution. This new implementation will allow the FLY output, and more generally any binary output, to be used with any hydrodynamics code that adopts the PARAMESH data structure, to study compressible flow problems.

[P4.13] Classification using Labeled and Unlabeled Data David Bazell, Eureka Scientific, Inc. David Miller, Penn State University Kirk Borne, Raytheon We discuss several novel approaches to the exploration, understanding, and classification of astronomical data. We are exploring the use of unlabeled data for supervised classification and for semi-supervised clustering. Current automated classification methods rely heavily on supervised learning algorithms that require training data sets containing large amounts of previously classified, or labeled, data. While unlabeled data are often cheap and plentiful, using a human to classify the data is tedious, time consuming, and expensive. We are examining methods whereby supervised classification techniques can use cheaply available, large volumes of unlabeled data to substantially improve their ability to classify objects. We are also exploring a unified framework where learned models provide clustering or classification solutions, or both, depending on the needs of the user.

[P4.14] Predictive Mining of Time Series Data in Astronomy Eric Perlman Akshay Java Joint Center for Astrophysics, UMBC We discuss the development of a Java toolbox for astronomical time series data. Rather than using methods conventional in astronomy (e.g., power spectrum and cross-correlation analysis), we employ rule discovery techniques commonly used in analyzing stock-market data. By clustering patterns found within the data, rule discovery allows one to build predictive models, forecasting when a given event might occur or whether the occurrence of one event will trigger a second. We have tested the toolbox and accompanying display tool on datasets representing several classes of objects from the RXTE All Sky Monitor.
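The flavour of such rule discovery can be conveyed by a toy example: discretize a light curve into symbols, then estimate how often a given antecedent pattern is followed by a high-state event within some horizon. The thresholds, window lengths and data below are arbitrary, and the real toolbox is in Java rather than Python.

    import numpy as np

    def discretize(flux, nlevels=4):
        """Map fluxes to integer symbols 0..nlevels-1 by quantile."""
        edges = np.quantile(flux, np.linspace(0, 1, nlevels + 1)[1:-1])
        return np.digitize(flux, edges)

    def rule_confidence(symbols, pattern, event_symbol, horizon=5):
        """Estimate P(event within `horizon` steps | pattern just occurred)."""
        k, hits, total = len(pattern), 0, 0
        for i in range(len(symbols) - k - horizon):
            if tuple(symbols[i:i + k]) == tuple(pattern):
                total += 1
                if event_symbol in symbols[i + k:i + k + horizon]:
                    hits += 1
        return hits / total if total else 0.0

    flux = np.random.lognormal(size=2000)    # stand-in for an ASM light curve
    sym = discretize(flux)
    print(rule_confidence(sym, (3, 3), event_symbol=3))

Irregular sampling, noted at the end of the abstract, complicates exactly this step: the symbol sequence is no longer evenly spaced in time, so the horizon must be defined in time rather than in samples.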
We use these datasets to illustrate the methods and functionality of the toolbox. We have found predictive patterns in several ASM datasets. We discuss possible applications, for example in maximizing the return when scheduling either survey or target-of-opportunity observations. We also discuss problems faced in the development process, particularly the difficulties of dealing with discretized and irregularly sampled data.

[P4.15] NIRCAM Image Simulations for NGST Wavefront Sensing Russell B. Makidon, Anand Sivaramakrishnan, Donald F. Figer, Robert I. Jedrzejewski, Howard A. Bushouse, John E. Krist, H. S. Stockman, Philip E. Hodge, Nadezhda M. Dencheva, Bernard J. Rauscher, Victoria G. Laidler, Catherine Ohara, David C. Redding, Myungshin Im, and Joel D. Offenberg The Next Generation Space Telescope (NGST) will be a segmented, deployable, infrared-optimized 6.5 m space telescope. Its active primary segments will be aligned, co-phased, and then fine-tuned in order to deliver image quality sufficient for the telescope's intended scientific goals. The wavefront sensing used to drive this tuning will come from the analysis of focussed and defocussed images taken with its near-IR science camera, NIRCAM. There is a pressing need to verify that this will be possible with the near-IR detectors still under development for NGST. We create simulated NIRCAM images to test the maintenance phase of this plan. Our simulations incorporate Poisson and electronics read noise, and are designed to be able to include various detector and electronics non-linearities. We present our first such simulation, using known properties of HAWAII HgCdTe focal plane array detectors. Detector effects characterized by the Independent Detector Testing Laboratory are included as they become available. Simulating InSb detectors can also be done within this framework in the future. We generate point-spread functions (PSFs) for a segmented aperture geometry with various wavefront aberrations, and convolve them with typical galaxy backgrounds and stellar foregrounds. We then simulate up-the-ramp (MULTIACCUM in HST parlance) exposures with cosmic ray hits. We pass these images through the HST NICMOS `CALNICA' calibration task to filter out the cosmic ray hits. The final images are to be fed to wavefront sensing software, in order to find the ranges of exposure time, filter bandpass, defocus, and calibration star magnitude required to keep the NGST image within its specifications.

[P4.16] Integrating Statistical Tools with Databases Adrian Pope (JHU) Tamas Budavari (JHU) Alex S. Szalay (JHU) Istvan Szapudi (Univ of Hawaii) Andrew J. Connolly (Univ of Pittsburgh) With the advent of large astronomical surveys, the way scientific calculations are done is changing. Dedicated telescopes are collecting incredible amounts of information about the Universe that are stored in databases. We describe a method for doing angular clustering analysis of 50 million galaxies in the Science Archive of the Sloan Digital Sky Survey. Using a Web Service attached to the database, we stream only the relevant coordinates into the correlation-function code eSpICE, which computes the angular correlation function directly. The pipeline is set up to take a query that selects galaxies by different ranges of absolute luminosity and spectral type, and returns the angular clustering results. The processing time is remarkable: it is several orders of magnitude faster than traditional implementations of the optimal two-point estimator.
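The heart of such a calculation is counting pairs as a function of angular separation. A deliberately naive DD(theta) counter is sketched below for orientation; it is O(N^2) in time and memory and usable only on small subsamples, whereas eSpICE and the database streaming described above are what make 50 million galaxies tractable.

    import numpy as np

    def pair_counts(ra_deg, dec_deg, bin_edges_deg):
        """Brute-force pair counts in angular separation for sky positions."""
        ra, dec = np.radians(ra_deg), np.radians(dec_deg)
        xyz = np.vstack([np.cos(dec) * np.cos(ra),
                         np.cos(dec) * np.sin(ra),
                         np.sin(dec)]).T                    # unit vectors
        cos_t = np.clip(xyz @ xyz.T, -1.0, 1.0)
        theta = np.degrees(np.arccos(cos_t))
        iu = np.triu_indices(len(ra), k=1)                  # distinct pairs only
        counts, _ = np.histogram(theta[iu], bins=bin_edges_deg)
        return counts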
[P4.2] A Theoretical Photometric and Astrometric Performance Model for Point Spread Function CCD Stellar Photometry Kenneth J. Mighell, National Optical Astronomy Observatory Using a simple two-dimensional Gaussian Point Spread Function (PSF) on a constant (flat) sky background, I derive a theoretical photometric and astrometric performance model for analytical and digital PSF-fitting stellar photometry. The theoretical model makes excellent predictions for the photometric and astrometric performance of over-sampled and under-sampled CCD stellar observations, even with cameras whose pixels have large {\em{intra}}-pixel quantum efficiency variations. The performance model is demonstrated to accurately predict the photometric and astrometric performance of realistic space-based observations from segmented-mirror telescope concepts like the Next Generation Space Telescope with the MATPHOT algorithm for digital PSF CCD stellar photometry, which I presented last year at ADASS XI. The key PSF-based parameter of the theoretical performance model is the effective background area, which is defined to be the reciprocal of the volume integral of the \underline{square} of the (normalized) PSF; a critically-sampled PSF has an effective background area of $4\pi$ ($\approx 12.57$) pixels. A bright star with a million photons can theoretically achieve, simultaneously, a signal-to-noise ratio of 1000 and a (relative) astrometric error of a {\em{milli}}pixel. The photometric performance is maximized when either the effective background area or the effective-background-level measurement error is minimized. Real-world considerations, like the use of poor CCD flat fields to calibrate the observations, can and do cause many existing space-based and ground-based CCD imagers to fail to live up to their theoretical performance limits. Future optical and infrared imaging instruments can be designed and operated to avoid the limitations of some existing space-based and ground-based cameras. This work is supported by grants from the Office of Space Science of the National Aeronautics and Space Administration (NASA).
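The definition of the effective background area is easy to verify numerically. A minimal Python sketch (not MATPHOT code), assuming that a Gaussian of unit width in pixels corresponds to the critically-sampled case:

    import numpy as np

    sigma = 1.0                              # Gaussian PSF width in pixels
    x = (np.arange(801) - 400) * 0.05        # fine grid out to +/- 20 sigma
    xx, yy = np.meshgrid(x, x)
    psf = np.exp(-(xx**2 + yy**2) / (2 * sigma**2)) / (2 * np.pi * sigma**2)

    # Effective background area: reciprocal of the integral of PSF^2.
    b_eff = 1.0 / np.sum(psf**2 * 0.05**2)
    print(b_eff, 4 * np.pi * sigma**2)       # both ~12.566 pixels

The analytic integral of the squared normalized Gaussian is 1/(4*pi*sigma^2), so the numerical reciprocal reproduces the quoted 4*pi pixels for sigma = 1.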
[P4.3] Adaptive Optics Software on the CfAO Web Page Andreas Quirrenbach, UCSD Vesa Junkkarinen, UCSD Rainer Koehler, UCSD Several software packages are publicly available on the web site of the Center for Adaptive Optics (CfAO). These packages support the development of adaptive optics systems and the analysis of data obtained with adaptive optics. We will discuss the structure and support of the web site, and give an overview of the capabilities of the individual packages. [P4.4] Image Reduction Pipeline for the Detection of Variable Sources in Highly Crowded Fields Claus A. Goessl Arno Riffeser We present a reduction pipeline for CCD (charge-coupled device) images which was built to search for variable sources in highly crowded fields like the M 31 bulge and to handle the extensive databases produced by long time series. We describe all steps of the standard reduction in detail, with emphasis on the realisation of per-pixel error propagation: bias correction, treatment of bad pixels, flatfielding, and filtering of cosmic rays. The problems of conservation of the PSF (point spread function) and of error propagation in our image alignment procedure, as well as the detection algorithm for variable sources, are discussed: we build difference images via image convolution with a technique called OIS (optimal image subtraction, Alard & Lupton 1998), proceed with an automatic detection of variable sources in noise-dominated images, and finally apply PSF-fitting relative photometry to the sources found. For the WeCAPP project (Riffeser et al. 2001) we achieve 3-sigma detections for variable sources with an apparent brightness of e.g. m = 24.9 mag at their minimum and a variation of dm = 2.4 mag (or m = 21.9 mag minimum brightness and a variation of dm = 0.6 mag) on a background signal of 18.1 mag/arcsec^2, based on a 500 s exposure with 1.5 arcsec seeing at a 1.2 m telescope. The complete per-pixel error propagation allows us to give accurate errors for each measurement. [P4.5] Representations of Spectral Coordinates in FITS Eric W. Greisen National Radio Astronomy Observatory, Socorro, NM Francisco G. Valdes National Optical Astronomy Observatory, Tucson, AZ Mark R. Calabretta Australia Telescope National Facility, Epping, NSW Steven L. Allen UCO/Lick Observatory, Santa Cruz, CA In Paper I, Greisen & Calabretta (2002) describe a generalized method for specifying the coordinates of FITS data samples. Following that general method, Calabretta & Greisen (2002) in Paper II describe detailed conventions for defining celestial coordinates as they are projected onto a two-dimensional plane. The present paper extends the discussion to the spectral coordinates of wavelength, frequency, and velocity. World coordinate functions are defined for spectral axes sampled evenly in wavelength, frequency, or velocity, evenly in the logarithm of wavelength or frequency, as projected by ideal dispersing elements, and as specified by a lookup table. Papers I and II have been accepted into the FITS standard by at least the North American FITS Committee; we expect the present work to be accepted as well. [P4.6] Image Compression using CFITSIO William Pence, NASA/GSFC The CFITSIO subroutine library now transparently supports reading and writing of FITS images in a new tile-compressed image format. The image is divided into a grid of rectangular tiles, and each tile of pixels is individually compressed (using a choice of different algorithms) and stored in a variable-length array column in a FITS binary table. The advantages of using this format are that a) the header keywords remain uncompressed for fast access, and b) it is possible to extract sub-images without having to uncompress the entire original image, because only the tiles that contain pixels in the sub-image have to be uncompressed. This image format also supports a lossy compression technique that is very effective for floating-point images: the noise bits are thrown away without sacrificing any scientifically useful information. This paper will demonstrate the effectiveness of this image compression technique on a number of different FITS images, all extracted from existing public data archives.
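CFITSIO exposes this format transparently to C callers; for a quick illustration, the same tile-compressed image convention is also implemented by astropy, so a hedged Python sketch of writing and reading such a file looks like this (file name and numbers are invented):

    import numpy as np
    from astropy.io import fits

    image = np.random.normal(1000.0, 10.0, (2048, 2048)).astype(np.float32)

    # Each tile is compressed individually and stored in a variable-length
    # array column of a binary table; quantize_level controls how
    # aggressively the noise bits of floating-point data are discarded.
    hdu = fits.CompImageHDU(data=image, compression_type='RICE_1',
                            quantize_level=16.0)
    hdu.writeto('tile_compressed.fits', overwrite=True)

    with fits.open('tile_compressed.fits') as hdul:
        header = hdul[1].header   # keywords readable without decompression
        data = hdul[1].data       # tiles are decompressed transparently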
[P4.7] Restoration of Digitized Astronomical Plates with the Pixon Method P.R. Hiltner, R. Nestler, P. Kroll Applications of the Pixon restoration method to digitized plates of the Sonneberg Plate Archive, the world's second largest, are reported. Results so far obtained show that the severe astigmatism/coma distortion present in the outer parts of these wide-field images can be almost completely removed. Also, the object definition (FWHM) of point sources and the S/N improve by factors of 2 to 7, depending on the object's brightness and location (background etc.). We discuss the consequences for the automated astronomical processing of the restored plates, which are of crucial importance for the inclusion of digitized archives in the virtual observatory context. [P4.8] sso_freeze: De-smearing Solar System Objects in Chandra Observations Roger Hain, Harvard-Smithsonian Center for Astrophysics Jonathan McDowell, Harvard-Smithsonian Center for Astrophysics Arnold Rots, Harvard-Smithsonian Center for Astrophysics K. J. Glotfelty, Harvard-Smithsonian Center for Astrophysics Observations from the Chandra X-Ray Observatory are made in a fixed inertial coordinate frame. Most objects observed with Chandra, such as supernova remnants, quasars, or pulsars, are at infinity for all practical purposes, and the observations produce sharp, focused images. However, the motion of objects observed within the solar system, such as planets or comets, will cause the object's image to appear blurred when viewed in a fixed inertial frame. This effect is similar to the blur which would be seen if a fixed camera were to take a photograph of a fast-moving car. To reconstruct the image, the CXC CIAO tool sso_freeze corrects for this effect. An origin is chosen at the center of the object, and moves along with the object as it moves with respect to inertial space. The positions of the source photons are then recalculated with respect to this moving origin. The image formed from the recalculated photons now shows a clear object, such as a disk for a planet. As an effect of this processing, fixed X-ray sources become smeared in the image. The effect is similar to moving the camera to follow the fast-moving car in the earlier example: the car becomes clearly focused, and the scene around the car is blurred. Images which demonstrate the effect of sso_freeze are shown for Jupiter and Comet C/1999 S4 (LINEAR). This project is supported by the Chandra X-ray Center under NASA contract NAS8-39073.
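The core of the re-registration step is simple to sketch. The following Python toy (illustrative only, not the CIAO implementation; all numbers are invented) interpolates a tabulated ephemeris to each photon arrival time and shifts the photon coordinates into the object-centred moving frame:

    import numpy as np

    rng = np.random.default_rng(2)
    # Toy photon event list: arrival times (s) and sky positions (deg).
    t_evt = np.sort(rng.uniform(0.0, 10000.0, 5000))
    ra_evt = rng.normal(180.0, 0.01, 5000)
    dec_evt = rng.normal(0.0, 0.01, 5000)

    # Tabulated ephemeris of the moving object (invented values).
    t_eph = np.linspace(0.0, 10000.0, 101)
    ra_eph = 180.0 + 1.0e-5 * t_eph          # object drifts slowly in RA
    dec_eph = np.zeros_like(t_eph)

    # Re-register each photon relative to the object's position at its
    # arrival time; the object sharpens while fixed sources smear.
    ra_frozen = ra_evt - np.interp(t_evt, t_eph, ra_eph)
    dec_frozen = dec_evt - np.interp(t_evt, t_eph, dec_eph)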
[P4.9] Merging of Spectral Orders from Fiber Echelle Spectrographs Petr Skoda, Astronomical Institute of the Academy of Sciences of the Czech Republic Herman Hensberge, Royal Observatory of Belgium We have reviewed the data reduction of two fiber-based echelle spectrographs (HEROS and FEROS), with emphasis on similarities in the inconsistencies between overlapping spectral orders before merging. The literature on echelle data reduction shows that such inconsistencies are commonly observed (and usually handled by rather heuristic procedures, mostly interactively). For both instruments, it seems to be the calibration unit which, through the flat fielding, introduces the major part of the problems. We discuss strategies to treat the problems and to remove the inconsistencies before merging the spectral orders, with a minimum of interactive, subjective algorithms. [P5.1] Optimizing the Performance of ISO and XMM Data Archives Jose Hernandez, ESA Christophe Arviset, ESA John Dowson, ESA Pedro Osuna, ESA Aurele Venet, ESA The ISO Data Archive and the XMM-Newton Science Archive have been developed in Java using a multi-tier architecture. The archives are accessed through a Java applet running inside any standard browser; they have been used from more than 10 different platforms and by thousands of users. Here we present the experience acquired over the last 4 years and a few of the techniques we have used to dramatically improve the performance of the application, as well as the main difficulties we have found along the way. In particular, we will touch upon: techniques used to optimize the performance of the application (session management, object caching, minimizing the number of client-server transactions, data compression, optimizing object serialization); how to keep the interface configurable; when to stop optimizing (problems and bugs introduced during the optimization); and techniques used to optimize the size of the applet (obfuscation, building a thin client, jar loading on demand). [P5.2] ANDES--NOAO's Observatory Database System David Gasson, NOAO Dave Bell, NOAO Mia Hartman, NOAO ANDES (the Advanced NOAO Database Expert System) is NOAO's new observatory database system. Recent improvements include the phase-out of legacy components, such as our previous Access-based effort called ALPS++. New work focuses on post-TAC procedures such as scheduling, collection of observing reports on the mountain, automatic compilation of various statistics, and publication tracking as ways to extend the usefulness of ANDES. The ultimate goal is to provide an environment which allows a comprehensive understanding of the collection, evaluation, scheduling, observing and post-observing (including publications) of proposals and programs. [P5.3] Chandra Data Archive Download and Usage Database Emily Blecksmith, Stephane Paltani, Arnold Rots, Sherry Winkelman CfA/CXC In order to support regular operations, the Chandra Data Archive Operations Group has developed a database that records and monitors the user activities that affect the archive servers. This database provides information on the number of users that are connected at a given time, which archive interfaces they use (we have several), and how much data are being downloaded. The database consists of three tables, populated by a set of three scripts that parse the archive server logs, the ftp logs and the login logs. User activity can be tracked through each of those logs, making information from a given connection easily accessible. With this tool, the Archive Group will be able to gather statistics and monitor trends, which in the future will improve the accessibility of Chandra data. This work is supported by NASA contract NAS 8-39073 (CXC). [P5.4] A Simple Rule Based Query Service for Querying Complex Databases Niall Gaffney Lisa Gardner Molly Brandt Moving towards a Virtual Observatory model for archive services has demonstrated that more generic query services will be needed. For many advanced query services that exist today, one must understand the structure of a data source to query it - often going as far as making the user hand-craft SQL. From the beginning of its Java incarnation, StarView has separated the task of getting qualification information from the user from the formatting of the SQL.
We have recently replaced a very cumbersome LISP program, specific to the Space Telescope database structure, with one based on three simple database tables and a Perl script that builds queries for our database. StarView (or any other program) need only submit the equivalent of the SQL SELECT and the qualifiable portion of the WHERE clauses (with no database-specific join information). The service takes this information and, for our SQL-driven database, adds all the SELECT, FROM, and WHERE clauses needed to form a valid query. With this tool, any program can query the STScI archive database without prior knowledge of how the database tables relate. Only the fields/keywords that are to be queried and returned are specified. Further, as new tables and databases are added, the service can simply be updated with the join rules for the new tables, without any changes to the software system. Our poster will outline how the system works in the general case, using the STScI databases as an example of how a preexisting, complex data source can be simply modeled. We will demonstrate how this system could work for any relational database to be queried by software without knowing its schema, and explore possible functionality for such a generic query service in the context of the rapidly evolving Virtual Observatory.
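The rule-table idea can be sketched in a few lines of Python (the field names, table names, and join rules below are invented stand-ins, not the actual STScI schema): given only the requested fields and a qualification, the service looks up the owning tables and the pairwise join predicates and assembles a complete SQL statement.

    # Hypothetical stand-ins for the rule tables described above.
    FIELD_TABLE = {"target_name": "proposals", "exposure_time": "exposures"}
    JOIN_RULES = {("exposures", "proposals"):
                  "proposals.prop_id = exposures.prop_id"}

    def build_query(fields, qualification):
        """Assemble a complete SQL statement from field names alone."""
        tables = sorted({FIELD_TABLE[f] for f in fields})
        joins = [JOIN_RULES[p] for p in zip(tables, tables[1:])
                 if p in JOIN_RULES]
        return ("SELECT " + ", ".join(fields) +
                " FROM " + ", ".join(tables) +
                " WHERE " + " AND ".join(joins + [qualification]))

    print(build_query(["target_name", "exposure_time"],
                      "exposures.exposure_time > 1000"))

Adding a new table then amounts to adding rows to the two rule tables, with no change to the query-building code, which is the property the abstract emphasizes.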
[P5.5] Performing Dynamic Overlaps Between Astronomical Archives using HTM Gretchen Green, Alex Szalay, Antonio Volpicelli, Karen Levay, Paolo Padovani, George Fekete, Wil O'Mullane A recently developed experimental system provides a method for the dynamic computation of overlapping regions between astronomical archives using the Hierarchical Triangulated Mesh (HTM) spatial index. The test bed for this system is based on the SDSS Science Archive and the HST Pointing Catalog. Region boundaries described with HTM polygons provide a common frame for performing joins between the large-scale SDSS survey and the sparse coverage of HST. We expect to provide Web services for performing these intersections to facilitate the navigation of these multi-instrument observations. These methods demonstrate the value of standard HTM region specifications for the space-time definitions of astronomical datasets that can be used for NVO discovery technology. [P5.6] An AIPS++-based archive system for radio telescopes John Benson, Boyd Waters, and Tim Cornwell (NRAO) The NRAO End-to-End (e2e) project has the goal of providing automated, streamlined handling of radio observations on NRAO telescopes all the way from proposal submission to archive access. As an early part of this project, we have constructed an archive system for NRAO telescopes using mainly tools available in the Astronomical Information Processing System (AIPS++). Since the tools are available to anyone using AIPS++, this amounts to a generic archive capability for any telescope for which an AIPS++ data conversion program exists. The rich tool set available in AIPS++ has enabled very rapid development: our entire effort took no more than about 1 FTE-year. Additional capabilities were required to connect AIPS++ to the web. The system is now being deployed at the NRAO as a prototype archive system for the Very Large Array, with deployment for the Green Bank Telescope and Very Long Baseline Array planned for 2003. [P5.7] The XMM-Newton SSC Database: Taking Advantage of a Full Object Data Model Laurent Michel, Observatoire Astronomique de Strasbourg (FR) Christian Motch, Observatoire Astronomique de Strasbourg (FR) Clive Page, Leicester University (UK) Mike Watson, Leicester University (UK) One of the main responsibilities of the Science Survey Consortium (SSC) of the XMM-Newton satellite is to carry out a systematic analysis of all instrument data. These high-quality products are shipped to the guest observers and eventually enter the XMM-Newton archive. In addition, the SSC compiles a catalogue of X-ray sources and provides identifications for the ~50,000 new sources detected every year. In order to check product quality and support the catalogue and source identification programmes, all SSC-generated products are stored in a database developed for that purpose. The database is a powerful tool to browse and evaluate XMM-Newton data and to perform various kinds of scientific analysis. It provides online data views including relevant links between products and correlated entries extracted from many archival catalogues available at CDS and NED. Because of the large number of transversal links, our data model is difficult to map into relational tables. It has therefore been designed with object-oriented technology for both the user interface and the data repository (OODBMS O2). Besides browsing, the Web-based user interface provides facilities to select data collections with constraints not only on any keywords but also on correlated data patterns. [P5.8] Representations of DEIMOS Data Structures in FITS Steven L. Allen, UCO/Lick Observatory De A. Clarke, UCO/Lick Observatory Robert I. Kibrick, UCO/Lick Observatory DEIMOS (the DEep Imaging Multi-Object Spectrograph) began producing scientific data from the Keck II telescope in 2002 June. The instrument is extremely configurable, and the form of the output data is highly variable. Filters and gratings may be swapped, gratings and mirrors tilt, readout modes and active amplifiers of the 8-CCD mosaic change, and numerous field-specific astrometric slitmasks may be inserted. For archival purposes and to enable fully-automated data reduction, FITS files from DEIMOS document the instrument state, all aspects of the slitmask design, and multiple world coordinate systems for the mosaic images. The FITS files are compatible with existing local conventions for mosaic image display systems and also with incipient FITS WCS standards. [P6.1] CCD meets Geodetic Astronomy: The Digital Zenith Camera, a Geodetic State-of-the-art Instrument for Automatic Geographic Positioning in Real-Time Dipl.-Ing. Christian Hirt, Institut fuer Erdmessung, University of Hanover, Germany The determination of positions on the earth's surface using stars as a celestial reference was the main task of practical astronomy in the past. Whereas the basic principle remained unvaried throughout the centuries, observation techniques changed from purely visual to photographic methods. Today, the availability of CCD sensors leads to completely digital and automatic methods for the astronomical determination of geographic coordinates. Combined with GPS, these methods are applied in geodesy for the determination of vertical deflections and hence of the earth's gravity field. In this presentation, a digital zenith camera developed at the Institut für Erdmessung, University of Hanover, is introduced as a powerful, state-of-the-art geodetic instrument for astronomical position determination.
Using CCD technology for imaging zenithal stars, GPS equipment for high-precision time measurement, and the new, powerful data processing system AURIGA (Automatic Real-time Image Processing System for Geodetic Astronomy), this configuration allows the fully automatic, real-time determination of the geographic coordinates longitude and latitude to an accuracy of at least 0.2 arcseconds. The design and performance of the digital zenith camera are depicted. Main emphasis is placed on data acquisition and data processing using AURIGA. Besides a description of the astrometric algorithms for image data reduction, the applicability of the precise star catalogues Tycho-2, GSC, UCAC and A2.0 as references for geographical position determination is discussed. Besides the high-precision determination of the earth's gravity field, the main application area of the digital zenith camera, future applications such as the monitoring of near-zenith atmospheric effects (systematic refraction and scintillation) are indicated. [P6.10] Chandra Monitoring, Trends, and Response Scott J. Wolk Bradley Spitzbart Takashi Isobe The Chandra X-Ray Observatory was launched in July, 1999 and has yielded extraordinary scientific results. The Monitoring and Trends Analysis (M&TA) system has proven to be a valuable resource. With three years' worth of on-orbit data, we have available a vast array of both telescope diagnostic information and analysis of scientific data with which to assess Observatory performance. The primary goal of M&TA is to provide tools for effective decision making, leading to the most efficient production of quality science output from the Observatory. M&TA analyzes the most crucial parameters in real time through C++ tools linked to Perl scripts that provide e-mail and pager alerts in the case of limit violations or unexpected spacecraft states. In addition, the telemetry stream is formatted for easy access through a web browser or wireless device. More comprehensive monitoring alerts and trending plots are generated on daily and mission-length timelines. The system utilizes input from the raw telemetry stream, telemetry dumps, processed science products and the archival databases. Output is given in the form of e-mail and/or pager alerts when necessary. Long-term monitoring and trending results are presented on the world wide web, and all data are archived in SQL and RDB databases. This variety allows for the easy creation of custom studies showing, for instance, the relation between space weather and co-dependent current rates, or charge transfer inefficiency degradation as a function of radiation fluence. This work is supported by NASA contract NAS8-39073.
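As an illustration of the limit-violation alerting described above (a minimal Python sketch, not the M&TA code; the mnemonic, limit values, and addresses are invented):

    import smtplib
    from email.message import EmailMessage

    LIMITS = {"focal_plane_temp": (-120.0, -90.0)}   # hypothetical limits

    def check_sample(msid, value):
        """Send an e-mail alert when a telemetry sample leaves its limits."""
        lo, hi = LIMITS[msid]
        if not lo <= value <= hi:
            alert = EmailMessage()
            alert["Subject"] = f"LIMIT VIOLATION: {msid} = {value}"
            alert["From"] = "mta@example.org"        # hypothetical addresses
            alert["To"] = "oncall@example.org"
            alert.set_content(f"{msid} is outside [{lo}, {hi}]")
            with smtplib.SMTP("localhost") as smtp:  # assumes a local relay
                smtp.send_message(alert)

    check_sample("focal_plane_temp", -85.0)          # triggers an alert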
[P6.11] The SCUBA2 Data Acquisition System Xiaofeng Gao, UK ATC Dennis Kelly, UK ATC Michael MacIntosh, UK ATC William Duncan, UK ATC Trevor Hodson, UK ATC Damian Audley, UK ATC Wayne Holland, UK ATC SCUBA2 is a second-generation sub-millimeter imager for the JCMT, currently being designed by a consortium led by the UK ATC. We describe the SCUBA2 data acquisition system, which consists of multiplexed digitization controlled by FPGAs connected by optical fiber to a network of PCs running RTLinux. [P6.2] The Wonderful Worlds of the ITE Leslie Zimmerman Foor, Space Telescope Science Institute The Integrated Test Environment (ITE) is a test environment designed by the Software Testing Team (STT) to simulate the HST planning and scheduling system from proposal development through the generation of the Science Mission Specification (SMS). More than just a mirror of the operational environment, the ITE allows users the flexibility of choosing the versions of the software they would like to use. Various configurations can be used simultaneously by multiple users on any number of machines. In fact, one user can have multiple configurations at the same time on one computer. In addition to this testing flexibility, it also gives developers the ability to investigate operational problems by using operational data with operational versions of the software in their own environment, without the risk of interfering with the daily operational work. [P6.3] A Generic Process Coordinator, Developed For The Planck-Surveyor Mission Wolfgang Hovest Frank Dannemann Thomas Riller Matthias Bartelmann Planck is the third medium-sized mission of ESA's Horizon-2000 scientific programme. Starting in 2007, Planck will obtain full-sky maps in nine frequency bands in the microwave regime between 30 and 857 GHz. The primary goal of Planck is to map the Cosmic Microwave Background (CMB) with unprecedented resolution and sensitivity. The accurately measured angular power spectrum of the CMB will allow the precise determination of all relevant cosmological parameters. Planck will also test the inflationary model of the early universe. To reduce the huge amount of data Planck will obtain, and for simulation purposes, a process coordinator (ProC) is being developed. The ProC will allow users to build pipelines from single modules (written in FORTRAN, C, C++, or Java) and execute them automatically in a distributed, heterogeneous network. Besides the main results, all parameters, all provisional results of the individual pipeline modules, and any information about the processing itself will be traced and saved in an object-oriented database. The current state of this Process Coordinator will be detailed. [P6.4] Extending ORAC-DR to Multiple Observatories Brad Cavanagh, Joint Astronomy Centre Paul Hirst, Joint Astronomy Centre Malcolm J. Currie, Joint Astronomy Centre Tim Jenness, Joint Astronomy Centre Frossie Economou, Joint Astronomy Centre Stuart Ryder, Anglo-Australian Observatory Stephen P. Todd, Edinburgh University / UK Astronomy Technology Centre ORAC-DR, a flexible and extensible data reduction pipeline, has been successfully used for on-line reduction of data from UFTI and IRCAM (infrared cameras), CGS4 (near-infrared spectrometer), and MICHELLE (mid-infrared echelle spectrometer), all at UKIRT, and SCUBA (sub-millimetre bolometer array) at JCMT. We have now added the infrared imager and spectrometer IRIS2 at the Anglo-Australian Telescope and the infrared imaging spectrometer UIST at UKIRT to the list of officially supported instruments. We also present initial support for the multi-object spectrograph GMOS, the near-infrared imager NIRI, and MICHELLE at Gemini. This paper briefly describes features of the pipeline, along with details of adapting ORAC-DR to other instruments on telescopes around the world. [P6.5] SIRTF Web Based Tools for QA and Instrument Performance Monitoring Bob Narron, Irene Bregman, John White The SIRTF Science Center is developing two Web-based tools which will be used during operations. One tool is for Quality Analysis: it will allow the analysts to display images and plots of new data and then to record status and comments in the central database. The other tool is for the display of Instrument Performance Monitoring data.
It provides an easy-to-use way for the science staff to create plots and ASCII files of these data. Both tools use Java applets to display images and plots, and Perl for everything else. The standard Perl DBI interface is used to access the database. [P6.6] Monitoring the Chandra X-ray Observatory via the Wireless Internet Bradley D. Spitzbart, Scott J. Wolk The Chandra X-ray Observatory, launched in July 1999, continues to provide unprecedented high-energy astrophysical discoveries with efficiency and reliability. From time to time, though, urgent operational decisions must be made by engineers, instrument teams, and scientists, often on short notice and at odd hours. There are several real-time, mostly Internet-based data resources available to aid in the decision-making discussions when a crisis arises. In addition, Chandra's Science Operations Team has been experimenting with emerging Wireless Application Protocol (WAP) technologies to create yet another pathway for data flow. Our WAP Internet pages provide anytime, anywhere access to critical spacecraft information through cellular phones or other WAP-enabled devices. We currently offer several dynamic web pages, including a live telemetry stream, information on the radiation environment, the real-time contact schedule, the week's observing schedule, and contact information for key personnel. The protocol even allows CGI or other server-side executable code, which is implemented here to facilitate users' queries for past data sets. There are, of course, many challenges in attempting to present useful, meaningful content on a 5 x 12 character screen over limited bandwidth in a way that is user-friendly and beneficial. This paper will discuss our experience with this developing, promising new medium, design strategies, and future enhancements. This work is supported by NASA contract NAS8-39073. [P6.7] A Java-based Calibrator Search Tool for Radio Astronomy Honglin Ye (NRAO), John Benson (NRAO) Astronomers using radio synthesis arrays often need to search for calibrator sources to be used during observations. The desired properties of the source depend upon the context: for example, sometimes a bright, somewhat resolved source close to the target is preferable to a more compact source further away. The desired brightness of the calibrator depends upon things like the array configuration and the atmospheric coherence time. For these reasons, human interaction is often the easiest and best way to select calibrators. With this in mind, we have designed a Java-based tool for selecting calibrators based upon information gathered by NRAO scientists from various information sources. This tool is based around a display of the target region of the sky, with known calibrator sources displayed accompanied by user-selectable annotation as to brightness and positional accuracy. Further information about selected sources is given in tabular and graphical forms. The light curve, spectral behavior, resolution curve, and a typical image can all be shown if desired. In connection with the development of this tool, we have merged several information sources into one database. This enables the tool to be used for both VLA and VLBA observations. The tool is currently being tested by NRAO scientists and will be made available to NRAO users later this year.
The development of the tool is part of the NRAO End-to-End (e2e) project, which has the goal of providing automated, streamlined handling of radio observations on NRAO telescopes all the way from proposal submission to archive access. Further development of the calibrator search tool will focus on integration into the emerging overall toolset of the e2e project. Thus, for example, the calibrator search tool will ultimately be integrated with the tool for preparing an observation script. [P6.8] COSMOS-3: The 3rd Generation Telescope Control Software System of Nobeyama Radio Observatory Koh-Ichiro Morita, National Astronomical Observatory Japan (NAOJ) Naomasa Nakai, NAOJ Masatoshi Ohishi, NAOJ Toshikazu Takahashi, NAOJ Kazuhiko Miyazawa, NAOJ Takashi Tsutsumi, CfA Shigehisa Takakuwa, ASIAA Hiroyuki Ohta, Fujitsu Kiyohiko Yanagisawa, FNS The Nobeyama 45 m telescope and the Nobeyama Millimeter Array at Nobeyama Radio Observatory have been operated since 1982. The control system for these telescopes has evolved from a centralized architecture (COSMOS-1), based on an IBM-compatible mainframe, to the current hierarchical distributed system (COSMOS-3), running on a distributed environment of workstations and PCs. COSMOS-3 is functionally divided into three levels. Tools at the top level provide user interfaces for various observing requirements. At the middle level there are Supervisor, Merger, and Qlook: Supervisor and Merger control the message/data flow between the upper and bottom levels, while Qlook shows current observing results. At the bottom level, there are many Local Controllers that communicate with the individual devices. The important design concepts of COSMOS-3 are: communication interfaces between different levels should be as simple as possible; there are no direct connections between different Local Controllers; and the system provides a wrapping mechanism in the Local Controllers for control programs written by non-expert software engineers. Because of these concepts, it is very easy and quick to add new functions or new devices to the system. [P6.9] Remote Observing on the Keck Telescopes Patrick L. Shopbell (Caltech) Robert Kibrick (UCO/Lick Observatory) We present a summary of ongoing efforts to use the Keck telescopes remotely from the U.S. mainland. This work has been spearheaded by one of us (RK) at UC Santa Cruz, but is now expanding to include remote sites at Caltech and UC San Diego. Additional sites are planned for the future. In this paper we describe the remote observing architecture, including network reliability issues, data replication methods, and interface aspects such as videoconferencing. We provide quantitative and qualitative analyses of the Caltech remote observing runs thus far, from which we derive a number of lessons and suggestions for improving remote observing with large telescopes. [P7.1] Data Calibration Pipeline for the Far Ultraviolet Spectroscopic Explorer William V. Dixon, Johns Hopkins David J. Sahnow, Johns Hopkins The FUSE SDP Group CalFUSE is the calibration software pipeline used at the Johns Hopkins University to process data from FUSE, the Far Ultraviolet Spectroscopic Explorer. The pipeline corrects for a variety of instrumental effects, extracts target spectra, and applies the appropriate wavelength and flux calibrations. The software is written in C and runs under the Solaris, DEC Alpha, and Linux operating systems.
In this poster, we present recent improvements to the pipeline, including a new module to correct for the effects of spacecraft motion during an observation, and announce the availability of calibrated spectral files, created using CalFUSE v2.1.0, from MAST, the Multimission Archive at STScI. [P7.10] The COBRA/CARMA Correlator Data Processing System Steve Scott, Caltech/OVRO Rick Hobbs, Caltech/OVRO Andy Beard, Caltech/OVRO Paul Daniel, Caltech/OVRO Colby Kraybill, University of California Berkeley Mel Wright, University of California Berkeley Erik Leitch, University of Chicago David Mehringer, University of Illinois Ray Plante, University of Illinois N. S. Amarnath, University of Maryland Marc Pound, University of Maryland Kevin Rauch, University of Maryland Peter Teuben, University of Maryland The COBRA (Caltech Owens Valley Broadband Reprogrammable Array) correlator is an FPGA-based spectrometer with 16 MHz resolution and 4 GHz total bandwidth that will be commissioned on the Caltech Millimeter Array in September 2002. The processing system described here includes the collection of correlation function and total power data from the underlying software systems, followed by the synchronization and processing that are done to produce calibrated visibilities. The processing steps include passband gain correction, system temperature and flux scaling, blanking and flagging, atmospheric delay correction, and apodization and decimation. CORBA is used to move data from the 5 hardware-based computers to the pipeline computer. Within a computer, computational steps are implemented as separate processes using shared memory for communication. At any step along the pipeline, the data may be graphically inspected remotely using a CORBA-based tool, the CARMA Data Viewer. This same architecture will be applied to the CARMA wideband correlator (8 stations, 8 GHz bandwidth) and the CARMA spectral correlator (15 stations, 4 GHz bandwidth), scheduled for 2003 and 2004. See the accompanying posters by Plante et al. and Pound et al.
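Two of these processing steps are simple enough to sketch. The following Python toy (illustrative only, not CARMA code; all numbers are invented) shows system-temperature/flux scaling of a raw correlation coefficient, and Hanning apodization of a lag-domain correlation function followed by decimation of the resulting spectrum:

    import numpy as np

    def tsys_scale(vis, tsys_i, tsys_j, jy_per_k):
        """Scale a raw correlation coefficient to flux density (Jy) with
        the geometric mean of the two antennas' system temperatures."""
        return vis * np.sqrt(tsys_i * tsys_j) * jy_per_k

    def apodize_and_decimate(lags, step=2):
        """Taper the lag-domain correlation function, transform to a
        spectrum, and decimate the channels."""
        spectrum = np.fft.rfft(lags * np.hanning(len(lags)))
        return spectrum[::step]

    vis = tsys_scale(0.01 + 0.002j, 80.0, 95.0, 35.0)   # invented numbers
    spec = apodize_and_decimate(np.random.normal(size=1024))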
[P7.11] CARMA Data Storage, Archiving, Pipeline Processing, and the Quest for a Data Format Ray Plante, NCSA/University of Illinois Marc Pound, University of Maryland David Mehringer, NCSA/University of Illinois Steve Scott, Caltech/OVRO Andy Beard, Caltech/OVRO Paul Daniel, Caltech/OVRO Rick Hobbs, Caltech/OVRO Colby Kraybill, University of California Berkeley Mel Wright, University of California Berkeley Erik Leitch, University of Chicago N. S. Amarnath, University of Maryland Kevin Rauch, University of Maryland Peter Teuben, University of Maryland In 2005, the BIMA and OVRO mm-wave interferometers will be merged into a new array, the Combined Array for Research in Millimeter-wave Astronomy (CARMA). Each existing array has its own visibility data format, storage facility, and tradition of data analysis software. The choice for CARMA was to use one of a number of existing formats or to devise a format that combines the best of each. Furthermore, it had to address three important considerations. First, the CARMA data format must satisfy the sometimes orthogonal needs of both astronomers and engineers. Second, forcing all users to adopt a single off-line reduction package is not practical; thus, multiple end-user formats are necessary. Finally, CARMA is on a strict schedule to first light; thus, any solution must meet the restrictions of an accelerated software development cycle and take advantage of code reuse as much as possible. We describe our solution, in which the pipelined data pass through two forms: a low-level, database-based format oriented toward engineers and a high-level, dataset-based form oriented toward scientists. The BIMA Data Archive at NCSA has been operating in production mode for a decade and will be reused for CARMA with enhanced search capabilities. The integrated BIMA Image Pipeline developed at NCSA will be used to produce calibrated visibility data and images for end-users. We describe the data flow from the CARMA telescope correlator to delivery to astronomers over the web, and show current examples of pipeline-processed images of BIMA observations. [P7.12] CARMA Software Development Marc Pound, University of Maryland N. S. Amarnath, University of Maryland Kevin Rauch, University of Maryland Peter Teuben, University of Maryland Colby Kraybill, University of California Berkeley Mel Wright, University of California Berkeley Andy Beard, Caltech/OVRO Paul Daniel, Caltech/OVRO Rick Hobbs, Caltech/OVRO Steve Scott, Caltech/OVRO Erik Leitch, University of Chicago David Mehringer, University of Illinois Ray Plante, University of Illinois Combining the existing BIMA and OVRO mm interferometers and adding a new third sub-array into the combined CARMA mm interferometer will bring new challenges not only in hardware but also in software. Both arrays have their own mature operations software, developed over the last decade. For CARMA, the situation is not as simple as choosing one over the other. It is further complicated by the fact that the software developers are dispersed among 5 institutions and 4 time zones. Such multi-institution development requires frequent communication, local oversight, and reliable code management tools. The timeline has forced us to carefully balance the use of existing software, with wrappers bound to a new, more object-oriented approach, against rewriting from scratch. New hardware, such as the correlator, has already resulted in new software, but we also anticipate re-using a fair fraction of the existing telescope software. This poster will summarize our ideas on how we plan to do this, as well as outline what we call the CARMA Software Toolkit and the associated software engineering aspects. (See also the accompanying posters by Scott et al. and Plante et al.) [P7.13] Refactoring DIRT Amarnath, N. S., Pound, M. W., & Wolfire, M. G. (University of Maryland) The Dust InfraRed ToolBox (DIRT - a part of the Web Infrared ToolShed, or WITS, located at http://dustem.astro.umd.edu) is a Java applet for modeling astrophysical processes in circumstellar shells around young and evolved stars. DIRT has been used by the astrophysical community for the past 4 years. DIRT uses results from a number of numerical models of astrophysical processes, and has an AWT-based user interface. DIRT has been refactored to decouple data representation from plotting and curve fitting. This makes it easier to a) add new kinds of astrophysical models, b) use the plotter in other applications, c) migrate the user interface to Swing components, and d) modify the user interface to add functionality (for example, SIRTF tools). DIRT is now an extension of two generic libraries, one of which manages data representation and caching, and the other of which manages plotting and curve fitting. This project is an example of refactoring with no impact on the user interface, so the existing user community was not affected.
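The separation DIRT arrived at can be illustrated in a few lines. Here is a sketch in Python rather than DIRT's Java, with invented class names: the data layer caches model results behind a narrow interface, and the presentation layer depends only on that interface.

    class ModelGrid:
        """Data-representation layer: caches model curves by parameter set."""
        def __init__(self, loader):
            self._loader, self._cache = loader, {}

        def curve(self, params):
            if params not in self._cache:        # load and cache on demand
                self._cache[params] = self._loader(params)
            return self._cache[params]

    class Plotter:
        """Presentation layer: knows nothing about where curves come from."""
        def plot(self, xy):
            for x, y in xy:
                print(f"{x:8.3f} {y:10.4g}")     # stand-in for real plotting

    grid = ModelGrid(lambda p: [(x, p[0] * x) for x in range(5)])
    Plotter().plot(grid.curve((2.0,)))

Because the plotter sees only (x, y) sequences, either side can be replaced (new model families, a different GUI toolkit) without touching the other, which is the point of the refactoring described above.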
[P7.14] COS Calibration Process Stephane Beland, Steven Penton, Erik Wilkinson COS has two distinct ultraviolet channels covering the spectral range from 1150 Å to 3200 Å. The NUV channel covers the range from 1700 Å to 3200 Å and uses the Hubble Space Telescope's STIS spare MAMA. The FUV channel uses a microchannel plate detector with a cross-delay line readout system to cover the range from 1150 Å to 1900 Å. Due to the analog nature of the readout electronics of the FUV detector, this system is sensitive to temperature variations and has non-uniform pixel sizes across its sensitive area. We present a step-by-step description of the calibration process required to transform raw data from COS into fully corrected and calibrated spectra ready for scientific analysis. Simulated raw COS data are used to demonstrate the calibration process. [P7.15] Systems Integration Testing of OPUS and the New DADS Lisa E. Sherbert and Lauretta Nagel The Data Archive and Distribution System (DADS) will soon be entering the IDR (Ingest Distribution Redesign) era, and more major functions will be shifting from the VMS platforms to various Unix platforms. As the first phase, Distribution, is delivered to testing, interfaces with OPUS and OTFR (On The Fly Reprocessing) will change. We will give an overview of the OPUS/DADS/OTFR supersystem, circa Fall 2002, and identify interface changes that will impact the operators and archive users. [P7.16] Supporting The Observatory Mission-Critical Data Flow Benoit Pirenne, European Southern Observatory ESO's model for operating the VLT (and several other telescopes) was developed following the space-mission model: a well-defined, regularly spaced and repeating set of cycles comprising phase-I and phase-II community proposal submission steps, scheduling (long- and mid-term), observing, archiving, quality control and finally distribution of the observations to the PIs. Almost all of the above steps take place at the headquarters of the Observatory, for cost and logistics reasons. This modus operandi is the most logical one for a "service mode"-oriented observatory. In this contribution, three years' experience of VLT operations is presented, with particular emphasis on how the headquarters operations management support structure developed and stabilized. A number of metrics to assess the performance of the support operation are provided. [P7.2] The Next Step for the FUSE Calibration Pipeline David J. Sahnow, Johns Hopkins University William V. Dixon, Johns Hopkins University The FUSE SDP Group, Johns Hopkins University The calibration pipeline for the Far Ultraviolet Spectroscopic Explorer (FUSE) was designed years before the satellite was launched in June of 1999. After launch, a number of unexpected instrumental features were discovered; as the FUSE team dealt with each of them, the pipeline was modified appropriately. Eventually, these changes made the design so cumbersome that the pipeline became difficult to maintain. In 2002, we began to develop a new pipeline concept that takes into account the actual instrument characteristics. We will present our plans for this improved calibration pipeline and describe the progress we have made toward that goal. In addition, we will discuss the lessons learned while modifying the original design.
[P7.3] Building a Middle Tier for the CXC Data Archive Alexandra Patz, Peter Harbo, John Moran, David Van Stone, Panagoula Zografou The Chandra Data Archive at the Chandra X-ray Center is developing a middle tier that can be utilized by both the current J2EE web application (WebChaSeR) and the Java Swing application (ChaSeR) to provide a uniform interface to the archive. This middle tier consists of a collection of independent services, from authenticating users to returning data such as an observation image or a proposal abstract. The services are accessible through an HTTP interface, allowing ChaSeR, WebChaSeR or any other HTTP client to access them. The services run on an application server and are implemented in Java using Apache's open-source Struts framework. Having a central interface to the archive, shared by all client applications, will allow for code reusability and easier maintenance. This poster discusses the design of the middle tier. This project is supported by the Chandra X-ray Center under NASA contract NAS8-39073. [P7.4] ClassX: A VOTABLE-Enabled X-ray Correlation and Classification Pipeline Eric Winter (a,e), Thomas McGlynn (a,d), Anatoly Suchkov (b), Robert Hanisch (b), Lorella Angelini (a,d), Michael F. Corcoran (a,d), Sebastien Derriere (c), Megan Donahue (b), Stephen Drake (a,d), Pierre Fernique (c), Francoise Genova (c), Francois Ochsenbein (c), William Pence (a), Marc Postman (b), Nicholas White (a), Richard White (b) (a) High Energy Astrophysics Science Archive Research Center, NASA Goddard Space Flight Center, Greenbelt, MD 20771 (b) Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD, 21218 (c) Centre de Donnees astronomiques de Strasbourg, Observatoire de Strasbourg, 67000 Strasbourg, France (d) Universities Space Research Association, 7501 Forbes Boulevard, Seabrook, MD, 20706 (e) Science Systems and Applications, Inc., 10210 Greenbelt Road, Suite 600, Lanham, Maryland, 20706 The ClassX project aims to provide a Web service to classify unclassified astronomical X-ray sources. This objective requires collecting and assimilating data from a wide variety of sources. These data sources differ in both syntax and semantics, and therefore must be translated to a common format to be useful in the classification process. The ClassX pipeline addresses this problem by using the VOTABLE XML DTD (http://us-vo.org/xml/VOTable.dtd) to store and manipulate data from multiple remote sources. An extensive Perl API for the VOTABLE format was developed during the project, and has been released for use by the NVO community. The lessons learned during the development of the ClassX pipeline provide significant experience in identifying and addressing similar problems that will be encountered during the development of the National Virtual Observatory.
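ClassX's own VOTABLE layer is a Perl API; purely for illustration of the common-format idea, the same documents can be consumed in a few lines of Python with astropy's VOTable parser (the file name below is a hypothetical stand-in):

    from astropy.io.votable import parse

    # Parse a VOTABLE document and extract the first TABLE element
    # as a numpy-backed table.
    votable = parse("xray_sources.xml")
    table = votable.get_first_table().to_table()

    # Column names and units are uniform regardless of which remote
    # archive produced the original data.
    print(table.colnames)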
[P7.5] The Automated Data Processing Pipelines for SIRTF IRS Fan Fang, Clare Waterson, Jing Li, Bob Narron, Iffat Khan, Wen P. Lee, John Fowler, Russ Laher, Mehrdad Moshir We present the design, structure, and implementation of the automated data processing pipelines for the Infrared Spectrograph (IRS) onboard the Space Infrared Telescope Facility (SIRTF). This includes the science data reduction pipelines that generate Basic Calibrated Data (BCD) and enhanced science (Post-BCD) products, and the calibration pipelines that generate the calibration data needed to reduce the science data. [P7.6] Self-calibration for the SIRTF GOODS Legacy Project David Grumm, STScI Stefano Casertano, STScI Data analysis for the SIRTF GOODS Legacy Project must be able to achieve a level of calibration noise well below a part in 10,000. To achieve such a high level of fidelity, a form of self-calibration is required, in which the sky intensity and the instrumental effects are derived simultaneously. Two methods being investigated are a least-squares approach based on the work of Fixsen and Arendt at GSFC, and an iterative method. Both methods have been applied to derive the sky, flat field, and offset from simulated data for instruments to be flown on SIRTF; the results will be discussed.
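A minimal 1-D sketch of the iterative flavour (a toy, not the GOODS code): assuming the simple model frame = flat x shifted sky, the code alternates between the sky and flat estimates, fixing the multiplicative degeneracy by normalizing the flat.

    import numpy as np

    rng = np.random.default_rng(1)
    npix, nframes = 200, 8
    sky_true = 1.0 + 0.1 * rng.random(npix + nframes)   # true sky strip
    flat_true = 1.0 + 0.05 * rng.random(npix)           # true pixel response
    shifts = np.arange(nframes)                         # integer dithers
    frames = np.array([flat_true * sky_true[s:s + npix] for s in shifts])

    flat = np.ones(npix)
    for _ in range(20):                                 # alternate estimates
        sky = np.zeros(npix + nframes)
        hits = np.zeros(npix + nframes)
        for k, s in enumerate(shifts):
            sky[s:s + npix] += frames[k] / flat         # sky given the flat
            hits[s:s + npix] += 1.0
        sky /= hits
        flat = np.mean([frames[k] / sky[s:s + npix]
                        for k, s in enumerate(shifts)], axis=0)
        flat /= flat.mean()                             # fix overall scale

    print(np.abs(flat - flat_true / flat_true.mean()).max())

The dithered overlaps are what break the degeneracy between the sky and the pixel response; with no dithering, the two could not be separated.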
[P7.7] Calibration of COS data at STScI Philip Hodge, Space Telescope Science Institute This paper describes the program for pipeline calibration of Cosmic Origins Spectrograph (COS) data at the Space Telescope Science Institute. CALCOS is written in Python. Image and table data are read from and written to FITS files using PyFITS, and the data arrays are manipulated using the numarray module, with C extension code for special cases not covered by numarray. [P7.8] An Automatic Image Registration and Coaddition Pipeline for the Advanced Camera for Surveys John P. Blakeslee Kenneth Anderson Daniel Magee Gerhardt R. Meurer We have written an automatic image processing pipeline for the Advanced Camera for Surveys (ACS) Guaranteed Time Observation program. The pipeline supports the different cameras available on the ACS instrument and is written in the Python programming language using a flexible, object-oriented design that simplifies the incorporation of new pipeline modules. It also makes use of the PyFITS and PyRAF packages distributed by STScI, as well as other external software. The processing steps include empirical determination of image offsets and rotation, cosmic ray rejection, and image combination using the drizzle software, as well as the production of object catalogs and XML markup for ingestion into the ACS Team archive. [P7.9] OPUS: A CORBA Pipeline for Java, Python, and Perl Applications Walter Warren Miller III, AURA James F. Rose, CSC/STScI Michael S. Swam, AURA/STScI Christine Heller-Boyer, AURA/STScI John Schultz, AURA/STScI With the introduction of the OPUS CORBA mode, a limited subset of OPUS Applications Programming Interface (OAPI) functionality was cast into CORBA IDL so that both OPUS applications and the Java-based OPUS pipeline managers were able to use the same CORBA infrastructure to access information on blackboards. The primary motivation for doing so was to improve scalability, but moving to a distributed object architecture also freed the managers from running strictly on a supported platform with access to a common file system. It also reduced the amount of duplicate code that otherwise would be required in a multi-programming-language environment. Exposing even more of the OAPI through CORBA interfaces would benefit OPUS applications in similar ways. Those applications not developed in C++ could use CORBA to interact with OPUS facilities directly, provided that a CORBA binding exists for the programming language of choice. Other applications might benefit from running 'outside' of the traditional file system-based OPUS environment like the Java managers and, in particular, on platforms not supported by OPUS. The enhancements to OPUS discussed in this paper will illustrate how this generality was achieved, and present two examples of how to construct OPUS internal pollers in Java and Python. [P8.1] Efficient Distribution of Computational Load on a Beowulf-Like Cluster Luca Fini, Marcell Carbillet, Osservatorio Astrofisico di Arcetri, Firenze, Italia The CAOS Application Builder is a Graphical Programming Environment which allows the building of complex simulation applications for Adaptive Optics systems by putting together elementary blocks [1,2,3]. The resulting simulation programs are often computationally very demanding and could profitably be run on Beowulf-like clusters, provided the computational load can be efficiently distributed across the CPUs. In the paper we describe a project to provide the CAOS Application Builder with software tools which allow the user to optimize the distribution of blocks on a multi-CPU machine, and show a few preliminary results. [1] L. Fini et al., in ADASS X, ASP Conference Series, Vol. 238, 2001, F. R. Harnden Jr., F. A. Primini, and H. E. Payne, eds., pp. 253-256. [2] M. Carbillet et al., in ADASS X, ASP Conference Series, Vol. 238, 2001, F. R. Harnden Jr., F. A. Primini, and H. E. Payne, eds., pp. 349-352. [3] S. Correia et al., in ADASS X, ASP Conference Series, Vol. 238, 2001, F. R. Harnden Jr., F. A. Primini, and H. E. Payne, eds., pp. 404-407. [P8.3] SIRTF Mosaicker David Makovoz, Iffat Khan We present a software system for image coaddition/mosaicking that is being developed for the SIRTF mission. The SIRTF mosaicker features the use of the drizzle interpolation technique, robust outlier detection based on spatial and temporal filtering, and fast direct plane-to-plane coordinate transformation. It is designed to interface with other tools developed at the SSC, such as pointing refinement and overlap consistency, which will improve the quality of the mosaic images.
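The temporal-filtering idea can be illustrated with a minimal Python sketch (not the SIRTF mosaicker itself): iterative sigma-clipping along the frame axis of a registered stack rejects cosmic-ray-like outliers before the co-added mean is formed.

    import numpy as np

    def clipped_coadd(stack, nsigma=3.0, niter=3):
        """Co-add registered frames (nframes, ny, nx), rejecting temporal
        outliers such as cosmic rays by iterative sigma clipping."""
        data = np.ma.masked_invalid(stack.astype(float))
        for _ in range(niter):
            mean, std = data.mean(axis=0), data.std(axis=0)
            data = np.ma.masked_where(np.abs(data - mean) > nsigma * std,
                                      data)
        return data.mean(axis=0).filled(np.nan)

    frames = np.random.normal(100.0, 5.0, (10, 64, 64))
    frames[3, 10, 10] += 500.0                 # a cosmic-ray hit in one frame
    print(clipped_coadd(frames)[10, 10])       # the hit is rejected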
[P8.4] MacOSX for Astronomy F. Pierfederici, N. Pirzkal, R. Hook MacOSX is the new version of the Unix-based Macintosh operating system. It features a sleek, high-performance Display PDF user interface sitting on top of a standard BSD UNIX subsystem. Consequently, this OS empowers users with a broad range of applications previously unavailable on a single system, such as Microsoft Office and Adobe Photoshop, as well as legacy X11-based scientific tools and packages (IRAF, SuperMongo, MIDAS, and Skycat). This combination of a modern GUI layered on top of a familiar UNIX environment paves the way for the development of new, more flexible and powerful astronomical tools while assuring compatibility with existing, older programs. In this paper, we outline the strengths of the MacOSX platform in a scientific environment, Astronomy in particular, and point to the numerous astronomical software packages available for this platform, most notably the SciSoft collection. [P8.5] The Fasti Project C. Baffa, INAF - Osservatorio di Arcetri V. Biliotti, INAF - Osservatorio di Arcetri A. Checcucci, INAF - Osservatorio di Arcetri V. Gavrioussev, IRA - CNR S. Gennari, INAF - Osservatorio di Arcetri E. Giani, INAF - Osservatorio di Arcetri F. Lisi, INAF - Osservatorio di Arcetri G. Marcucci, Firenze University M. Sozzi, IRA - CNR Fasti is a controller architecture originally developed for fast infrared astronomical array detectors, and intended to be powerful and extendible. It is suitable for use with both DRO and CCD detectors, and it is also well suited to very fast optical detectors, such as those used in Adaptive Optics. In the framework of the LBT project, an L$^3$CCD version is in development. [P8.6] The USNO-B Catalog David Monet Stephen Levine, USNO Flagstaff The USNO-B catalog presents positions, proper motions, magnitudes in various optical passbands, and star/galaxy estimators for 1,036,366,767 objects derived from 3,633,655,848 separate observations. The data were obtained from scans of 7,435 Schmidt plates taken for the various sky surveys during the last 50 years. A brief discussion of various aspects of the catalog will be presented; the actual data are available from www.nofs.navy.mil and other sites after September 2002. [P8.7] Infrared-Array-Camera Images from the Space Infrared Telescope Facility Russ Laher Jason Surace Heidi Brandenburg Mehrdad Moshir The Infrared Array Camera (IRAC), one of the science instruments on NASA's soon-to-be-launched Space Infrared Telescope Facility (SIRTF), has four simultaneous-imaging, focal-plane-array detectors with optical filters covering different infrared spectral regions: 3.6, 4.5, 5.8 and 8.0 microns. IRAC digital images of the celestial sky will be computer-processed in several stages of a production pipeline for instrument-artifact removal and scaling to absolute intensity units. Ancillary uncertainty and pixel-condition-flag images will also be generated for each processed IRAC image. Special processing of calibration data will be done prior to applying it in the production processing. Telescope-pointing data will be separately processed and used to assign a sky position and orientation to each image. Time-sequences of images will be processed to create pixel maps of unwanted latent-image artifacts. Images overlapping the same sky region will be co-added to mitigate noise, put together to form sky mosaics much larger than IRAC's footprint on the sky, and further processed to yield point-source intensity information on the celestial objects that are imaged. Pixel maps of outliers in the image data will also be generated. Reduced data from IRAC's four infrared arrays will be merged to facilitate scientific analysis. The data processing will be done at the SIRTF Science Center, first in real time and then in subsequent processing episodes to further refine the data products. Selected versions of the data products will be archived and made accessible to astronomers worldwide.