PH512 Lecture 2: Grids
Professor Michael Smith

PH512 MULTIMEDIA ASTRONOMY
School of Physical Sciences, University of Kent
Convenor: Prof. Michael Smith
Taught in: Spring Term
Credits: 15 at Level H (ECTS credits: 7.5)

Grids & Virtual Observatories (VOs):

Terabytes. Astronomy faces a data avalanche. Breakthroughs in telescope, detector and computer technology allow astronomical surveys to produce terabytes of images and catalogues (one terabyte comprises 1000 gigabytes). These datasets will cover the sky in different wavebands, from gamma- and X-rays, through optical and infrared, to radio. In a few years it will be easier to "dial up" a part of the sky than to wait many months to access a telescope.

AstroGrid
http://www2.astrogrid.org/
AstroGrid allows remote processing within the VO infrastructure, which facilitates data access across archives and integrates a large number of analysis tools. It is a UK government-funded, open-source project designed to create a working Virtual Observatory for UK and international astronomers. By working closely with other VO projects worldwide through the International Virtual Observatory Alliance (IVOA), internationally recognised interface standards are emerging which promote true scientific interoperability of astronomical data and processing resources worldwide.

The US National Virtual Observatory
http://www.us-vo.org/

The European Virtual Observatory (EURO-VO)
http://www.euro-vo.org/pub/fc/software.html

Pipelines & Reduction Packages:

Astronomical data analysis systems include IRAF, MIDAS, Starlink and AIPS.
Image analysis tools include IDL, GIMP, Photoshop, MATLAB, ...

When the first major data analysis systems for optical astronomy were developed in the 1980s they were a huge success. For the first time a large body of applications was available which worked together. The environment imposed a standard data format (MIDAS BDFs, for example), a useful scripting language (the IRAF CL, perhaps), plenty of useful graphics and a standardised application interface. Many people adopted one system and worked "within" it for much of their processing. The overall effect on data-handling efficiency was strongly positive.

Although much reduction and analysis is still done this way, the situation has evolved dramatically. The standardisation which was once imposed by the environment now comes from outside: FITS files are universal for data on disk. The Spitzer mission has not adopted a standard environment; it has a powerful pipeline system and a small number of specific tools for post-pipeline analysis (MOPEX, for example, a flexible mosaicking system, and SPICE, a spectral tool working on pipeline products).

Pipelined Data:

Indeed, most major new astronomical facilities now put much more effort into delivering high-quality data products from pipeline processing than into the exploitation or development of software environments. ESO successfully went this way, as did Hubble and, more recently, Spitzer. Although this approach does not normally improve on "hand-crafted" data products, it is catching up: for example, Hubble will soon offer combined dithered imaging data as a pipeline product. On-the-fly calibrated Hubble data is now taken as the best available for many uses. Going beyond standard pipeline processing, many valuable data sets are now processed to create optimised "high-level data products" which are ready for immediate science.
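Pipeline and high-level data products are normally delivered as FITS files, the universal on-disk format noted above. As a minimal illustration of what such a product looks like to the user, the short Python sketch below opens a FITS image and inspects its header and pixel data. It assumes the astropy package is available; the file name "product.fits" is a hypothetical placeholder for any pipeline or archive product.

# Minimal sketch: inspecting a pipeline-produced FITS file with astropy.
# "product.fits" is a hypothetical placeholder file name.
from astropy.io import fits

with fits.open("product.fits") as hdul:
    hdul.info()                       # list the HDUs (extensions) in the file
    header = hdul[0].header           # primary header: keywords, WCS, history
    data = hdul[0].data               # image pixels as a NumPy array (or None)
    print(header.get("TELESCOP"), None if data is None else data.shape)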
The development of more and more effective pipelines means that one application of the original data analysis environments, data reduction, is now largely redundant. The pipelines can use whatever software architecture suits the observatory delivering the products, without affecting the utility of the (standardised) data subsequently analysed by external users. More and more data is taken from an archive in processed form rather than processed by astronomers themselves. High-level data products, such as the Hubble Ultra Deep Field images, are of a much higher standard than most astronomers could produce for themselves and are available for dramatically less effort.

The move towards the VO continues this trend and is generating more universal standards for data exchange (the VOTable format) and remote data access (the Simple Image Access Protocol, SIAP). The shift in front-end processing from standard analysis systems to pipelines and high-level data products is complemented by a major change at the back end: the inspection, measurement and analysis of final data products. Here the Virtual Observatory initiatives are creating a small but rapidly evolving set of powerful tools which allow scientific analysis either online from remote datasets, or by running tools on remote servers. In future such tools, similar to and developed from current pilot examples such as Aladin and the AVO Prototype, will gradually take on the role of many of the important analysis tools.

Aladin
An interactive software sky atlas allowing the user to visualise digitised images of any part of the sky and to superimpose entries from astronomical catalogues.
http://aladin.u-strasbg.fr/aladin.gml

SkyView
SkyView is a Virtual Observatory on the net, generating images of any part of the sky at wavelengths in all regimes, from radio to gamma-ray.
http://skyview.gsfc.nasa.gov/

Subaru
Subaru Image Processor: Makali`i is software for analysing FITS images (data) obtained by research observations, including those from the Subaru Telescope. Anyone may use Makali`i freely for non-commercial educational, outreach or research purposes.
http://makalii.mtk.nao.ac.jp/index.html.en

* Environment of operation:
  o Windows OS (Windows 98/2000/Me/XP)
* The main functions of Makali`i (two of these are sketched in Python after this list):
  o Aperture photometry (measuring the brightness of stars)
  o Astrometry (measuring the positions of stars)
  o Graph (making a location-count graph)
  o Arithmetic operations on images (addition, subtraction, multiplication, division, ...)
  o Blink (displaying multiple images alternately)
  o etc.
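As a rough illustration of two of the functions just listed, image arithmetic and aperture photometry, the sketch below uses the NumPy, astropy and photutils Python packages rather than Makali`i itself. The file names, star positions and aperture radius are hypothetical placeholders chosen for the example only.

# Hedged sketch of image arithmetic and aperture photometry in Python,
# as an alternative to the equivalent Makali`i menu functions.
# Assumes numpy, astropy and photutils are installed; file names,
# positions and the aperture radius are hypothetical placeholders.
import numpy as np
from astropy.io import fits
from photutils.aperture import CircularAperture, aperture_photometry

science = fits.getdata("science.fits").astype(float)
dark = fits.getdata("dark.fits").astype(float)

# Arithmetic operation on images: subtract the dark frame pixel by pixel.
reduced = science - dark

# Aperture photometry: sum the counts in a circular aperture around each star.
positions = [(120.3, 87.6), (250.1, 301.4)]        # (x, y) pixel coordinates
apertures = CircularAperture(positions, r=5.0)      # aperture radius in pixels
phot_table = aperture_photometry(reduced, apertures)
print(phot_table["aperture_sum"])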
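SkyView, described above, can also be driven programmatically rather than through its web page. The sketch below shows one possible way to fetch a cutout, assuming the astroquery Python package is installed; the target (M31), the survey name ("DSS") and the output file name are example inputs only.

# Hedged sketch: retrieving a SkyView cutout programmatically.
# Assumes the astroquery package; target, survey and file name are examples.
from astroquery.skyview import SkyView

images = SkyView.get_images(position="M31", survey=["DSS"])  # list of FITS HDULists
images[0].writeto("m31_dss.fits", overwrite=True)            # save the cutout to disk
print(images[0][0].data.shape)                               # pixel dimensions of the image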