NGAO discussion: Science Operations
NGAO Meeting #4
D. Le Mignant, 22 Jan. 2007

Outline
1. Words of introduction on the topic
2. Lessons learned
   1. Weather impact
   2. Efficiency, weather removed
   3. The good things here at Keck
   4. The good things... elsewhere
3. Top-level goals for science operations
   1. Science return
   2. Facility-class
   3. Long-term operations
4. Observing efficiency (and uptime)
   1. Proposed definitions
   2. A spiritual agreement
5. Observing models and science data products
   1. Instrument calibrations, maintenance and performance monitoring
   2. Automation and flexibility during the observations
   3. Facility science calibrations
   4. Planning for science beyond our reach
6. Conclusion and discussion

Science operations: an important point.. too!
• Science operations design choices have a major impact:
   – On the design/cost of the AO/laser and science instruments
   – On the observing model for the observers
   – On the science return
   – On the Observatory budget 2010-2020 (see NSF review)
   – On the Observatory's future priorities
• Work in progress:
   – WBS 2.2.3 Observatory Requirements: Science Operations Requirements
   – WBS 3.1.1.11 System Design Approach / System Engineering / Performance Budget: Observing Efficiency and Observing Uptime
   – WBS 3.1.2.1.10 System Design Approach / System Engineering / Trade Studies: Observing Models TS
   – WBS 3.4.2 System Design Approach / Science Operations: AO-Instrument operations

[Flow diagram: Science Requirements / Observing Scenarios -> Observatory Requirements (Science Operations) -> System Design Approach (System Engineering: Performance Budget / Observing Efficiency Budget; Trade Studies: Observing Models TS; Science Operations 3.4.2.1: AO-Instruments Ops) -> Functional and Performance Requirements. Current section: Lessons Learned.]

Lessons Learned
101 nights of Keck II LGS AO operations, Nov. 2004 through Jul. 2006

Lessons learned: weather
• From our statistics, ~25% of the allocated time is lost to weather:
   – ~18% as entire nights with the dome closed
   – ~7% as marginal weather affecting about 1/5 of the open-dome nights
   -> Only ~60% of the nights are ~photometric
• Looking at other statistics:
   – Study for the AURA New Initiatives Office (Erasmus and van Staden)
   – ESO site search (ESPAS, 2003): www.eso.org/gen-fac/pubs/astclim/espas/espas_reports/ESPAS-MaunaKea.pdf
   – Kaufmann and Vecchione (1981)

   Observing quality    Frequency of occurrence (%)
   Photometric          45-55
   "Spectroscopic"      17-22
   Unusable             20-25
   Usable (total)       ~70-75

[Figure: Mauna Kea weather statistics, Kaufmann and Vecchione, 1970-1978]

Weather impact
• From the ESPAS and AURA reports on Mauna Kea:
   – Usable nights with LGS < 75%:
      • About 55% of the nights have more than 6 h of photometric conditions
      • An additional 10-15% (?) are likely usable with LGS (light cirrus)
      • The maximum usable time is around 2/3 of the time (240 nights/yr)
   – Photometry:
      • Best months are January and February
      • Worst months are October and April
   – Winds:
      • A few nights (~3-4%) are affected by very strong winds
      • Winds are ENE-ESE in the summer, more W and SW in winter
   – Seeing:
      • Seeing is considered the best among the existing astronomical sites (0.5")
      • Seasonal trends?
   – Global warming (long-term trend):
      • 3.7 deg / century for Mauna Loa
• Should science operations take this information into consideration and study the feasibility of a more flexible schedule?
   – Increase observing efficiency
   – Reduce observing support load (?)
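As a rough cross-check of the weather fractions quoted above, here is a minimal sketch in Python; the per-marginal-night loss fraction is an assumption chosen only to show how the quoted numbers combine, not a measured value.

# Rough weather-loss budget from the Keck II LGS AO statistics quoted above.
# The fraction of a marginal night that is actually lost is an assumption
# picked so the pieces add up to the quoted ~25% total time loss.

full_closure_frac = 0.18        # entire nights lost to a closed dome
marginal_night_frac = 0.20      # ~1/5 of the open-dome nights are marginal
loss_per_marginal_night = 0.43  # assumed fraction of a marginal night lost

open_frac = 1.0 - full_closure_frac
marginal_loss = open_frac * marginal_night_frac * loss_per_marginal_night
total_loss = full_closure_frac + marginal_loss

print(f"full closures:      {full_closure_frac:.0%} of allocated time")
print(f"marginal weather:   {marginal_loss:.0%} of allocated time")
print(f"total weather loss: {total_loss:.0%}  (slides quote ~25%)")

Varying the assumed per-night loss shows how sensitive the ~25% figure is to how marginal nights are accounted for, which is exactly the accounting question raised later under Observing Efficiency.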
Lessons Learned: overheads are too much
Ref: 2006 SPIE papers and some Keck internal discussions for K1 LGS AO.
(A rough per-night overhead budget built from these numbers is sketched at the end of this Lessons Learned section.)
1. LGS AO checkout: 30 min/night
2. Telescope slew and pointing
3. Target ID and centering
4. LGS AO readiness: 5-10 min/target
5. LGS AO optimization: 2 min per hour on target
6. Telescope/AO handshakes: 30+ s per dither
7. Scientific instrument setup and readout
8. Observing strategy

Lessons learned: Observing Efficiency
• Keck NGS AO observing efficiency, for nights without weather or technical problems, varies at best from 25% (snapshot surveys, L' and Ms observations) to 60-80% for deep-exposure science programs. LGS AO shows roughly the same values, except that it is more affected by weather and technical problems.
• For a reliable system in good weather conditions, we are currently limited mostly by:
   1. Serial (vs. parallel) algorithms (DCS/instrument/AO) during observations
   2. Under-designed telescope pointing and acquisition systems
   3. Under-designed AO nodding/dithering hardware and software
   4. Under-designed science instrument readout
   5. Aging (WFC) and/or complex instrumentation (laser)
   6. Under-designed ancillary systems (photometry, seeing, PSF, etc.)
   7. Minimal maintenance, calibrations and performance monitoring for the science instruments, AO and laser
   8. Operations (laser traffic rules, overall cost including energy)

Lessons learned: some Keck goodies...
• A flexible and (rather) small community
   – Ability for observers to combine their observing time
   – Close interaction between observers and support staff
• Science instruments
   – Ability for observers to try new observing modes ("push the limits") and/or calibrate for problems they discover
   – Possibility to script simple instruments like NIRC2
   – New generation of observing software with OSIRIS: OPGui and DRP
• AO / Laser
   – A best-effort / shared-risk science mode
   – Ability to optimize (and keep optimizing...), except for the telescope
   – Proximity to the instruments for troubleshooting
• Community experience
   – Development and integration of new concepts on a large telescope (Keck!, LGS, etc.)
   – New generation of instruments: OSIRIS, etc.
   – Archive (KOA)

Lessons learned: other observatories' goodies
• Flexible scheduling of the instruments:
   – Instrument failures have less impact on observing efficiency
   – Management of observing and engineering time w.r.t. observing conditions
• Instrumentation management
   – Strict reviews for the integration, testing and commissioning processes
   – High-quality maintenance and calibrations
      • All instruments have maintenance schedules with archived calibrations
      • Easier traceability of problems/trends
• Scientific operations
   – Integrated operations
      • Automated telescope acquisition (Magic?)
      • Observing planning tool (observing sequences, AO/instrument configuration, performance estimate)
      • Observing sequence templates for instruments / telescope
   – Facility calibrations
      • Flat-field, astrometry, photometry, etc.
      • Ancillary data are archived
   – Archived science data for long-term use
   – Data reduction pipeline and support
   – Service observing??

[Flow diagram repeated: from Science Requirements / Observing Scenarios down to Functional and Performance Requirements.]
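To show how the per-item overheads listed under "overheads are too much" translate into a nightly efficiency figure, here is a minimal sketch. The number of targets, dithers per target, exposure times and the slew/acquisition time are illustrative assumptions; instrument setup/readout and observing-strategy overheads are not itemized.

# Rough per-night LGS AO overhead budget using the per-item numbers from the
# "overheads are too much" slide. Targets per night, dithers per target,
# open-shutter time per target and the slew/acquisition time are assumptions
# for illustration only.

night_length_h   = 10.0   # assumed length of the observing night (hours)
targets          = 6      # assumed number of science targets
dithers_per_tgt  = 9      # assumed dithers per target
exp_time_per_tgt = 1.0    # assumed open-shutter time per target (hours)

checkout_min  = 30.0                               # LGS AO checkout, once per night
slew_acq_min  = targets * 8.0                      # assumed slew + pointing + target ID/centering
lgs_ready_min = targets * 7.5                      # LGS AO readiness, 5-10 min per target
lgs_opt_min   = targets * exp_time_per_tgt * 2.0   # LGS AO optimization, 2 min per hour on target
dither_min    = targets * dithers_per_tgt * 0.5    # telescope/AO handshakes, 30+ s per dither

overhead_h = (checkout_min + slew_acq_min + lgs_ready_min + lgs_opt_min + dither_min) / 60.0
science_h  = targets * exp_time_per_tgt

print(f"itemized overheads: {overhead_h:.1f} h")
print(f"open-shutter science time: {science_h:.1f} h")
print(f"observing efficiency: {science_h / night_length_h:.0%} of a {night_length_h:.0f} h night")

With these assumed numbers the itemized overheads alone consume close to 3 h of a 10 h night, which is consistent with the 60-80% efficiency quoted above for deep-exposure programs.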
Science Operations Goals for the Next Generation of AO instruments at Keck (2012-2020)

The spirit: "Maximize the scientific return of the allocated observing time with the NGAO instruments from 2012 to 2020."

Top-level goals:
1. More than 80% of the allocated time is spent collecting science-quality data.
2. The NGAO system combined with its science instruments is a facility-class instrument.
3. The Observatory is capable of supporting the equivalent of 240 nights/year for NGAO science operations.
...btw, are we including the interferometer?..!-)

Some flow-down thoughts
1 - More than 80% of the allocated time should be spent collecting science-quality data.
1. Software should permit simultaneous commands to multiple sub-systems, as well as within a sub-system.
2. The time allocation method and the Keck science operations model should minimize the average impact of time lost to bad observing conditions year-round.
3. A heck of an efficient and skilled astronomer!
4. > 80% observing efficiency and > 98% uptime?
5. Science instrument performance (image quality, sensitivity, observing efficiency and calibration stability) is documented, simulated and monitored by the Observatory on a routine basis.
   1. The simulation tools should provide key parameters with 10% accuracy over a range of observing conditions and instrument setups.
   2. The relative astrometric solution error over the field of view should be less than 2% of a pixel.
   3. The pointing and positioning error on the science field should be known to better than 20% of the measured FWHM.
   4. The relative and absolute photometry should be monitored throughout the observations with a precision of xx % at the observed wavelength.
6. Etc.

Some flow-down thoughts
2 - The NGAO system combined with its science instruments is a facility-class instrument.
1. Facility-class has many implications for safety, operability, reliability, maintainability, lifetime, documentation, configuration management, etc.
2. Sustainability:
   1. Sustainable development program for the instrument?
   2. Observatory cost for operations (including energy cost)
3. The Mauna Kea laser projection requirements must be satisfied.
4. The NGAO installation, integration, testing and commissioning phases should follow the highest Observatory standards and be reviewed by the Keck community.
5. Interferometer- and Ohana-related requirements:
   a) NGAO should support interferometer and Ohana science operations.
   b) The interferometry modes should not require the NGAO light path for optical alignment in the basement.

Some flow-down thoughts
3 - The Observatory is capable of supporting the equivalent of 240 nights/year for NGAO science operations.
1. Auto-calibration for NGAO, performed by non-AO experts.
2. < 30 minutes of daytime telescope restriction on a science night.
3. < 30 minutes per observing night for maintenance and science instrument performance calibrations.
4. Observing tools:
   1. Simulation module (telescope, natural and laser stars) for calibrating and troubleshooting the AO system, as well as a stable and accurate calibration module (wavelength, flat-field, field distortions, sensitivity) for the science instruments.
   2. Atmospheric parameters and system diagnostics for image quality monitoring and PSF reconstruction.
5. A heck of an observing support team:
   1. Expandable and flexible library of observing sequences for each instrument, type of science, etc.
   2. The Support Astronomer reviews the observing sequences prior to the observations; the SA will be on-call and may be present during the observations.
   3. The science instrument should be operated remotely or on-site by one Observer or a Support Astronomer. The telescope and the AO & laser facility should be operated by two or fewer Observing Assistants (or staff with equivalent skills).
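As a purely hypothetical sketch of what one entry in the "expandable and flexible library of observing sequences" of item 5.1 above might look like, here is a minimal Python data structure; the field names, the instrument label and all values are illustrative assumptions, not an NGAO interface definition.

# Hypothetical sketch of one entry in the "library of observing sequences"
# mentioned above. Fields and values are illustrative assumptions only.

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ObservingSequence:
    name: str                                   # template identifier (science case + mode)
    instrument: str                             # science instrument this template drives
    ao_mode: str                                # e.g. "LGS" or "NGS"
    dither_pattern: List[Tuple[float, float]]   # (dx, dy) offsets in arcsec
    exp_time_s: float                           # exposure time per dither position
    coadds: int = 1
    calibrations: List[str] = field(default_factory=list)

# Example template that an observer could adapt and a Support Astronomer review:
deep_imaging = ObservingSequence(
    name="faint-companion deep imaging",
    instrument="NGAO imager",                   # placeholder instrument name
    ao_mode="LGS",
    dither_pattern=[(0, 0), (2, 2), (-2, 2), (-2, -2), (2, -2)],
    exp_time_s=60.0,
    coadds=3,
    calibrations=["darks", "sky flats", "astrometric field"],
)
print(deep_imaging)

A reviewed library of such templates, combined with a planning tool that estimates the resulting overheads, is one possible way to serve both the efficiency and the staffing goals above.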
[Flow diagram repeated: from Science Requirements / Observing Scenarios down to Functional and Performance Requirements.]

Observing Efficiency: we care about defining it!
The observing efficiency is the open-shutter time during dark time while the instrument runs at the performance level it is designed to operate at, for the given observing conditions.
• We care about the loss of science return, no matter what!
   1. Marginal weather accounting: time lost or not, NGS AO data useful or not, etc.
   2. Idea of allocating a few nights by the TAC: the highest-ranked proposal gets the best conditions, etc.
   3. Understanding observing efficiency for NGAO may require tagging the various sets of data (science, calibrations, etc.):
      1. Observing efficiency is linked to science return during the allocated time (hence weather and seeing impact should be considered)?
      2. Which part of the calibrations is part of the science-quality data? Need for facility calibrations.
   4. Efficiency per brain cell (the best use of everyone's time by not having to re-invent the wheel over and over).

Observing efficiency and uptime (2): a spiritual agreement
1. The time allocation method and the Keck science operations model should minimize the impact of time lost year-round.
2. Quicker, better slews, setups and moves:
   1. Smaller telescope pointing error (< 2 arcsec?), fast telescope slews (x deg/min).
   2. Faster target acquisition and accurate centering on the science array.
   3. Faster dithers, parallel readout, faster on-line data viewer.
   4. Fast chopper (a few Hz) for thermal NIR imaging?
   Software permits simultaneous commands to multiple sub-systems, as well as within a sub-system, in order to minimize the time overhead during telescope dithers, telescope slews, target acquisition, instrument setup, instrument data readout, etc. (a toy concurrency sketch is given just before the Conclusion).
3. The Observatory manages instrument calibrations and performance at a high-quality level.
4. The Observatory provides simulation and planning tools, observing templates, etc.
5. The PIs are responsible for performing their observations, given these tools.
• Mean time between failures (any failure mode) should be > 3 hours?
• Instrument uptime should remain higher than 98% through the night (i.e. no more than ~12 min of downtime over a 10-hour night)?

[Flow diagram repeated: from Science Requirements / Observing Scenarios down to Functional and Performance Requirements.]

Observing Models
Preliminary thoughts...

Scheduling mode        Ownership  Management  Science programs  Weather impact  Backup options
Classical night/night  PI         PI          1..3, flexible    High            NGS AO, NIRSPEC
Night blocks per TAC   TAC        TAC/Keck    1..5, flexible    Medium to low   NGS AO, NIRSPEC, DEIMOS, Engin.?
NGAO-only queue        PI         Keck        1..5, flexible    Low             Iden.
Keck queue??           PI         Keck        1..10, flexible   Low             Iden.

Science data-products
Preliminary thoughts...

Mode       Science + specifics  Calibration  Support  Ancillary data  Data reduction  Archive         Monitoring
Classical  Observer             Observer     no       minimum         No              So-so           -
Plus       Observer             Observer     -        Archived        Yes             minimum         Calibrations
Advanced   Observer             Observatory  Yes      Minimum         calibrated      Cals + science  performance
Service    Observatory          Observatory  Yes      -               calibrated      Cals + science  performance
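The "simultaneous commands to multiple sub-systems" point of the spiritual agreement above (and item 1 of the efficiency limitations in the Lessons Learned) can be illustrated with a toy example: overlapping the detector readout with the telescope offset, then re-closing the AO loops, hides part of the per-dither overhead. The sub-system names, call signatures and timings below are hypothetical, not Keck/NGAO interfaces.

# Toy illustration of issuing commands to several sub-systems concurrently
# during a dither, instead of strictly one after the other. All names and
# timings are hypothetical; the sleeps stand in for real command durations
# (scaled down so the example runs quickly).

import asyncio

async def offset_telescope(dx, dy):
    await asyncio.sleep(0.8)     # stands in for a ~10 s small telescope offset
    return f"telescope offset by ({dx}, {dy}) arcsec"

async def readout_detector():
    await asyncio.sleep(1.0)     # stands in for reading out the previous frame
    return "previous exposure read out"

async def reclose_ao_loops():
    await asyncio.sleep(0.6)     # stands in for the telescope/AO handshake
    return "AO loops re-closed on the new position"

async def dither_step(dx, dy):
    # A serial sequence would cost offset + readout + handshake; overlapping
    # the readout with the offset hides it, leaving only offset + handshake.
    offset_msg, readout_msg = await asyncio.gather(
        offset_telescope(dx, dy),
        readout_detector(),
    )
    handshake_msg = await reclose_ao_loops()
    for msg in (offset_msg, readout_msg, handshake_msg):
        print(msg)

asyncio.run(dither_step(2.0, 2.0))

With 30+ s handshakes per dither and many dithers per target, this kind of overlap is exactly the sort of change the serial-vs-parallel limitation points at.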
Conclusion
• Next steps:
   1. Help build the Observing Scenario Spreadsheet.
   2. Discussion and feedback: observing modes and science operations are not very well documented.
   3. Propose to delay:
      1. From the WBS Dictionary (3.4.2.1): define the overall architecture and method, and design the interfaces, for operating the sub-systems of the NG AO-instrumentation. Here "AO-instrument" refers to the AO system, laser, SC, science instrument, etc.
      2. Discuss the operations team's strategy for structuring the remainder of the SD-phase tasks under 3.4.2.1.
   4. More work in many areas. One main challenge: coordinating the work with Keck and the science community, while staying within the number of allocated hours.