RIA Training Report 2008 Part III
RIA Training Report and Evaluation
Tiffany Michelle Borders
Space Telescope Science Institute
October 24, 2008
ABSTRACT
This document provides an evaluation of my RIA training activities, as well as suggestions for future enhancements.
Introduction
This document presents a report on my RIA training experience. Training is an important part of the new-hire process, and this document is intended to share some of my results, provide feedback, and suggest enhancements for future training. The goal of the training program is to ease the trainee into work at STScI and in the RIA group. My training activities consisted of Zope training and an exercise, an introduction to the HST archive, an introduction to IDL scripting, IRAF/PyRAF/Python, general Help Desk training, photometry with HST, and an introduction to drizzling.
Copyright © 2008 The Association of Universities for Research in Astronomy, Inc. All Rights Reserved.
HST Websites and Zope Training
In this training session, hosted by Tyler Desjardins, I was shown a list of useful websites on the STScI internal page. I thought this was a good introduction to the internal pages and a good way to get acquainted with where the various links are located. I thought the Zope exercise of creating a webpage about our favorite topic in astronomy was a good idea and a good exercise for trainees. I created a page about the research I did for my Master's thesis, got to learn some HTML and CSS, and put my page onto Zope. My results can be found at:
http://www.stsci.edu/institute/org/riab/activities/training/zope exercise/borders
Since my Zope training, Tyler has made a number of additions to his documents, including some of my suggestions regarding setting up a Zope account. I found this training informative, and with the revised documentation I think this aspect of the training is in good shape.
HST Archive and Calibration
In this training session, hosted by Michael Wolfe, I was introduced to the HST Archive as well as to basic calibration of STIS data. Michael gives a good overview of using both MAST and StarView; I, however, prefer to use MAST. I ended up going to this training session twice, as the second time around Michael provided an assignment. In this assignment I retrieved a STIS data set (ID 10175, PI Charles Proffitt) and calibrated the data in three different ways. The first calibration is done by running CALSTIS in IRAF (STSDAS/hst_calib/STIS) with the retrieved data, using the best reference files from the archive. This can be done by typing this command:
calstis *_raw.fits
The second calibration run involved running CALSTIS again, but this time using the best reference files from CDBS. In order to do this, the calibration reference file keywords in the header must not contain the word 'OREF'. To accomplish this I use the HEDIT task in IRAF and change each value so that 'OREF' is not included. In HEDIT, the 'images' parameter needs to be set to *_wav.fits[0], *_raw.fits[0] so that both the *_wav.fits and *_raw.fits files get updated simultaneously. Also in HEDIT, the 'fields' parameter needs to contain the keyword that is to be changed (e.g. BPIXTAB, DARKFILE, etc.), and 'value' needs to be set to the reference file that is already there, but without the 'OREF' prefix. The third calibration run involves running CALSTIS with data that Michael supplied. In this case, we do the calibration step by step and need to turn the calibration switches in the header on or off using HEDIT, setting the 'value' to PERFORM or OMIT. Using the "Data Flow Chart Through CALSTIS", which Michael provided, we do each part of the calibration separately by changing the switches, eventually creating flt, x1d, and x2d files.
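As a quick illustration (not part of Michael's original instructions), the same header edits can also be scripted from PyRAF. The snippet below is only a minimal sketch: the rootname, reference file path, and keywords shown are made up for this example.

from pyraf import iraf
from iraf import images, imutil           # hedit lives in images.imutil

# Point a reference-file keyword away from 'oref$' to an explicit local copy
# (the rootname and reference file names here are purely illustrative).
iraf.hedit(images='o8xy01010_raw.fits[0],o8xy01010_wav.fits[0]',
           fields='DARKFILE', value='/mydata/refs/mydark_drk.fits',
           verify='no', update='yes')

# Flip a calibration switch before re-running CALSTIS step by step.
iraf.hedit(images='o8xy01010_raw.fits[0]', fields='DQICORR',
           value='OMIT', verify='no', update='yes')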
This assignment is tedious; however, once you understand what is going on and how CALSTIS works, it is useful. The instructions are very minimal, so lots of questions need to be asked to understand what you are trying to achieve. I believe this assignment was written this way on purpose. My suggestion is that a little more clarity could possibly be added to the assignment. I found this assignment useful when I needed to run CALACS for the photometry assignment: even though this was for a different instrument, I already had a general feel for what needed to be done.
IDL
The IDL training session was hosted by Pey-Lian Lim. I really liked this introduction to IDL. Pey-Lian had material and examples prepared, and we got to sit at my computer and try things out in IDL. I thought it was really helpful that Pey-Lian had me type the commands rather than just showing them to me. The exercise that Pey-Lian created was at a good level: it was challenging and I learned how to do new things in IDL, but it wasn't ridiculously difficult. I thought Pey-Lian's tutorial and exercise were well planned, and I enjoyed learning about IDL.
This assignment required us to:
1. Write a script with a function to plot something you fancy with filled circles.
2. Write a separate script with a procedure to call the above function. Your procedure
can also write some text to an output file.
3. Compile, debug, and run the procedure.
4. Save the displayed plot to a .PNG file.
5. Use MULTIPLOT to plot two plots that share the same x-axis.
Appendices A, B, and C contain the three .pro programs I wrote to complete this assignment. Appendix A contains the program which sets up the function that will be plotted; I have chosen my function (wave) to be a sort of de Broglie wave packet. Appendix B contains a program which plots my wave function using filled circles and also uses MULTIPLOT to plot two plots sharing the same x-axis. Appendix C contains a program which is used as a procedure to call the function wave, compute its minimum, maximum, and number of elements, and write these to an output file. This program also creates a .ps output file for my plot and a .PNG file of my plots. Figure 1 shows the plotted .PNG file I created with my IDL scripts, and the output of this code is:

Minimum Value: -1.98906
Maximum Value: 1.98722
Number of Elements: 1000

Fig. 1.— My plot result from the IDL assignment.
Intro to IRAF/PyRAF/Python
This training session was hosted by Alex Viana. In the end, I found it extremely useful. I ended up attending the session twice, as the second time around Alex created a python example for us to try. For the sake of our hands-on data project, I found this training session to be one of the most useful. I found writing python scripts for PyRAF incredibly easy and incredibly helpful. Appendix D contains the result of writing my very first python script. Alex's documentation walks the trainee step by step through creating this script, which is very helpful. I think that the python program we create is a very good start for someone (like me) who is just starting to learn python.
General Help Desk
This training session was hosted by Brittany Shaw. For this training session she gave
a presentation introducing us to the General Help Desk and how to respond to calls. At
the end of the presentation she gave us a Help Desk call worksheet which contained some
sample questions, and we needed to decide how we would assign these calls. I thought this was great because it made us think about how we would go about assigning calls, which is exactly what we do when working the Help Desk. Since Brittany gave this talk, a few things have changed pertaining to running Terminal Services, and hopefully for the next round of trainees Brittany will incorporate the new changes into her presentation.
MultiDrizzle
This training session was developed by Jennifer Mack. The results of the MultiDrizzle assignment are located in Part I of my RIA training report. I have provided comments and feedback on this assignment to Jennifer directly as I've gone through it, so this report will just add a few comments. I thought it was a good learning experience to be exposed to MultiDrizzle as well as to have the opportunity to experiment with writing python scripts to automate this process. As far as the MultiDrizzle presentation is concerned, I do think that more emphasis should be placed on understanding the flags for the DQ array (bits). This is very important for understanding how to change some of the parameters in MultiDrizzle, and I think the trainee should have a good understanding of it (especially because it is really confusing!). The example which I worked from has undergone a lot of changes since I started going through it. I do think the documentation is now in good shape, and I like that it goes through a simple example as well as a more complex one.
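One possible aid here would be a small worked example of checking a bit in the DQ array. Below is a minimal Python sketch (not from the training materials); the file name and flag value are placeholders, and the actual flag definitions should be taken from the instrument documentation.

import pyfits

hdulist = pyfits.open('j8xy01xyq_flt.fits')   # hypothetical ACS _flt file
dq = hdulist['DQ', 1].data                    # DQ array of the first chip
bit = 4096                                    # example flag value only
flagged = (dq & bit) > 0                      # True where this bit is set
print flagged.sum(), 'pixels carry flag', bit
hdulist.close()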
Photometry
The results of my photometry assignment are located in Part II of my RIA training report. I enjoyed this assignment and felt it was really important and useful to understand how photometry needs to be conducted on HST data. I also found the results of the project very interesting. I think that for the purpose of this exercise this example is good. I did find, however, that I was not satisfied with the results of DAOFIND and instead used SExtractor. SExtractor produced a much deeper and cleaner CMD. I don't think it's necessary to change the assignment, but something could be added suggesting that, if the trainee wishes, they can explore other options such as SExtractor. I also found this to be a good opportunity to explore python scripting. Doing the CTE correction allowed me to learn more about programming with python, which I found very useful.
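For a trainee who does want to try SExtractor, a typical invocation looks something like the following, where the image name, catalog name, and zeropoint are placeholders and the detailed settings live in the .sex configuration file:

sex mydrizzled_image.fits -c default.sex -CATALOG_NAME stars.cat -MAG_ZEROPOINT 25.0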
Something to consider for the next set of trainees is possibly a different set of data to use for the photometry. It might be interesting to see a different CMD presented by the next set of trainees. If it doesn't require too much work, it may be worthwhile to investigate using another globular cluster. A suggestion is NGC 6397 (ID 10257), which was observed with ACS in F435W, F625W, and F658W. The photometry assignment was my favorite part of the training. I would also like to say that, overall, this training was a positive experience.
Acknowledgments
I would like to thank all of those who contributed to my training experience, including Jennifer Mack, Max Mutchler, Michael Wolfe, Pey-Lian Lim, Alex Viana, Brittany Shaw, and Tyler Desjardins. I would also like to thank Francesca Boffi for hiring me for this position; I am very happy to be part of the RIA Branch.
Appendix
A. IDL Script I
function wave,t
; This program sets up a function called wave which will be used to plot.
; The function is a sort of DeBroglie wave packet.
wave = sin(t/7)+sin(t/8)
return, wave
end
B. IDL Script II
pro plot_func
;This program is used to plot my function wave.
window,0,retain=2
;retain=2 so graphics don't get wiped out.
;This is how you set up filled circles for plotting.
A=FINDGEN(17)*(!PI*2/16.)
USERSYM,COS(A),SIN(A),/FILL
;; This is to make things colorful
device, decomposed=0 ; Indexed color
loadct, 5 ; This is commonly used
;Using MULTIPLOT to plot 2 plots that share the same X-axis.
MULTIPLOT,[1,2]
t=findgen(1000)
plot, t,wave(t),PSYM=-8,title='deBroglie Waves',color=50
MULTIPLOT
plot, t,wave(t)+!PI,PSYM=-8,xtitle='time(ns)',color=65
;Reset system parameters
MULTIPLOT,/reset
end
C. IDL Script III
pro do_func
; This program is used as a procedure to call the function wave.
t=findgen(1000)
min_wave = min(wave(t))
max_wave = max(wave(t))
n_elem_wave = n_elements(wave(t))
print, 'Minimum =',min_wave
print, 'Maximum =',max_wave
print, 'Number of Elements =',n_elem_wave
; This is creating an output file
openw,out,'output.dat', /get_lun
printf,out,' Minimum Value   Maximum Value   Number of Elements'
printf,out,min_wave,max_wave,n_elem_wave
free_lun,out
; This is creating a .ps output file for the plot
set_plot, 'PS'
device, filename='plot.PS', /portrait, $ ; Portrait
  xsize=8, ysize=10, xoffset=0.5, yoffset=0.5, /inches ; Letter size
plot,t,wave(t),PSYM=-8,title='deBroglie Waves',xtitle='time(ns)'
device,/close_file
set_plot,'X'
;This is for creating a .PNG file
T=TVRD(0)
write_png, 'plot.png', T
end
D. My first python script
######################################################################
#This is an example script showing how python can be used to execute
#IRAF/PyRAF tasks. The data comes from the NICMOS instrument and is
#located on central storage.
#
#Tiffany Borders- borders@stsci.edu
#Space Telescope Science Institute
#
#Last updated: 09/22/08
######################################################################
#Load the packages we need
import pyraf,os,glob
from pyraf import iraf
from iraf import stsdas, hst_calib, nicmos,noao,digiphot,daophot
#removes the old coordinate file
#os.remove('n94j06okq_cal.fits1.coo.1')
#run daofind on one image
#this is another way of commenting
if 0 == 1:
    iraf.daofind(\
        image='n94j06okq_cal.fits[1]',\
        interactive='no',\
        verify='no',\
        output='default')
#make a counter
k = 0
#generate a list of all the fits files
file_list = glob.glob('*_cal.fits')
print file_list
for file in file_list:
    #Test for old files
    file_query = os.access(file + '1.coo.1',os.R_OK)
    print file_query
    #Removes them if they exist
    if file_query == True:
        os.remove(file + '1.coo.1')
    #Run daofind on one image
    iraf.daofind(\
        image=file + '[1]',\
        interactive='no',\
        verify='no',\
        datamin = 50.0)
    k = k + 1
    print k