Real-Time Image Processing Requirements
For the Synoptic K-corona Imager (SKI)
This document introduces the SKI coronagraph and some of the image processing requirements
for the instrument.
Introduction:
A coronagraph images the solar corona. Even at the best sites in the world, the sky (scattering in
the Earth's atmosphere) is many orders of magnitude brighter than the corona. The reason the
corona is easily visible during a solar eclipse is that the Moon blocks sunlight from striking the Earth’s
atmosphere above the observing site. A coronagraph typically forms an image of the Sun and
the surrounding sky using a simple lens. At the focus of the lens the bright light of the solar disk
is diverted or absorbed by an ‘occulting disk’ which just barely covers the solar disk, making an
artificial eclipse. Although the occulting disk blocks the disk light, the Earth’s atmosphere is still
fully illuminated by the Sun and therefore the bright sky must be suppressed in some other way.
This is possible because the light from the corona is polarized.
[Figure 1 image: annotated cartoon showing the occulting disk blocking the direct sunlight, the bright ring commonly seen around the occulter, the bright sky background (blue), the faint coronal light (grey), and a bright wandering aerosol.]
Figure 1: A cartoon of a coronagraph image. The black disk is the occulter blocking the disk of the
Sun. A bright ring is often seen just above the occulter (due to diffraction and other effects). The
corona (shown in grey) is ~10⁻⁴ of the brightness of the sky background (blue). Bright aerosols (dust,
bugs, etc.) will often enter the field of view. These should be removed in real-time processing.
At the edge of the SKI field of view, the sky is ~10⁴ times brighter than the corona. Imaging the
corona therefore requires separating its faint (but polarized) light from this huge background (the
sky is also polarized, but this can be modeled and subtracted). To do this the light passing
around the occulting disk is sent through polarization optics. These optics separate the light into
two beams and each beam is refocused onto a camera. The polarizing elements are
electro-optical and the light passing through them is ‘modulated’ at very high speeds (equal to the
camera frame rate). This modulation is required to determine the polarization of the light. In SKI
there are a total of 4 modulation ‘states’. The image processing system must read each of the
cameras and process the images produced in each of these four states separately (there are
therefore a total of eight frames to process: 2 cameras x 4 states).
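
As a rough illustration of how these eight streams might be organized in software, the sketch below (Python with NumPy) routes each raw read into a per-camera, per-state accumulation buffer. The array layout, names, and the assumption that the state advances by one on every read are illustrative choices, not part of the SKI design.

    import numpy as np

    N_CAMERAS = 2          # two polarization beams, one camera per beam
    N_STATES = 4           # modulation states cycled at the camera frame rate
    HEIGHT = WIDTH = 1024  # ~1k square field

    # One accumulation buffer per (camera, state) pair: 2 x 4 = 8 streams.
    sum_buffers = np.zeros((N_CAMERAS, N_STATES, HEIGHT, WIDTH), dtype=np.float64)

    def accumulate(camera_id: int, frame_index: int, frame: np.ndarray) -> None:
        """Route one raw read into the buffer for its modulation state.

        The state is assumed to advance by one on every read, so it is simply
        the frame index modulo the number of states.
        """
        state = frame_index % N_STATES
        sum_buffers[camera_id, state] += frame
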
To see a signal which is 10⁻⁴ of the sky brightness, the sky must be measured with enough
photons that the shot noise is suppressed to this level or better. Since the shot noise goes like
the square root of the signal, the relative noise is 1/√N, and at least 10⁸ photons must be read by
the cameras for each pixel. Modern CMOS
cameras of the size required to image the field (~1k square) have well depths on the order of
60ke-. To accommodate bright objects in the field the CMOS camera should have a nominal
20% exposure, filling each well to ~10ke-. At this exposure, 10,000 frames must be read to have
enough photons to see the coronal signal. A 1k x 1k camera running at 500fps (near the limit of a
Camera Link interface) will acquire this number of images in 20 seconds. This is also the time
cadence which is required to capture the evolution of flares and other quickly-evolving solar
events. This number of photons is easily captured with a ~20cm telescope. The problems are the
cameras and the real-time processing requirements.
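
The arithmetic behind these numbers can be restated in a few lines; the sketch below simply recomputes the figures quoted above (the ~10ke⁻ per read is the rounded value from the text).

    import math

    contrast = 1e-4          # coronal signal relative to the sky background
    e_per_frame = 10_000     # e- per pixel per read (~20% of a ~60ke- well, as above)
    frame_rate = 500         # fps per camera, near the Camera Link limit

    photons_needed = (1.0 / contrast) ** 2                    # 1/sqrt(N) <= 1e-4  =>  N >= 1e8
    frames_needed = math.ceil(photons_needed / e_per_frame)   # 10,000 reads
    cadence_s = frames_needed / frame_rate                    # 20 s per accumulated image

    print(f"photons per pixel: {photons_needed:.0e}")
    print(f"frames to sum:     {frames_needed}")
    print(f"cadence:           {cadence_s:.0f} s")
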
For the system to work, each camera image must be shot-noise limited. At 10ke- per read, this
requires a read noise significantly below 100e-. For the digitization noise to be less than the shot
noise, at least 10 bits are required, with 12+ being desirable. Polarimetry also requires that the
camera operate in a ‘global shutter’ mode. All of these requirements seem reasonable with
modern CMOS cameras. Importantly, we do not need to store images at 1000fps.
For scientific and diagnostic needs, the processed images only need to be written to disk every
250ms or so (a complete set of eight polarimetric images stored every 2 seconds). Because the
Sun will be observed for ~8 hours continuously, the processing of the 1000fps data into 4fps
stored imagery must be done in real time. In principle the processing system would only need to
sum the images of each state into buffers. In practice there is additional processing which we
would like to do in real time: the removal of bright aerosols.
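
To put the real-time requirement in perspective, the rough estimate below compares the raw input rate with the disk-write rate. The 16-bit pixel packing is an assumption beyond the 10-12+ bit requirement above, and the 1000fps figure is taken here as the two-camera aggregate.

    HEIGHT = WIDTH = 1024
    BYTES_PER_PIXEL = 2     # assumed 16-bit packing for 12+ bit data
    CAMERAS = 2
    RAW_FPS = 500           # per camera; the 1000fps above is read as the two-camera aggregate

    frame_bytes = HEIGHT * WIDTH * BYTES_PER_PIXEL           # ~2.1 MB per frame
    raw_rate = frame_bytes * RAW_FPS * CAMERAS               # ~2.1 GB/s into the processors

    stored_per_cycle = 8    # one complete polarimetric set every 2 seconds
    stored_rate = frame_bytes * stored_per_cycle / 2.0       # ~8.4 MB/s written to disk

    print(f"raw input:  {raw_rate / 1e9:.2f} GB/s")
    print(f"disk write: {stored_rate / 1e6:.1f} MB/s (~{raw_rate / stored_rate:.0f}x fewer bytes)")
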
Aerosol Removal:
Aerosols are particles floating in the air. Insects, pollen, and airborne seeds all scatter light very
strongly. When they pass in front of the telescope’s aperture, they appear as bright, fuzzy blobs
which wander across the field of view (see Figure 1) on time scales of ~1 second (depending on
wind, etc.). If the individual images are simply averaged, they show up as bright streaks in the
data. The optimal solution is to remove these bright objects from each image before they are
accumulated into their state buffers.
As an example, consider the following algorithm: An independent video processor & computer is
used for each camera. Each camera runs at 500fps, with the polarization modulation state
changing after each read (so the read sequence is state 1, 2, 3, 4, 1, 2, 3, 4, etc.). Each
modulation state requires three buffers: a sum frame, a running mean frame, and a ‘good read’ frame.
The sum frame stores the image and is initialized to zero each time it is dumped to disk. Before a
read is added to the sum buffer, each pixel is compared with a few-second-long running mean,
stored in the running mean frame. If a pixel read is ‘high’ compared with the running mean, it is
tagged as containing an aerosol and set to zero. A ‘good read’ frame counts how many good
reads are recorded for each pixel. Before writing to disk, the sum buffer is divided by the good
read frame to make an average image (after which the good read frame is re-initialized). The
running mean frame is updated with the processed image. The algorithm requires an initialization
of the running mean frame before filtering can be started.
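
A minimal sketch of this accumulate-and-reject scheme for a single modulation state is given below. The class name, the 10% ‘high’ threshold, and the running-mean update weight are illustrative assumptions, not SKI specifications.

    import numpy as np

    class StateBuffers:
        """Buffers for one modulation state of one camera (names are illustrative)."""

        def __init__(self, shape, mean_weight=0.5, high_factor=1.10):
            self.sum = np.zeros(shape)       # sum frame: accepted reads only
            self.good = np.zeros(shape)      # 'good read' frame: per-pixel counts
            self.mean = None                 # running mean frame (needs initialization)
            self.mean_weight = mean_weight   # assumed EMA weight (~few-second memory)
            self.high_factor = high_factor   # assumed 'high' threshold: >10% above the mean

        def add_read(self, frame):
            """Compare a read against the running mean, zero flagged pixels, accumulate."""
            frame = frame.astype(float)
            if self.mean is None:
                ok = np.ones(frame.shape, dtype=bool)       # no filtering until initialized
            else:
                ok = frame <= self.high_factor * self.mean  # pixels not tagged as aerosol
            self.sum += np.where(ok, frame, 0.0)            # tagged pixels contribute zero
            self.good += ok

        def dump(self):
            """Called when this state's image is due on disk (every ~2 s): average the
            accepted reads, update the running mean from the processed image, and reset."""
            avg = self.sum / np.maximum(self.good, 1)
            if self.mean is None:
                self.mean = avg              # initialization of the running mean
            else:
                self.mean = (1 - self.mean_weight) * self.mean + self.mean_weight * avg
            self.sum[:] = 0.0
            self.good[:] = 0.0
            return avg

In practice one such set of buffers would exist for each of the eight camera/state streams, fed by the read loop and dumped on the 250ms disk-write schedule.
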
There are many possible algorithms for aerosol removal. For example, the running mean could
be replaced by a reference image taken at intervals over the day. If it is computationally too
intensive to remove aerosols on a read-by-read basis, the problem can be scaled down to a level
where removal is possible. For example, two sequential reads (in the same state) can simply be
added together and the above algorithm then performed. If this is still too intensive, then three frames
could be summed before aerosol removal (and so on).
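
A sketch of this scaled-down variant, reusing the hypothetical StateBuffers class from the previous sketch (the division by the pre-sum length is an added detail that keeps the running-mean comparison on a per-read scale; ‘simply added’ differs only by that constant factor):

    import numpy as np

    def presum_then_filter(buffers, reads, presum=2):
        """Average `presum` sequential reads of the same state, then apply the
        aerosol filter to the result rather than to every individual read."""
        for i in range(0, len(reads) - presum + 1, presum):
            block = np.mean(np.stack(reads[i:i + presum]).astype(float), axis=0)
            buffers.add_read(block)
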