EDGE-BASED PEAK POSITION SEARCH ALGORITHM FOR PET DETECTORS
Kun Di
B.S., University of Science and Technology of China, 1992
M.B.A., International University in Germany, 2003
PROJECT
Submitted in partial satisfaction of
the requirements for the degree of
MASTER OF SCIENCE
in
COMPUTER SCIENCE
at
CALIFORNIA STATE UNIVERSITY, SACRAMENTO
SPRING
2011
EDGE-BASED PEAK POSITION SEARCH ALGORITHM FOR PET DETECTORS
A Project
by
Kun Di
Approved by:
__________________________________, Committee Chair
Chung-E Wang, Ph.D.
__________________________________, Second Reader
Ted Krovetz, Ph.D.
____________________________
Date
Student: Kun Di
I certify that this student has met the requirements for format contained in the University format
manual, and that this project is suitable for shelving in the Library and credit is to be awarded for
the Project.
__________________________, Graduate Coordinator
Nikrouz Faroughi, Ph.D.
Department of Computer Science
________________
Date
Abstract
of
EDGE-BASED PEAK POSITION SEARCH ALGORITHM FOR PET DETECTORS
by
Kun Di
Generating position profiles for block detectors is an important step in the calibration process of
a Positron Emission Tomography (PET) scanner. Automatic peak detection is desirable because
manually selecting each peak is time-consuming, especially for large crystal arrays (e.g., 20x20).
In this project, an edge-based approach is proposed. The proposed algorithm utilizes the
information on the edges and identifies the region containing each single light spot. The peaks
are then detected in each region. A GUI tool is provided for users to visualize the results and to
correct individual peaks manually when the automatic peak search makes an error. The
algorithm and tool have been tested on 61 sample images. The results are satisfactory, with an
average accuracy of over 95%.
_______________________, Committee Chair
Chung-E Wang, Ph.D.
_______________________
Date
ACKNOWLEDGMENTS
Dedicated to my family and friends who have filled my life with happiness and high expectation
I would like to thank Dr. Chung-E Wang and Dr. Ted Krovetz for their direction, assistance, and
guidance.
I also wish to thank Dr. Yibao Wu and Dr. Yongfeng Yang, whose recommendations and
suggestions have been invaluable for the project.
Special thanks should be given to Dr. Simon Cherry who supported me in many ways.
Finally, words alone cannot express the thanks I owe to Dr. Ting Wang, my wife, for her
encouragement and assistance, and to my son, Yangming Alexander Di, for his endless love and
happiness.
TABLE OF CONTENTS
Page
Acknowledgments..................................................................................................................... v
List of Tables ......................................................................................................................... vii
List of Figures ....................................................................................................................... viii
Chapter
1. INTRODUCTION ............................................................................................................... 1
1.1 Positron Emission Tomography (PET) Scanner .................................................... 1
1.2 Calibration and Position Profile ............................................................................ 2
1.3 Identification of Peak Position and Previous Work ............................................... 3
1.4 Aim of This Project................................................................................................ 3
2. EDGES AND INFORMATION ON EDGES .............................................................. 4
2.1 Canny Edge ............................................................................................................. 4
2.2 Edge Information ................................................................................................... 5
2.3 Special Points on an Edge ...................................................................................... 7
2.4 Identify the Missing Special Points ....................................................................... 8
3. ALGORITHM ................................................................................................................ 10
3.1 Generating Canny Edges...................................................................................... 10
3.2 Identifying the Region of Each Light Spot .......................................................... 10
3.3 Searching the Peaks ............................................................................................. 12
4. USER INTERFACE ...................................................................................................... 13
4.1 Coding and Environment ..................................................................................... 13
4.2 How to Use the GUI ............................................................................................ 14
5. TEST RESULT AND SUMMARY ............................................................................ 16
5.1 Test Result ........................................................................................................... 16
5.2 Summary .............................................................................................................. 17
Appendix A. Test Data ........................................................................................................ 18
Appendix B. Algorithm to Sort Peaks ................................................................................ 22
Bibliography ............................................................................................................................ 25
LIST OF TABLES
Page
1. Table 1 Mapping rule for direction transformation …….…………………………….. 6
2. Table 2 Definitions of the special points ………….……………………………….... 7
3. Table 3 Test Data ………….…………………………………………………….... 18
LIST OF FIGURES
Page
1. Figure 1 A detector block, a ring of a PET scanner and a PET acquisition process ……. 1
2. Figure 2 (a) A PET detector ring with point source at the center, (b) the gamma
rays hit a detector block, (c) the flood image of a detector block showing
the histogram of hits………………………….……………………………. 2
3. Figure 3 An image of a light spot………….………………………………………… 4
4. Figure 4 The gradient values of each pixel .…………………………………………. 5
5. Figure 5 (a) A light spot enclosed by an ideal edge, (b) a light spot with a open
edge curve, (c) two light spots enclosed by one edge …………………………6
6. Figure 6 An example that LEFT is missing on the edge ……………………………….8
7. Figure 7 A flood image, some edges and special points………………………………. 9
8. Figure 8 Graphic User Interface ………….………………………………………….13
9. Figure 9 Some flood images and the peaks found after automatic search……………… 16
10. Figure 10 Sorted peaks of a flood image…………………………………………… 22
11. Figure 11 Rotated flood image ……………………………………………………23
12. Figure 12 Relationship of four neighbor peaks ……………………………………24
Chapter 1
INTRODUCTION
1.1 Positron Emission Tomography (PET) Scanner
Positron emission tomography (PET) scanners have been developed and applied to clinical
practice for years. The PET technique is a nuclear medicine imaging technique [1]. A radioactive
tracer, which is usually fludeoxyglucose (18F-FDG) [2], is injected into the human body. The
positron emitter 18F in the FDG decays and generates positrons. A positron travels a very short
distance in tissue and annihilates with an electron. The annihilation can produce two or more gamma-ray
photons. If there are two gamma rays, each has an energy of 511 keV and their directions are
approximately opposite. Because the two gamma rays are emitted at almost 180 degrees to each
other, the annihilation position can be localized along a straight line of coincidence.
As shown in Figure 1, a PET detector block consists of scintillator crystals coupled to
photomultiplier tubes. Adjacent PET detector blocks form a ring that surrounds the patient. The
coincidence processor collects the coincidence events and reports them to the host computer,
which reconstructs images from the coincidence statistics.
Figure 1. A detector block, a ring of a PET scanner and a PET acquisition process [1]
1.2 Calibration and Position Profile
In order to achieve good accuracy, PET users have to calibrate the system before using it.
Generating the position profile of each detector block is one of the calibration procedures. As
shown in Figure 2, a radioactive point source is used to generate flood images of the detector
blocks. The gamma rays hit each crystal in the detector blocks. The photomultiplier tubes
transform each photon hit into an electric pulse. After these electric signals are processed, the
flood images are generated for the detector blocks.
In a flood image, the brightness of each pixel represents the number of hits at that position.
From a flood image, the position profile is created so that a hit on each pixel can be assigned to
the corresponding crystal. For example, if a hit falls in area A, it belongs to crystal 1; a hit in
area B belongs to crystal 2.
Normally, the position profiles are stored in the form of lookup tables. As in Figure 2.c, the white
lines divide the flood image into adjacent cells. Each cell is assigned a number corresponding to
a crystal, and every pixel in a cell carries the cell's number.
Figure 2. (a) A PET detector ring with point source at the center, (b) the gamma rays hit a
detector block, (c) the flood image of a detector block showing the histogram of hits.
1.3 Identification of Peak Position and Previous Work
Because the brightness of each pixel represents the number of hits, we take the brightest pixel as
the center of the corresponding crystal and call that pixel a peak. To generate a lookup table for a
detector block, finding the peak positions is the most important step. Automatic peak detection is
desirable because manually selecting each peak is time-consuming, especially for large crystal
arrays (e.g., 20x20). The two main obstacles to automatic peak detection are distortion and peak
fusion.
Several algorithms have been developed to assist users in generating lookup tables. [3] proposed a
neural-network-based algorithm, [4] used nonrigid registration to a Fourier-based template, [5]
developed a PCA-based algorithm, and [6] and [7] utilized morphological algorithms.
1.4 Aim of This Project
In a flood image, the light spots represent the corresponding crystals. We can first use edges to
enclose each light spot and then find the peak position in each enclosed area. The edge-based
algorithm proposed in this project completely eliminates the interference of crystal distortion and
tries to overcome the difficulties of peak fusion. Compared with [3] and [5], this algorithm does not
need historical data or any training data set. Compared with [4], this algorithm does not need any
single-channel readout from the PMTs. This algorithm needs only one flood image.
Chapter 2
EDGES AND INFORMATION ON EDGES
2.1 Canny Edge
Edges are the points at which the image brightness changes sharply [9]. Edges contain rich
information about the image's properties. There are many edge detection techniques; this project
chooses the Canny edge detection method because it tends to generate more continuous edges
than other methods.
The following paragraphs briefly illustrate how Canny edge detection works. Figure 3 shows
an image of a light spot after Gaussian smoothing. The number in each pixel represents the
brightness of that pixel. Figure 4 shows the gradients of the image in Figure 3. The formula used
to calculate the gradient of each pixel is as follows:

g_{x,y} = sqrt((b_{x-1,y} - b_{x+1,y})^2 + (b_{x,y-1} - b_{x,y+1})^2)

where g_{x,y} is the gradient of pixel (x, y) and b_{x,y} is the brightness of pixel (x, y).
Figure 3. An image of a light spot
Figure 4. The gradient values of each pixel
According to [8], the possible edge points are selected after non-maximum suppression. In short,
the non-maximum suppression method compares the gradient of a point with its neighbors and
selects it as a possible edge point only when it has the maximum local gradient.
Canny edge detection uses two thresholds to filter the candidate edge points. If the gradient of a
point A is greater than the high threshold, point A is an edge point. If the gradient of a point B is
greater than the low threshold and point B is a neighbor of an edge point, then point B is also an
edge point. In Figure 4, the point in the red circle is an edge point.
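The following C fragment is a minimal sketch of these two steps, the gradient computation from
the formula above and the two-threshold test; the image size, the array layout and the threshold
handling are illustrative assumptions, not the project's actual code.

#include <math.h>

#define N 256   /* assumed flood image size (n x n) */

/* Gradient magnitude of pixel (x, y) of the smoothed image b, following the
   formula in Section 2.1 (border pixels are assumed to be skipped by the caller). */
double gradient(const double b[N][N], int x, int y)
{
    double dx = b[y][x - 1] - b[y][x + 1];   /* b_{x-1,y} - b_{x+1,y} */
    double dy = b[y - 1][x] - b[y + 1][x];   /* b_{x,y-1} - b_{x,y+1} */
    return sqrt(dx * dx + dy * dy);
}

/* Two-threshold classification: 2 = edge point (above the high threshold),
   1 = weak candidate (kept later only if it neighbors an edge point), 0 = none. */
int classify(double g, double low, double high)
{
    if (g > high) return 2;
    if (g > low)  return 1;
    return 0;
}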
2.2 Edge Information
The gradient direction of each edge point is very important for identifying the light spot region.
First, we group the gradient directions of the edge points into four categories. Let b_{x,y} stand
for the brightness of pixel (x, y) and θ_{x,y} stand for the angle (in degrees) between the gradient
direction and the X-axis:

θ_{x,y} = arctan((b_{x-1,y} - b_{x+1,y}) / (b_{x,y-1} - b_{x,y+1}))

According to Table 1, we can map a set of edges to a data set containing the edge directions d_{x,y}.
Figure 5 shows some edges using the edge directions.
Table 1. Mapping rule for direction transformation

    θ_{x,y} (degrees)            d_{x,y}
    -90  < θ_{x,y} ≤    0           1
    -180 < θ_{x,y} ≤  -90           2
      90 < θ_{x,y} ≤  180           3
       0 < θ_{x,y} ≤   90           4
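The mapping in Table 1 could be implemented as in the following sketch; the atan2-based angle
computation, the array layout and the function name are assumptions made for illustration.

#include <math.h>

#define N 256   /* assumed image size */

/* Map the gradient angle of pixel (x, y) (in degrees, within (-180, 180]) to the
   direction code d_{x,y} in {1, 2, 3, 4} according to Table 1.  atan2 is used so
   that the full angular range of the table is covered. */
int direction_code(const double b[N][N], int x, int y)
{
    const double PI = 3.14159265358979323846;
    double num   = b[y][x - 1] - b[y][x + 1];   /* b_{x-1,y} - b_{x+1,y} */
    double den   = b[y - 1][x] - b[y + 1][x];   /* b_{x,y-1} - b_{x,y+1} */
    double theta = atan2(num, den) * 180.0 / PI;

    if (theta > -90.0  && theta <=   0.0) return 1;
    if (theta > -180.0 && theta <=  -90.0) return 2;
    if (theta >  90.0  && theta <=  180.0) return 3;
    return 4;                                   /* 0 < theta <= 90 */
}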
The edge direction image shows some patterns. An ideal edge of a light spot is a closed curve
with 1s, 2s, 3s and 4s arranged clockwise (Figure 5.a). As Figure 5.b shows, some light spots are
not enclosed by closed edges. An even worse case is shown in Figure 5.c, where two light spots
are enclosed by one edge.
Figure 5. (a) A light spot enclosed by an ideal edge, (b) a light spot with an open edge
curve, (c) two light spots enclosed by one edge
2.3 Special Points on an Edge
In order to identify the region of a light spot, we define several special points (SP) on an edge
curve. The special points are TOP, BOTTOM, LEFT, RIGHT, HORIZONTAL_BREAK (HB)
and VERTICAL_BREAK (VB). These points are grouped into two categories: TOP, BOTTOM,
LEFT and RIGHT are basic SPs; HORIZONTAL_BREAK and VERTICAL_BREAK are
breaking SPs. The definitions of these points are given in Table 2.
Table 2. Definitions of the special points

    d_{x,y}   Condition                                                 Special point
    1         d_{x+1,y} = 2 or d_{x+1,y-1} = 2 or d_{x+1,y+1} = 2       TOP
    2         d_{x,y-1} = 3 or d_{x-1,y-1} = 3 or d_{x+1,y-1} = 3       RIGHT
    3         d_{x-1,y} = 4 or d_{x-1,y-1} = 4 or d_{x-1,y+1} = 4       BOTTOM
    4         d_{x,y+1} = 1 or d_{x-1,y+1} = 1 or d_{x+1,y+1} = 1       LEFT
    1         d_{x-1,y} = 2 or d_{x-1,y-1} = 2 or d_{x-1,y+1} = 2       HORIZONTAL_BREAK
    2         d_{x,y+1} = 3 or d_{x-1,y+1} = 3 or d_{x+1,y+1} = 3       VERTICAL_BREAK
    3         d_{x+1,y} = 4 or d_{x+1,y-1} = 4 or d_{x+1,y+1} = 4       HORIZONTAL_BREAK
    4         d_{x,y-1} = 1 or d_{x-1,y-1} = 1 or d_{x+1,y-1} = 1       VERTICAL_BREAK
2.4 Identify the Missing Special Points
A region of a light spot has to contain TOP, BOTTOM, LEFT and RIGHT on the edge curve.
However, one or more of those basic SPs may be missing because either the edge curve is not
closed or one curve encloses two or more light spots (e.g. Figure 5.b and Figure 5.c). In this
project, our algorithm discards any edge on which two or more basic SPs are missing. In other
words, if an edge has only one basic SP missing, we calculate the missing SP; otherwise we
discard the edge. This strategy avoids wrong edges, but it also risks false negatives.
Calculating the missing SP is based on a symmetry rule. Figure 6 shows an example in which one
SP (LEFT) is missing. Let x_A stand for the x coordinate of a point A and y_A stand for its y
coordinate. The missing LEFT point L is then computed from the TOP (T), RIGHT (R) and
BOTTOM (B) points as

x_L = x_T - (x_R - x_B)
y_L = y_T - (y_R - y_B)

Figure 6. An example in which LEFT is missing on the edge
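A minimal sketch of this symmetry rule in C (the point structure and the function name are
assumptions for illustration):

typedef struct { int x, y; } point;

/* Reconstruct a missing LEFT special point from TOP, RIGHT and BOTTOM using
   x_L = x_T - (x_R - x_B) and y_L = y_T - (y_R - y_B). */
point missing_left(point top, point right, point bottom)
{
    point left;
    left.x = top.x - (right.x - bottom.x);
    left.y = top.y - (right.y - bottom.y);
    return left;
}

The other three basic SPs can be reconstructed with the analogous symmetric formulas.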
HORIZONTAL_BREAK may serve as a substitute for LEFT or RIGHT, and VERTICAL_BREAK
may serve as a substitute for TOP or BOTTOM. For example, in Figure 5.c, the
VERTICAL_BREAKs (i.e., the 1 and 3 in the circles) may take the place of the LEFT or the RIGHT.
Figure 7 shows some special points. The basic special points are in circles and the breaking
special points are in squares. With these special points, we can now separate two fused light
spots.
Figure 7. A flood image, some edges and special points
Chapter 3
ALGORITHM
3.1 Generating Canny Edges
1. Input a raw flood image (n x n pixels, c x r crystals).
2. Reduce noise with a Gaussian filter.
3. Generate a gradient magnitude map (n x n) and a gradient direction map (n x n).
4. Apply non-maximum suppression.
5. Trace edges through the image with hysteresis thresholding.
6. Generate a map (n x n) in which edge pixels are marked with their gradient directions
   (i.e., 1, 2, 3 and 4) and all other pixels are marked 0.
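Step 4 is the part of the pipeline that is easiest to get wrong, so a sketch is given below; it
assumes the magnitude map and the gradient angle (in degrees) from step 3 are available, and the
quantization into four comparison axes is a common implementation choice rather than the
project's exact code.

#define N 256   /* assumed image size */

/* Non-maximum suppression (step 4): keep pixel (x, y) only if its gradient
   magnitude is a local maximum along the gradient direction, quantized here to
   one of four axes (horizontal, vertical and the two diagonals). */
int is_local_maximum(const double mag[N][N], const double ang[N][N], int x, int y)
{
    double a = ang[y][x];
    int dx = 1, dy = 0;                                     /* horizontal axis    */

    if (a < 0.0) a += 180.0;                                /* fold into [0, 180] */
    if      (a >= 22.5  && a < 67.5)  { dx = 1;  dy = -1; } /* 45-degree axis     */
    else if (a >= 67.5  && a < 112.5) { dx = 0;  dy = -1; } /* vertical axis      */
    else if (a >= 112.5 && a < 157.5) { dx = -1; dy = -1; } /* 135-degree axis    */

    return mag[y][x] >= mag[y + dy][x + dx] &&
           mag[y][x] >= mag[y - dy][x - dx];
}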
3.2 Identifying the Region of Each Light Spot
Find and mark the special points on each edge according to the rules in Table 2.

for each edge {
    for each basic special point SP on the edge {
        switch (SP) {
        case TOP:
            find the closest LEFT or HB on its left along the edge;
            find the closest RIGHT or HB on its right along the edge;
            find the closest BOTTOM or VB below it along the edge;
            if one of the three points above is missing, calculate it;
            break;
        case LEFT:
            find the closest TOP or VB above it along the edge;
            find the closest RIGHT or HB on its right along the edge;
            find the closest BOTTOM or VB below it along the edge;
            if one of the three points above is missing, calculate it;
            break;
        case RIGHT:
            find the closest LEFT or HB on its left along the edge;
            find the closest TOP or VB above it along the edge;
            find the closest BOTTOM or VB below it along the edge;
            if one of the three points above is missing, calculate it;
            break;
        case BOTTOM:
            find the closest LEFT or HB on its left along the edge;
            find the closest RIGHT or HB on its right along the edge;
            find the closest TOP or VB above it along the edge;
            if one of the three points above is missing, calculate it;
            break;
        }
        if all four SPs are found (or calculated) {
            use the four SPs to form a rectangular (or square) region;
            add the region to the region list;
        }
    }
}
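The region-forming step at the end of this loop could look like the following C sketch; the point
and region types and the bounding-box construction are assumptions made for illustration.

typedef struct { int x, y; } point;
typedef struct { int x0, y0, x1, y1; } region;   /* inclusive bounding box */

static int imin(int a, int b) { return a < b ? a : b; }
static int imax(int a, int b) { return a > b ? a : b; }

/* Form the rectangular region spanned by the four basic special points. */
region form_region(point top, point bottom, point left, point right)
{
    region r;
    r.x0 = imin(imin(top.x, bottom.x), imin(left.x, right.x));
    r.x1 = imax(imax(top.x, bottom.x), imax(left.x, right.x));
    r.y0 = imin(imin(top.y, bottom.y), imin(left.y, right.y));
    r.y1 = imax(imax(top.y, bottom.y), imax(left.y, right.y));
    return r;
}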
3.3 Searching the Peaks
create a hit-sum list whose size is c x r;
create a peak list whose size is c x r;
for each region in the region list {
    calculate the sum of hits (i.e., brightness or pixel values) in this region;
    if the sum is greater than the smallest member in the hit-sum list {
        replace the smallest sum with this sum;
        update the peak list with the brightest point in this region;
    }
}
output the peak list;
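The pseudocode above can be turned into C roughly as follows; the types, the flood-image layout
and the linear scan for the smallest stored sum are illustrative assumptions.

#define N 256   /* assumed flood image size */

typedef struct { int x, y; } point;
typedef struct { int x0, y0, x1, y1; } region;

/* Keep the n_peaks (= c x r) regions with the largest hit sums and record the
   brightest pixel of each kept region as its peak. */
void search_peaks(const double img[N][N], const region *regions, int n_regions,
                  point *peaks, double *hit_sums, int n_peaks)
{
    for (int i = 0; i < n_peaks; i++)
        hit_sums[i] = -1.0;                            /* mark slots as empty */

    for (int r = 0; r < n_regions; r++) {
        double sum = 0.0, best = -1.0;
        point brightest = { regions[r].x0, regions[r].y0 };

        for (int y = regions[r].y0; y <= regions[r].y1; y++)
            for (int x = regions[r].x0; x <= regions[r].x1; x++) {
                sum += img[y][x];
                if (img[y][x] > best) { best = img[y][x]; brightest.x = x; brightest.y = y; }
            }

        /* Replace the smallest stored sum if this region's sum is larger. */
        int smallest = 0;
        for (int i = 1; i < n_peaks; i++)
            if (hit_sums[i] < hit_sums[smallest]) smallest = i;

        if (sum > hit_sums[smallest]) {
            hit_sums[smallest] = sum;
            peaks[smallest]    = brightest;
        }
    }
}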
Chapter 4
USER INTERFACE
4.1 Coding and Environment
The functional part of the code is written in standard C, so it can easily be reused. The user
interface part utilizes library functions provided by the LabWindows compiler. The distribution
kit created by the compiler has been tested on Windows XP.
The user interface takes a 256 x 256 image as input. It can display the peaks and numbers. It can
also output a text file containing peak positions.
Figure 8. Graphical User Interface
4.2 How to Use the GUI
o Frame 1
   - Input the number of crystal rows and columns.
   - Click "Load Image" to load a 256 x 256 flood image.
   - Users can click "Load Peaks" to load previously saved peak marks.
o Frame 2
   - Click "Peaks Detect" to search for peaks automatically.
   - Users can click "Peaks Pinned" to keep the peaks currently on the screen unchanged during
     the next automatic search.
   - Users can choose to show the original image or the smoothed image, and can also select
     whether to display the edges.
   - Users can change the parameters, such as the high threshold or sigma (for the Gaussian
     filter).
   - Users can manually correct the peaks: add a peak by double-clicking the left mouse button,
     or remove a peak by double-clicking the right mouse button.
o Frame 3
   - Click "Sort Peaks" to sort the peaks automatically (e.g., the sorted numbers in Figure 8).
     Be sure that all the peaks have been marked, either automatically or manually, before
     clicking this button.
   - Click "Clear Number" to remove the numbers.
   - The algorithm for sorting peaks is in Appendix B. The algorithm uses one of the two
     diagonals to rotate the image. This may not work for some asymmetric images. Therefore,
     users may manually input the rotation degree to sort the peaks.
o Frame 4
   - Users may click "Save" to save the peak positions in a text file.
Chapter 5
TEST RESULT AND SUMMARY
5.1 Test Result
The algorithm has been tested on 61 flood images. Figure 9 shows some typical results. Missing
peaks are marked by blue arrows (with the arrowhead pointing to the top-right). An incorrectly
found peak is marked by a yellow arrow (with the arrowhead pointing to the top-left). The results
show that highly fused peaks cause missed peaks, while noise causes wrong detections. The full
data table is in Appendix A. The average error rate is 4.5%.
Figure 9. Some flood images and the peaks found after automatic search
5.2 Summary
The algorithm completely removes the effect of distortion. However, because the algorithm
depends on the quality of the detected edges, the results may not be satisfactory if the image
quality is poor and the edge detector cannot generate good edges.
APPENDIX A
Test Data
Table 3. Test Data

Total Peaks   Found   Correct   False Positives   False Negatives   FP Rate   FN Rate   Error Rate
100           100     95        5                 0                 5.00%     0.00%     5.00%
100           97      94        3                 3                 3.09%     3.00%     6.00%
100           100     99        1                 0                 1.00%     0.00%     1.00%
100           97      97        0                 3                 0.00%     3.00%     3.00%
100           93      93        0                 7                 0.00%     7.00%     7.00%
100           98      97        1                 2                 1.02%     2.00%     3.00%
100           93      93        0                 7                 0.00%     7.00%     7.00%
100           95      95        0                 5                 0.00%     5.00%     5.00%
100           100     98        2                 0                 2.00%     0.00%     2.00%
100           96      96        0                 4                 0.00%     4.00%     4.00%
100           100     100       0                 0                 0.00%     0.00%     0.00%
100           100     100       0                 0                 0.00%     0.00%     0.00%
100           99      99        0                 1                 0.00%     1.00%     1.00%
100           91      91        0                 9                 0.00%     9.00%     9.00%
100           93      93        0                 7                 0.00%     7.00%     7.00%
100           96      96        0                 4                 0.00%     4.00%     4.00%
100           98      98        0                 2                 0.00%     2.00%     2.00%
100           80      80        0                 20                0.00%     20.00%    20.00%
100           97      96        1                 3                 1.03%     3.00%     4.00%
100           99      98        1                 1                 1.01%     1.00%     2.00%
100           98      98        0                 2                 0.00%     2.00%     2.00%
100           97      97        0                 3                 0.00%     3.00%     3.00%
100           92      92        0                 8                 0.00%     8.00%     8.00%
100           96      96        0                 4                 0.00%     4.00%     4.00%
100           90      89        1                 10                1.11%     10.00%    11.00%
100           91      91        0                 9                 0.00%     9.00%     9.00%
100           96      92        3                 4                 3.13%     4.00%     7.00%
100           92      92        0                 8                 0.00%     8.00%     8.00%
100           97      97        0                 3                 0.00%     3.00%     3.00%
100           92      92        0                 8                 0.00%     8.00%     8.00%
100           98      98        0                 2                 0.00%     2.00%     2.00%
100           100     100       0                 0                 0.00%     0.00%     0.00%
100           92      92        0                 8                 0.00%     8.00%     8.00%
100           96      96        0                 4                 0.00%     4.00%     4.00%
100           94      94        0                 6                 0.00%     6.00%     6.00%
100           93      93        0                 7                 0.00%     7.00%     7.00%
100           92      92        0                 8                 0.00%     8.00%     8.00%
100           99      99        0                 1                 0.00%     1.00%     1.00%
100           95      94        1                 5                 1.05%     5.00%     6.00%
100           94      90        4                 6                 4.26%     6.00%     10.00%
100           100     100       0                 0                 0.00%     0.00%     0.00%
100           98      98        0                 2                 0.00%     2.00%     2.00%
100           99      99        0                 1                 0.00%     1.00%     1.00%
100           92      92        0                 8                 0.00%     8.00%     8.00%
100           88      88        0                 12                0.00%     12.00%    12.00%
100           100     98        2                 0                 2.00%     0.00%     2.00%
100           91      91        0                 9                 0.00%     9.00%     9.00%
100           99      99        0                 1                 0.00%     1.00%     1.00%
81            81      81        0                 0                 0.00%     0.00%     0.00%
169           169     169       0                 0                 0.00%     0.00%     0.00%
169           169     169       0                 0                 0.00%     0.00%     0.00%
169           169     169       0                 0                 0.00%     0.00%     0.00%
49            49      49        0                 0                 0.00%     0.00%     0.00%
49            49      49        0                 0                 0.00%     0.00%     0.00%
49            49      48        1                 0                 2.04%     0.00%     2.04%
49            49      48        1                 0                 2.04%     0.00%     2.04%
49            49      48        1                 0                 2.04%     0.00%     2.04%
49            49      47        2                 0                 4.08%     0.00%     4.08%
49            49      47        2                 0                 4.08%     0.00%     4.08%
49            48      45        3                 1                 6.25%     2.04%     8.16%
49            47      44        3                 2                 6.38%     4.08%     10.20%
Average                                                             0.86%     3.66%     4.50%
APPENDIX B
Algorithm to Sort Peaks
Sorting the peaks into the order shown in Figure 10 seems to be a non-trivial task. However, we
can make it very easy by rotating the image by 45 degrees (Figure 11). Now the peak pattern
looks like a diamond.
Figure 10. Sorted peaks of a flood image
Figure 11. Rotated flood image
The points A, B, C and D are easy to find. For this example (13 x 13 crystals), A is peak 1, B is
peak 13, C is peak 169 and D is peak 157.
Let O stand for a peak, O.x stand for O's x coordinate, O.y stand for O's y coordinate, and O.n
stand for its order number.
Start from A;
while A is not B {
    among all peaks O except A with O.y ≥ A.y, pick the one for which (O.x – A.x) is minimum;
    O.n = A.n + 1;
    A = O;
}
The pseudocode above marks the peak numbers along the upper-left side of the diamond (from A
to B). We can mark the other three sides similarly. Once we have marked the four sides of the
diamond, we remove the marked peak points. This step is like peeling an onion. Then we can do
the same job on the smaller diamond, until we reach the center.
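A C sketch of one "peel" pass is shown below: it rotates the peak coordinates by 45 degrees and
then numbers the peaks along the upper-left side of the diamond. The peak structure, the rotation
about the image center and the "skip already numbered peaks" rule are one reasonable
interpretation of the pseudocode above, not the tool's actual implementation.

#include <math.h>

typedef struct { double x, y; int n; } peak;   /* n = 0 means not yet numbered */

/* Rotate all peak coordinates by deg degrees around the center (cx, cy). */
void rotate_peaks(peak *p, int count, double deg, double cx, double cy)
{
    double rad = deg * 3.14159265358979323846 / 180.0;
    double c = cos(rad), s = sin(rad);
    for (int i = 0; i < count; i++) {
        double dx = p[i].x - cx, dy = p[i].y - cy;
        p[i].x = cx + dx * c - dy * s;
        p[i].y = cy + dx * s + dy * c;
    }
}

/* Number the peaks from corner A to corner B: starting at A (already numbered),
   repeatedly pick, among the unnumbered peaks with O.y >= A.y, the one with the
   smallest O.x - A.x, give it the next order number, and continue from it. */
void mark_side(peak *p, int count, int a_idx, int b_idx)
{
    int cur = a_idx;
    while (cur != b_idx) {
        int next = -1;
        for (int i = 0; i < count; i++) {
            if (i == cur || p[i].n != 0) continue;           /* skip numbered peaks */
            if (p[i].y >= p[cur].y &&
                (next < 0 || p[i].x - p[cur].x < p[next].x - p[cur].x))
                next = i;
        }
        if (next < 0) break;                                 /* no candidate left   */
        p[next].n = p[cur].n + 1;
        cur = next;
    }
}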
This algorithm works in almost all cases. It will only fail in an extremely bad scenario, such as
when B' takes the place of B in Figure 12. If A, B, C and D represent the light spots of four
neighboring crystals, B rarely falls inside the triangle ACD unless the crystals are bad.
Figure 12. Relationship of four neighbor peaks
BIBLIOGRAPHY
[1] http://en.wikipedia.org/wiki/Positron_emission_tomography
[2] http://en.wikipedia.org/wiki/Fludeoxyglucose_(18F)
[3] D. Hu, B. Atkins, M. Lenox, B. Castleberry, and S. Siegel, “A neural network based algorithm
for building crystal look-up table of PET block detector,” in Proc. IEEE Nuclear Science
Symp. Conf. Rec., Nov. 2006, vol. 4, pp. 2458–2461.
[4] A. Chaudhari, A. Joshi, S. Bowen, R. Leahy, S. Cherry, and R. Badawi, “Crystal identification
in positron emission tomography using nonrigid registration to a Fourier-based template,”
Phys. Med. Biol., vol. 53, no. 18, pp. 5011–5027, Sep. 2008.
[5] J. Breuer and K. Wienhard, "PCA-based algorithm for generation of crystal lookup tables for
PET block detectors," IEEE Transactions on Nuclear Science, vol. 56, no. 3, pp. 602-607,
Jun. 2009.
[6] A. Mao, "Positron Emission Tomograph Detector Module Calibration Through Morphological
Algorithms and Interactive Correction," Imaging Physics Laboratory, Nuclear Medicine
Department, Clinical Center, National Institutes of Health.
[7] Z. Hu, C. Kao, W. Liu, Y. Dong, Z. Zhang, Q. Xie, C. Chen, "Semi-Automatic Position
Calibration for a Dual-Head Small Animal PET Scanner," 2007 IEEE Nuclear Science
Symposium Conference Record, pp. 1618-1621.
[8] J. Canny, "A computational approach to edge detection," IEEE Trans. Pattern Analysis and
Machine Intelligence, vol. 8, no. 6, pp. 679-698, 1986.
[9] http://en.wikipedia.org/wiki/Edge_detection