Investigations of Estimating Densities by Parzen Windows
and k-Nearest-Neighbor
J. R. Fairley
Mississippi State University Electrical Engineering Department
Mississippi State, MS 39762
Abstract
This work presents two nonparametric methods, Parzen Windows and k-nearest-neighbor, for
estimating a density function when its underlying form is unknown, and applies them to
classify test points into one of three categories given training samples of known
features for each class. The goal is to demonstrate that these techniques can attain a
good representation of the density function of the training data, from which an
appropriate classifier can be determined. Each category is described by three features,
and both techniques are exercised as classifiers on all three categories.
1. Introduction
The Parzen Window approach uses a spherical Gaussian window function, φ(u), with hn =
1.0 for the training data in each of the three categories to classify a given set of
known points; the procedure is then repeated with hn = 0.1. This approach estimates the
density function for each category and then classifies a point based on the computed
densities. The window function need not be limited to the one used in this work; it
simply weights the contribution of each sample to the estimate according to its distance
from x. The decision regions of a Parzen-window classifier do depend on the choice of
window function. In this approach, the densities are estimated for each category, and a
test point is assigned the label corresponding to the maximum posterior.
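As a minimal sketch of this decision rule (the density values below are purely illustrative, not results from this work), a point is assigned to the category whose estimated density is largest, which under equal priors is the maximum-posterior decision:

p = [0.13 0.47 0.40];     % illustrative density estimates for w1, w2, w3 (hypothetical values)
[~, category] = max(p);   % index of the winning class under equal priors
fprintf('Test point assigned to category %d\n', category);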
For the case when it is not readily apparent what the “best” window function is, the
solution can be to let the cell volume be a function of the training data rather than
some arbitrary function of the samples. In the work presented, the cell about x is grown
until kn samples are captured within a volume Vn; this is the nonparametric method of
k-nearest-neighbors. With this method, if the density is high near x, the cell will be
small, which leads to good resolution, and vice versa. Density estimates were computed
for k = 1, 3, and 5.
2. Brief Discussion of Theory
In general, the Parzen Window estimate of the density is computed by the following
equation:

$$p_n(\mathbf{x}) = \frac{1}{n}\sum_{i=1}^{n}\frac{1}{V_n}\,\varphi\!\left(\frac{\mathbf{x}-\mathbf{x}_i}{h_n}\right),$$
where the volume $V_n$ is of the form

$$V_n = h_n^d,$$

and $d$ is the dimension of the hypercube and $h_n$ is the length of an edge of that hypercube.
The window function $\varphi(\mathbf{u})$ is a spherical Gaussian of the form

$$\varphi\!\left(\frac{\mathbf{x}-\mathbf{x}_i}{h_n}\right) = \exp\!\left[-\frac{(\mathbf{x}-\mathbf{x}_i)^t(\mathbf{x}-\mathbf{x}_i)}{2h_n^2}\right].$$
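As a minimal sketch of how this estimate might be computed (a simplified illustration consistent with the equations above, not the full program of Section 5; x is assumed to be a d-by-1 column vector and X an n-by-d matrix of training samples, one sample per row):

function p = parzen_estimate(x, X, hn)
% Parzen-window density estimate at x with a spherical Gaussian window.
    [n, d] = size(X);
    Vn = hn^d;                              % hypercube volume Vn = hn^d
    p = 0;
    for i = 1:n
        u = x - X(i,:)';                    % x - xi
        p = p + exp(-(u'*u)/(2*hn^2))/Vn;   % (1/Vn)*phi((x - xi)/hn)
    end
    p = p/n;                                % average over the n samples
end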
In general, the k-nearest-neighbor estimate of the density is computed by the following
equation:

$$p_n(\mathbf{x}) = \frac{k_n/n}{V_n},$$
where the volume $V_n$ is of the form

$$V_n = \frac{V_1}{\sqrt{n}},$$

and $V_1$ is determined by the nature of the data rather than by some arbitrary choice.
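As a minimal sketch of this estimate (again a simplified illustration; x is assumed to be a d-by-1 column vector, X an n-by-d matrix of three-dimensional training samples, and the volume convention Vn = V1/sqrt(n) follows the code in Section 5):

function p = knn_estimate(x, X, k)
% k-nearest-neighbor density estimate at x for three-dimensional data.
    n = size(X, 1);
    dist = sqrt(sum((X - repmat(x', n, 1)).^2, 2)); % distance from x to each sample
    dist = sort(dist);
    R = dist(k);                  % radius of the cell capturing k samples
    V1 = (4/3)*pi*R^3;            % volume of the sphere reaching the k-th neighbor
    Vn = V1/sqrt(n);              % Vn = V1/sqrt(n)
    p = (k/n)/Vn;                 % pn(x) = (kn/n)/Vn
end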
3. Analysis of Results
The training data for the three categories used to investigate the nonparametric
methods, Parzen Windows and k-nearest-neighbor, for estimating the densities are given
below in Table 1.
           Features of ω1        Features of ω2        Features of ω3
Sample    x1     x2     x3      x1     x2     x3      x1     x2     x3
   1     0.28   1.31  -6.2     0.011  1.03  -0.21    1.36   2.17   0.14
   2     0.07   0.58  -0.78    1.27   1.28   0.08    1.41   1.45  -0.38
   3     1.54   2.01  -1.63    0.13   3.12   0.16    1.22   0.99   0.69
   4    -0.44   1.18  -4.32   -0.21   1.23  -0.11    2.46   2.19   1.31
   5    -0.81   0.21   5.73   -2.18   1.39  -0.19    0.68   0.79   0.87
   6     1.52   3.16   2.77    0.34   1.96  -0.16    2.51   3.22   1.35
   7     2.20   2.42  -0.19   -1.38   0.94   0.45    0.60   2.44   0.92
   8     0.91   1.94   6.21   -0.12   0.82   0.17    0.64   0.13   0.97
   9     0.65   1.93   4.38   -1.44   2.31   0.14    0.85   0.58   0.99
  10    -0.26   0.82  -0.96    0.26   1.94   0.08    0.66   0.51   0.88
Table 1. Training Data
The test points to classify using the Parzen Window method are shown below in Table 2.
Features         x1     x2     x3
Test Point 1    0.50   1.0    0.0
Test Point 2    0.31   1.51  -0.50
Test Point 3   -0.3    0.44  -0.1
Table 2. Test Points used in the Parzen Window Classification Method
Table 3 below shows the classification of an arbitrary test point x based on the Parzen
window estimates for hn = 1.
Test Point   p(x) of ω1   p(x) of ω2   p(x) of ω3
    1          0.1259       0.4711       0.3980
    2          0.1534       0.4828       0.2260
    3          0.1399       0.3783       0.1823
Table 3. Results from Classification of Test Points by Parzen Window Estimates (hn = 1)
Table 4 below shows the classification of an arbitrary test point x based on the Parzen
window estimates for hn = 0.1.
Test Point   p(x) of ω1   p(x) of ω2   p(x) of ω3
    1         8.775e-20     6.769e-5    8.000e-17
    2         2.871e-20     1.200e-5    2.159e-25
    3         3.637e-12     3.783e-4    1.064e-39
Table 4. Results from Classification of Test Points by Parzen Window Estimates (hn = 0.1)
Figure 1 below shows a plot of the density by the k-nearest-neighbor method for k = 1
using the x1 values of category 3.
Figure 1. Plot of Estimated Density for k = 1 using k-Nearest-Neighbor Method
Figure 2 below shows a plot of the density by the k-nearest-neighbor method for k = 3
using the x1 values of category 3.
Figure 2. Plot of Estimated Density for k = 3 using k-Nearest-Neighbor Method
Figure 3 below shows a plot of the density by the k-nearest-neighbor method for k = 5
using the x1 values of category 3.
Figure 3. Plot of Estimated Density for k = 5 using k-Nearest-Neighbor Method
Figure 4 below shows a plot of the density by the k-nearest-neighbor method for k = 1
using the x1 and x2 values of category 2.
Figure 4. Plot of Estimated Density for k = 1 using k-Nearest-Neighbor Method for 2-Dimensions
Figure 5 below shows a plot of the density by the k-nearest-neighbor method for k = 3
using the x1 and x2 values of category 2.
Figure 5. Plot of Estimated Density for k = 3 using k-Nearest-Neighbor Method for 2-Dimensions
Figure 6 below shows a plot of the density by the k-nearest-neighbor method for k = 5
using the x1 and x2 values of category 2.
Figure 6. Plot of Estimated Density for k = 5 using k-Nearest-Neighbor Method for 2-Dimensions
The test points used to classify the three-dimensional data from the three categories
with the k-nearest-neighbor method are shown below in Table 5.
Features         x1     x2     x3
Test Point 1   -0.41   0.82   0.88
Test Point 2    0.14   0.72   4.1
Test Point 3   -0.81   0.61  -0.38
Table 5. Test Points used in the k-Nearest-Neighbor Classification Method
Table 6 below shows the classification of an arbitrary test point x based on the
k-nearest-neighbor estimates for k = 1.
Test Point   p(x) of ω1   p(x) of ω2   p(x) of ω3
    1          0.0142       0.1673       0.0582
    2          0.0312       0.0012       0.0023
    3          0.1337       0.1022       0.0104
Table 6. Results from Classification of Test Points by k-Nearest-Neighbor Estimates for k = 1
Table 7 below shows the classification of an arbitrary test point x based on the
k-nearest-neighbor estimates for k = 3.
Test Point   p(x) of ω1   p(x) of ω2   p(x) of ω3
    1          0.0065       0.1749       0.1133
    2          0.0136       0.0031       0.0065
    3          0.0083       0.2747       0.0267
Table 7. Results from Classification of Test Points by k-Nearest-Neighbor Estimates for k = 3
Table 8 below shows the classification of an arbitrary test point x based on the
k-nearest-neighbor estimates for k = 5.
Test Point   p(x) of ω1   p(x) of ω2   p(x) of ω3
    1          0.0083       0.1052       0.0841
    2          0.0032       0.0049       0.0082
    3          0.0059       0.0943       0.0300
Table 8. Results from Classification of Test Points by k-Nearest-Neighbor Estimates for k = 5
4. Conclusion
In conclusion, for the Parzen Window method, category 2 was selected for all three test
points using the three-dimensional data from the three categories, for both hn = 1 and
hn = 0.1. This is likely due to the particular selection of points to classify as well as
the use of the spherical Gaussian window function φ(u). If an alternative window
function, or a different set of test points, had been used, the results most likely would
have differed. Overall, the Parzen Window method is an effective way to estimate the
density function when the form of the function is unknown, though it does pose some
issues when the choice of the “best” window function is unclear or the sample size of the
data set is small.
The k-nearest-neighbor method classified the categories in a seemingly random fashion for
k = 1, 3, and 5. This erratic behavior is most likely due to the small size of the sample
data set as well as the choice of k, since the method converges to the true density
function only as k and n go to infinity. The technique nonetheless showed its usefulness
for data sets where the form of the density function is unknown or not well understood in
terms of the more common density forms (e.g., normal, uniform, exponential), and where
the “best” window function for the Parzen Window method is difficult to ascertain.
5. MATLAB Computer Code for the Parzen Window and k-Nearest-Neighbor Methods
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Author:     Josh Fairley
% Course:     ECE 8443 - Pattern Recognition
% Assignment: Computer Exercise # 3 - Problems 2 and 3, Parzen Windows
%             and k-NN
% Date:       April 20, 2006
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
clear all;
close all;
clc;
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% X1w1 - X1 feature of W1
% X2w1 - X2 feature of W1
% X3w1 - X3 feature of W1
%
% X1w2 - X1 feature of W2
% X2w2 - X2 feature of W2
% X3w2 - X3 feature of W2
%
% X1w3 - X1 feature of W3
% X2w3 - X2 feature of W3
% X3w3 - X3 feature of W3
%
% X_W1 - Matrix Construction for three-dimensional case of W1
%
% X_W2 - Matrix Construction for three-dimensional case of W2
%
% X_W3 - Matrix Construction for three-dimensional case of W3
%
% XTP1 - First Test Point to Classify
% XTP2 - Second Test Point to Classify
% XTP3 - Third Test Point to Classify
X1w1 = [0.28 0.07 1.54 -0.44 -0.81 1.52 2.20 0.91 0.65 -0.26]';
X2w1 = [1.31 0.58 2.01 1.18 0.21 3.16 2.42 1.94 1.93 0.82]';
X3w1 = [-6.2 -0.78 -1.63 -4.32 5.73 2.77 -0.19 6.21 4.38 -0.96]';
X1w2 = [0.011 1.27 0.13 -0.21 -2.18 0.34 -1.38 -0.12 -1.44 0.26]';
X2w2 = [1.03 1.28 3.12 1.23 1.39 1.96 0.94 0.82 2.31 1.94]';
X3w2 = [-0.21 0.08 0.16 -0.11 -0.19 -0.16 0.45 0.17 0.14 0.08]';
X1w3 = [1.36 1.41 1.22 2.46 0.68 2.51 0.60 0.64 0.85 0.66]';
X2w3 = [2.17 1.45 0.99 2.19 0.79 3.22 2.44 0.13 0.58 0.51]';
X3w3 = [0.14 -0.38 0.69 1.31 0.87 1.35 0.92 0.97 0.99 0.88]';
X_W1 = [X1w1 X2w1 X3w1];
X_W2 = [X1w2 X2w2 X3w2];
X_W3 = [X1w3 X2w3 X3w3];
XTP1a = [0.50 1.0 0]'; XTP2a = [0.31 1.51 -0.5]'; XTP3a = [-0.3 0.44 -0.1]';
XTPa = [XTP1a XTP2a XTP3a];
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Parzen-window approach to estimate densities - Problem 2, Chapter 4
n = 3;      % Number of test points
s = 10;     % Number of samples for each feature
hn = 0.1;   % Length of edge for the hypercube (hn = 1.0 for the Table 3 run, 0.1 for Table 4)
Vn = hn^3;  % Volume of the hypercube
% Parzen estimates for category W1 at the three test points
for i = 1:s
    phi1(i) = (1/Vn)*exp(-((XTPa(:,1) - X_W1(i,:)')'*(XTPa(:,1) - X_W1(i,:)'))/(2*hn^2));
    phi2(i) = (1/Vn)*exp(-((XTPa(:,2) - X_W1(i,:)')'*(XTPa(:,2) - X_W1(i,:)'))/(2*hn^2));
    phi3(i) = (1/Vn)*exp(-((XTPa(:,3) - X_W1(i,:)')'*(XTPa(:,3) - X_W1(i,:)'))/(2*hn^2));
end
pn1a = (1/s)*sum(phi1)  % p(x) of W1 at test point 1
pn1b = (1/s)*sum(phi2)  % p(x) of W1 at test point 2
pn1c = (1/s)*sum(phi3)  % p(x) of W1 at test point 3
% Parzen estimates for category W2 at the three test points
for i = 1:s
    phi1(i) = (1/Vn)*exp(-((XTPa(:,1) - X_W2(i,:)')'*(XTPa(:,1) - X_W2(i,:)'))/(2*hn^2));
    phi2(i) = (1/Vn)*exp(-((XTPa(:,2) - X_W2(i,:)')'*(XTPa(:,2) - X_W2(i,:)'))/(2*hn^2));
    phi3(i) = (1/Vn)*exp(-((XTPa(:,3) - X_W2(i,:)')'*(XTPa(:,3) - X_W2(i,:)'))/(2*hn^2));
end
pn2a = (1/s)*sum(phi1)  % p(x) of W2 at test point 1
pn2b = (1/s)*sum(phi2)  % p(x) of W2 at test point 2
pn2c = (1/s)*sum(phi3)  % p(x) of W2 at test point 3
% Parzen estimates for category W3 at the three test points
for i = 1:s
    phi1(i) = (1/Vn)*exp(-((XTPa(:,1) - X_W3(i,:)')'*(XTPa(:,1) - X_W3(i,:)'))/(2*hn^2));
    phi2(i) = (1/Vn)*exp(-((XTPa(:,2) - X_W3(i,:)')'*(XTPa(:,2) - X_W3(i,:)'))/(2*hn^2));
    phi3(i) = (1/Vn)*exp(-((XTPa(:,3) - X_W3(i,:)')'*(XTPa(:,3) - X_W3(i,:)'))/(2*hn^2));
end
pn3a = (1/s)*sum(phi1)  % p(x) of W3 at test point 1
pn3b = (1/s)*sum(phi2)  % p(x) of W3 at test point 2
pn3c = (1/s)*sum(phi3)  % p(x) of W3 at test point 3
% End of Parzen-window approach to estimate densities
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% k-nearest-neighbor density estimations - Problem 3, Chapter 4
%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Part A - Uses X1 feature of Category 3 for k = 1, 3, and 5
% a = min(X1w3) - 1;               % Start of boundary for density estimation
% b = max(X1w3) + 1;               % End of boundary for density estimation
step_size = 0.1;                   % Step size for going from a to b
% number_of_iterations = round((b - a)/step_size);
k1 = 1; k2 = 3; k3 = 5;
% x = a;
Y = zeros(10,1);                   % Zero baseline for marking the training samples in the plots
% for i = 1:number_of_iterations  % I am experimenting here in this step
%     for j = 1:s
%         Distance(j) = abs(x - X1w3(j)); % Computing Distance
%     end
%     Distance = sort(Distance);
%     Radius1 = Distance(k1);
%     Radius2 = Distance(k2);
%     Radius3 = Distance(k3);
%     V1a = pi*Radius1^2;
%     V1b = pi*Radius2^2;
%     V1c = pi*Radius3^2;
%     Vn1 = V1a/sqrt(s);
%     Vn2 = V1b/sqrt(s);
%     Vn3 = V1c/sqrt(s);
%     Px1(i) = (k1/s)/Vn1;
%     Px3(i) = (k2/s)/Vn2;
%     Px5(i) = (k3/s)/Vn3;
%     X(i) = x;
%     x = x + step_size;
% end
%
% figure;
% plot(X, Px1, 'b', X, Px3, 'r', X, Px5, 'c', X1w3, Y, 'k*');
% xlabel('X');
% ylabel('P(x)');
% title('Estimated Density Function P(x) vs. Training Data X');
% legend('k = 1', 'k = 3', 'k = 5', 'X1 of W3');
% grid on;
% End of experimentation
for i = 1:s
    m = 0;
    for j = 1:s
        D = abs(X1w3(i) - X1w3(j));  % Computing Distance
        if D > 0                     % Exclude the sample's own zero distance
            m = m + 1;
            Distance(m) = D;
        end
    end
    Distance = sort(Distance(1:m));
    Radius1 = Distance(k1);          % Distance to the k-th nearest neighbor
    Radius2 = Distance(k2);
    Radius3 = Distance(k3);
    V1a = pi*Radius1^2;
    V1b = pi*Radius2^2;
    V1c = pi*Radius3^2;
    Vn1 = V1a/sqrt(s);               % Vn = V1/sqrt(n)
    Vn2 = V1b/sqrt(s);
    Vn3 = V1c/sqrt(s);
    P1(i) = (k1/s)/Vn1;
    P3(i) = (k2/s)/Vn2;
    P5(i) = (k3/s)/Vn3;
end
P1 = P1'; P3 = P3'; P5 = P5';
Data = sortrows(cat(2,X1w3,P1,P3,P5), 1);
x = Data(:,1);
Px1 = Data(:,2);
Px3 = Data(:,3);
Px5 = Data(:,4);
figure;
plot(x, Px1, 'b', X1w3, Y, 'k*');
xlabel('X');
ylabel('P(x)');
title('Estimated Density Function P(x) vs. Training Data X');
legend('k = 1', 'X1 of W3');
grid on;
figure;
plot(x, Px3, 'r', X1w3, Y, 'k*');
xlabel('X');
ylabel('P(x)');
title('Estimated Density Function P(x) vs. Training Data X');
legend('k = 3', 'X1 of W3');
grid on;
figure;
plot(x, Px5, 'c', X1w3, Y, 'k*');
xlabel('X');
ylabel('P(x)');
title('Estimated Density Function P(x) vs. Training Data X');
legend('k = 5', 'X1 of W3');
grid on;
% End of Part A
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Part B - Uses X1 and X2 features of W2 for k = 1, 3, and 5
x = [X1w2 X2w2];
a = min(x(:,1)) - 1;
b = max(x(:,1)) + 1;
c = min(x(:,2)) - 1;
d = max(x(:,2)) + 1;
number_of_iterations1 = round((b - a)/step_size)+ 1;
number_of_iterations2 = round((d - c)/step_size)+ 1;
for i = 1:number_of_iterations2
    for j = 1:number_of_iterations1
        X1_Position(i,j) = a + (j - 1) * step_size;  % X1 position
        X2_Position(i,j) = c + (i - 1) * step_size;  % X2 position
        for k = 1:s
            Distance(k) = sqrt((X2_Position(i,j) - x(k,2))^2 + (X1_Position(i,j) - x(k,1))^2); % Computing Distance
        end
        Distance = sort(Distance);
        Radius1 = Distance(k1);
        Radius2 = Distance(k2);
        Radius3 = Distance(k3);
        V1a = pi*Radius1^2;
        V1b = pi*Radius2^2;
        V1c = pi*Radius3^2;
        Vn1 = V1a/sqrt(s);
        Vn2 = V1b/sqrt(s);
        Vn3 = V1c/sqrt(s);
        Px(i,j,1) = (k1/s)/Vn1;
        Px(i,j,2) = (k2/s)/Vn2;
        Px(i,j,3) = (k3/s)/Vn3;
    end
end
figure;
mesh(X1_Position, X2_Position, Px(:,:,1));
xlabel('X1');
ylabel('X2');
zlabel('P(x)');
title('k-Nearest-Neighbor Density Estimation - k = 1');
axis tight;
figure;
mesh(X1_Position, X2_Position, Px(:,:,2));
xlabel('X1');
ylabel('X2');
zlabel('P(x)');
title('k-Nearest-Neighbor Density Estimation - k = 3');
axis tight;
figure;
mesh(X1_Position, X2_Position, Px(:,:,3));
xlabel('X1');
ylabel('X2');
zlabel('P(x)');
title('k-Nearest-Neighbor Density Estimation - k = 5');
axis tight;
% End of Part B
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Part C - Uses X1, X2, and X3 features of all categories for k = 1, 3,
%          and 5 and estimates the densities for three points.
XTP1b = [-0.41 0.82 0.88]'; XTP2b = [0.14 0.72 4.10]'; XTP3b = [-0.81 0.61 -0.38]';
for i = 1:s
    % Distances from each test point to the samples of each category
    Distance1(i) = sqrt((XTP1b(1) - X_W1(i,1))^2 + (XTP1b(2) - X_W1(i,2))^2 + (XTP1b(3) - X_W1(i,3))^2);
    Distance2(i) = sqrt((XTP1b(1) - X_W2(i,1))^2 + (XTP1b(2) - X_W2(i,2))^2 + (XTP1b(3) - X_W2(i,3))^2);
    Distance3(i) = sqrt((XTP1b(1) - X_W3(i,1))^2 + (XTP1b(2) - X_W3(i,2))^2 + (XTP1b(3) - X_W3(i,3))^2);
    Distance4(i) = sqrt((XTP2b(1) - X_W1(i,1))^2 + (XTP2b(2) - X_W1(i,2))^2 + (XTP2b(3) - X_W1(i,3))^2);
    Distance5(i) = sqrt((XTP2b(1) - X_W2(i,1))^2 + (XTP2b(2) - X_W2(i,2))^2 + (XTP2b(3) - X_W2(i,3))^2);
    Distance6(i) = sqrt((XTP2b(1) - X_W3(i,1))^2 + (XTP2b(2) - X_W3(i,2))^2 + (XTP2b(3) - X_W3(i,3))^2);
    Distance7(i) = sqrt((XTP3b(1) - X_W1(i,1))^2 + (XTP3b(2) - X_W1(i,2))^2 + (XTP3b(3) - X_W1(i,3))^2);
    Distance8(i) = sqrt((XTP3b(1) - X_W2(i,1))^2 + (XTP3b(2) - X_W2(i,2))^2 + (XTP3b(3) - X_W2(i,3))^2);
    Distance9(i) = sqrt((XTP3b(1) - X_W3(i,1))^2 + (XTP3b(2) - X_W3(i,2))^2 + (XTP3b(3) - X_W3(i,3))^2);
end
Distance1 = sort(Distance1);
Distance2 = sort(Distance2);
Distance3 = sort(Distance3);
Distance4 = sort(Distance4);
Distance5 = sort(Distance5);
Distance6 = sort(Distance6);
Distance7 = sort(Distance7);
Distance8 = sort(Distance8);
Distance9 = sort(Distance9);
R1 = Distance1(k1);
R2 = Distance2(k1);
R3 = Distance3(k1);
R4 = Distance4(k1);
R5 = Distance5(k1);
R6 = Distance6(k1);
R7 = Distance7(k1);
R8 = Distance8(k1);
R9 = Distance9(k1);
R10 = Distance1(k2);
R11 = Distance2(k2);
R12 = Distance3(k2);
R13 = Distance4(k2);
R14 = Distance5(k2);
R15 = Distance6(k2);
R16 = Distance7(k2);
R17 = Distance8(k2);
R18 = Distance9(k2);
R19 = Distance1(k3);
R20 = Distance2(k3);
R21 = Distance3(k3);
R22 = Distance4(k3);
R23 = Distance5(k3);
R24 = Distance6(k3);
R25 = Distance7(k3);
R26 = Distance8(k3);
R27 = Distance9(k3);
Vn1 = ((4/3)*pi*R1^3)/sqrt(s);
Vn2 = ((4/3)*pi*R2^3)/sqrt(s);
Vn3 = ((4/3)*pi*R3^3)/sqrt(s);
Vn4 = ((4/3)*pi*R4^3)/sqrt(s);
Vn5 = ((4/3)*pi*R5^3)/sqrt(s);
Vn6 = ((4/3)*pi*R6^3)/sqrt(s);
Vn7 = ((4/3)*pi*R7^3)/sqrt(s);
Vn8 = ((4/3)*pi*R8^3)/sqrt(s);
Vn9 = ((4/3)*pi*R9^3)/sqrt(s);
Vn10 = ((4/3)*pi*R10^3)/sqrt(s);
Vn11 = ((4/3)*pi*R11^3)/sqrt(s);
Vn12 = ((4/3)*pi*R12^3)/sqrt(s);
Vn13 = ((4/3)*pi*R13^3)/sqrt(s);
Vn14 = ((4/3)*pi*R14^3)/sqrt(s);
Vn15 = ((4/3)*pi*R15^3)/sqrt(s);
Vn16 = ((4/3)*pi*R16^3)/sqrt(s);
Vn17 = ((4/3)*pi*R17^3)/sqrt(s);
Vn18 = ((4/3)*pi*R18^3)/sqrt(s);
Vn19 = ((4/3)*pi*R19^3)/sqrt(s);
Vn20 = ((4/3)*pi*R20^3)/sqrt(s);
Vn21 = ((4/3)*pi*R21^3)/sqrt(s);
Vn22 = ((4/3)*pi*R22^3)/sqrt(s);
Vn23 = ((4/3)*pi*R23^3)/sqrt(s);
Vn24 = ((4/3)*pi*R24^3)/sqrt(s);
Vn25 = ((4/3)*pi*R25^3)/sqrt(s);
Vn26 = ((4/3)*pi*R26^3)/sqrt(s);
Vn27 = ((4/3)*pi*R27^3)/sqrt(s);
P1_1a = (k1/s)/Vn1;
P1_2a = (k1/s)/Vn2;
P1_3a = (k1/s)/Vn3;
P1_1b = (k1/s)/Vn4;
P1_2b = (k1/s)/Vn5;
P1_3b = (k1/s)/Vn6;
P1_1c = (k1/s)/Vn7;
P1_2c = (k1/s)/Vn8;
P1_3c = (k1/s)/Vn9;
P3_1a = (k2/s)/Vn10;
P3_2a = (k2/s)/Vn11;
P3_3a = (k2/s)/Vn12;
P3_1b = (k2/s)/Vn13;
P3_2b = (k2/s)/Vn14;
P3_3b = (k2/s)/Vn15;
P3_1c = (k2/s)/Vn16;
P3_2c = (k2/s)/Vn17;
P3_3c = (k2/s)/Vn18;
P5_1a = (k3/s)/Vn19;
P5_2a = (k3/s)/Vn20;
P5_3a = (k3/s)/Vn21;
P5_1b = (k3/s)/Vn22;
P5_2b = (k3/s)/Vn23;
P5_3b = (k3/s)/Vn24;
P5_1c = (k3/s)/Vn25;
P5_2c = (k3/s)/Vn26;
P5_3c = (k3/s)/Vn27;
P1a = cat(2, P1_1a, P1_2a, P1_3a);
P1b = cat(2, P1_1b, P1_2b, P1_3b);
P1c = cat(2, P1_1c, P1_2c, P1_3c);
P1 = cat(1,P1a,P1b,P1c)
P3a = cat(2, P3_1a, P3_2a, P3_3a);
P3b = cat(2, P3_1b, P3_2b, P3_3b);
P3c = cat(2, P3_1c, P3_2c, P3_3c);
P3 = cat(1,P3a,P3b,P3c)
P5a = cat(2, P5_1a, P5_2a, P5_3a);
P5b = cat(2, P5_1b, P5_2b, P5_3b);
P5c = cat(2, P5_1c, P5_2c, P5_3c);
P5 = cat(1,P5a,P5b,P5c)
% End of Part C
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% End of k-nearest-neighbor density estimations
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%